14:51:08.356 INFO MiniDFSCluster - starting cluster: numNameNodes=1, numDataNodes=1
14:51:08.586 INFO NameNode - Formatting using clusterid: testClusterID
14:51:08.596 INFO FSEditLog - Edit logging is async:true
14:51:08.614 INFO FSNamesystem - KeyProvider: null
14:51:08.615 INFO FSNamesystem - fsLock is fair: true
14:51:08.615 INFO FSNamesystem - Detailed lock hold time metrics enabled: false
14:51:08.615 INFO FSNamesystem - fsOwner = runner (auth:SIMPLE)
14:51:08.615 INFO FSNamesystem - supergroup = supergroup
14:51:08.615 INFO FSNamesystem - isPermissionEnabled = true
14:51:08.615 INFO FSNamesystem - isStoragePolicyEnabled = true
14:51:08.616 INFO FSNamesystem - HA Enabled: false
14:51:08.647 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
14:51:08.651 INFO deprecation - hadoop.configured.node.mapping is deprecated. Instead, use net.topology.configured.node.mapping
14:51:08.651 INFO DatanodeManager - dfs.block.invalidate.limit : configured=1000, counted=60, effected=1000
14:51:08.651 INFO DatanodeManager - dfs.namenode.datanode.registration.ip-hostname-check=true
14:51:08.653 INFO BlockManager - dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
14:51:08.653 INFO BlockManager - The block deletion will start around 2026 Mar 04 14:51:08
14:51:08.654 INFO GSet - Computing capacity for map BlocksMap
14:51:08.654 INFO GSet - VM type = 64-bit
14:51:08.655 INFO GSet - 2.0% max memory 3.4 GB = 70 MB
14:51:08.655 INFO GSet - capacity = 2^23 = 8388608 entries
14:51:08.658 INFO BlockManager - Storage policy satisfier is disabled
14:51:08.659 INFO BlockManager - dfs.block.access.token.enable = false
14:51:08.662 INFO BlockManagerSafeMode - dfs.namenode.safemode.threshold-pct = 0.999
14:51:08.662 INFO BlockManagerSafeMode - dfs.namenode.safemode.min.datanodes = 0
14:51:08.662 INFO BlockManagerSafeMode - dfs.namenode.safemode.extension = 0
14:51:08.663 INFO BlockManager - defaultReplication = 1
14:51:08.663 INFO BlockManager - maxReplication = 512
14:51:08.663 INFO BlockManager - minReplication = 1
14:51:08.663 INFO BlockManager - maxReplicationStreams = 2
14:51:08.663 INFO BlockManager - redundancyRecheckInterval = 3000ms
14:51:08.663 INFO BlockManager - encryptDataTransfer = false
14:51:08.663 INFO BlockManager - maxNumBlocksToLog = 1000
14:51:08.681 INFO FSDirectory - GLOBAL serial map: bits=29 maxEntries=536870911
14:51:08.681 INFO FSDirectory - USER serial map: bits=24 maxEntries=16777215
14:51:08.681 INFO FSDirectory - GROUP serial map: bits=24 maxEntries=16777215
14:51:08.681 INFO FSDirectory - XATTR serial map: bits=24 maxEntries=16777215
14:51:08.687 INFO GSet - Computing capacity for map INodeMap
14:51:08.687 INFO GSet - VM type = 64-bit
14:51:08.688 INFO GSet - 1.0% max memory 3.4 GB = 35 MB
14:51:08.688 INFO GSet - capacity = 2^22 = 4194304 entries
14:51:08.689 INFO FSDirectory - ACLs enabled? true
14:51:08.689 INFO FSDirectory - POSIX ACL inheritance enabled? true
14:51:08.689 INFO FSDirectory - XAttrs enabled? true
14:51:08.689 INFO NameNode - Caching file names occurring more than 10 times
14:51:08.692 INFO SnapshotManager - Loaded config captureOpenFiles: false, skipCaptureAccessTimeOnlyChange: false, snapshotDiffAllowSnapRootDescendant: true, maxSnapshotLimit: 65536
14:51:08.693 INFO SnapshotManager - SkipList is disabled
14:51:08.696 INFO GSet - Computing capacity for map cachedBlocks
14:51:08.696 INFO GSet - VM type = 64-bit
14:51:08.696 INFO GSet - 0.25% max memory 3.4 GB = 8.8 MB
14:51:08.696 INFO GSet - capacity = 2^20 = 1048576 entries
14:51:08.701 INFO TopMetrics - NNTop conf: dfs.namenode.top.window.num.buckets = 10
14:51:08.701 INFO TopMetrics - NNTop conf: dfs.namenode.top.num.users = 10
14:51:08.701 INFO TopMetrics - NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
14:51:08.702 INFO FSNamesystem - Retry cache on namenode is enabled
14:51:08.702 INFO FSNamesystem - Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
14:51:08.704 INFO GSet - Computing capacity for map NameNodeRetryCache
14:51:08.704 INFO GSet - VM type = 64-bit
14:51:08.704 INFO GSet - 0.029999999329447746% max memory 3.4 GB = 1.0 MB
14:51:08.704 INFO GSet - capacity = 2^17 = 131072 entries
14:51:08.715 INFO FSImage - Allocated new BlockPoolId: BP-1768883704-10.1.0.125-1772635868711
14:51:08.720 INFO Storage - Storage directory /tmp/minicluster_storage16268522075870465194/name-0-1 has been successfully formatted.
14:51:08.721 INFO Storage - Storage directory /tmp/minicluster_storage16268522075870465194/name-0-2 has been successfully formatted.
14:51:08.740 INFO FSImageFormatProtobuf - Saving image file /tmp/minicluster_storage16268522075870465194/name-0-2/current/fsimage.ckpt_0000000000000000000 using no compression
14:51:08.740 INFO FSImageFormatProtobuf - Saving image file /tmp/minicluster_storage16268522075870465194/name-0-1/current/fsimage.ckpt_0000000000000000000 using no compression
14:51:08.855 INFO FSImageFormatProtobuf - Image file /tmp/minicluster_storage16268522075870465194/name-0-2/current/fsimage.ckpt_0000000000000000000 of size 401 bytes saved in 0 seconds .
14:51:08.855 INFO FSImageFormatProtobuf - Image file /tmp/minicluster_storage16268522075870465194/name-0-1/current/fsimage.ckpt_0000000000000000000 of size 401 bytes saved in 0 seconds .
14:51:08.866 INFO NNStorageRetentionManager - Going to retain 1 images with txid >= 0
14:51:08.885 INFO FSNamesystem - Stopping services started for active state
14:51:08.885 INFO FSNamesystem - Stopping services started for standby state
14:51:08.886 INFO NameNode - createNameNode []
14:51:08.925 WARN MetricsConfig - Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
14:51:08.933 INFO MetricsSystemImpl - Scheduled Metric snapshot period at 10 second(s).
14:51:08.934 INFO MetricsSystemImpl - NameNode metrics system started
14:51:08.939 INFO NameNodeUtils - fs.defaultFS is hdfs://127.0.0.1:0
14:51:08.966 INFO JvmPauseMonitor - Starting JVM pause monitor
14:51:08.978 INFO DFSUtil - Filter initializers set : org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.hdfs.web.AuthFilterInitializer
14:51:08.982 INFO DFSUtil - Starting Web-server for hdfs at: http://localhost:0
14:51:08.992 INFO log - Logging initialized @28358ms to org.eclipse.jetty.util.log.Slf4jLog
14:51:09.062 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
14:51:09.066 WARN HttpRequestLog - Jetty request log can only be enabled using Log4j
14:51:09.069 INFO HttpServer2 - Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
14:51:09.070 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
14:51:09.071 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
14:51:09.072 INFO HttpServer2 - Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context hdfs
14:51:09.072 INFO HttpServer2 - Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context static
14:51:09.101 INFO HttpServer2 - addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
14:51:09.105 INFO HttpServer2 - Jetty bound to port 40947
14:51:09.106 INFO Server - jetty-9.4.56.v20240826; built: 2024-08-26T17:15:05.868Z; git: ec6782ff5ead824dabdcf47fa98f90a4aedff401; jvm 17.0.6+10
14:51:09.130 INFO session - DefaultSessionIdManager workerName=node0
14:51:09.130 INFO session - No SessionScavenger set, using defaults
14:51:09.132 INFO session - node0 Scavenging every 600000ms
14:51:09.145 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
14:51:09.147 INFO ContextHandler - Started o.e.j.s.ServletContextHandler@4e670245{static,/static,jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/static,AVAILABLE}
14:51:09.281 INFO ContextHandler - Started o.e.j.w.WebAppContext@7e3f7bde{hdfs,/,file:///tmp/jetty-localhost-40947-hadoop-hdfs-3_3_6-tests_jar-_-any-2476154713751773569/webapp/,AVAILABLE}{jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/hdfs}
14:51:09.286 INFO AbstractConnector - Started ServerConnector@2984638a{HTTP/1.1, (http/1.1)}{localhost:40947}
14:51:09.286 INFO Server - Started @28653ms
14:51:09.291 INFO FSEditLog - Edit logging is async:true
14:51:09.300 INFO FSNamesystem - KeyProvider: null
14:51:09.300 INFO FSNamesystem - fsLock is fair: true
14:51:09.300 INFO FSNamesystem - Detailed lock hold time metrics enabled: false
14:51:09.301 INFO FSNamesystem - fsOwner = runner (auth:SIMPLE)
14:51:09.301 INFO FSNamesystem - supergroup = supergroup
14:51:09.301 INFO FSNamesystem - isPermissionEnabled = true
14:51:09.301 INFO FSNamesystem - isStoragePolicyEnabled = true
14:51:09.301 INFO FSNamesystem - HA Enabled: false
14:51:09.301 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
14:51:09.301 INFO DatanodeManager - dfs.block.invalidate.limit : configured=1000, counted=60, effected=1000
14:51:09.301 INFO DatanodeManager - dfs.namenode.datanode.registration.ip-hostname-check=true
14:51:09.301 INFO BlockManager - dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
14:51:09.302 INFO BlockManager - The block deletion will start around 2026 Mar 04 14:51:09
14:51:09.302 INFO GSet - Computing capacity for map BlocksMap
14:51:09.302 INFO GSet - VM type = 64-bit
14:51:09.302 INFO GSet - 2.0% max memory 3.4 GB = 70 MB
14:51:09.302 INFO GSet - capacity = 2^23 = 8388608 entries
14:51:09.318 INFO BlockManager - Storage policy satisfier is disabled
14:51:09.318 INFO BlockManager - dfs.block.access.token.enable = false
14:51:09.319 INFO BlockManagerSafeMode - dfs.namenode.safemode.threshold-pct = 0.999
14:51:09.319 INFO BlockManagerSafeMode - dfs.namenode.safemode.min.datanodes = 0
14:51:09.319 INFO BlockManagerSafeMode - dfs.namenode.safemode.extension = 0
14:51:09.319 INFO BlockManager - defaultReplication = 1
14:51:09.319 INFO BlockManager - maxReplication = 512
14:51:09.319 INFO BlockManager - minReplication = 1
14:51:09.319 INFO BlockManager - maxReplicationStreams = 2
14:51:09.319 INFO BlockManager - redundancyRecheckInterval = 3000ms
14:51:09.319 INFO BlockManager - encryptDataTransfer = false
14:51:09.319 INFO BlockManager - maxNumBlocksToLog = 1000
14:51:09.319 INFO GSet - Computing capacity for map INodeMap
14:51:09.319 INFO GSet - VM type = 64-bit
14:51:09.319 INFO GSet - 1.0% max memory 3.4 GB = 35 MB
14:51:09.319 INFO GSet - capacity = 2^22 = 4194304 entries
14:51:09.321 INFO FSDirectory - ACLs enabled? true
14:51:09.321 INFO FSDirectory - POSIX ACL inheritance enabled? true
14:51:09.321 INFO FSDirectory - XAttrs enabled? true
14:51:09.321 INFO NameNode - Caching file names occurring more than 10 times
14:51:09.321 INFO SnapshotManager - Loaded config captureOpenFiles: false, skipCaptureAccessTimeOnlyChange: false, snapshotDiffAllowSnapRootDescendant: true, maxSnapshotLimit: 65536
14:51:09.321 INFO SnapshotManager - SkipList is disabled
14:51:09.321 INFO GSet - Computing capacity for map cachedBlocks
14:51:09.321 INFO GSet - VM type = 64-bit
14:51:09.321 INFO GSet - 0.25% max memory 3.4 GB = 8.8 MB
14:51:09.321 INFO GSet - capacity = 2^20 = 1048576 entries
14:51:09.322 INFO TopMetrics - NNTop conf: dfs.namenode.top.window.num.buckets = 10
14:51:09.322 INFO TopMetrics - NNTop conf: dfs.namenode.top.num.users = 10
14:51:09.322 INFO TopMetrics - NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
14:51:09.322 INFO FSNamesystem - Retry cache on namenode is enabled
14:51:09.322 INFO FSNamesystem - Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
14:51:09.322 INFO GSet - Computing capacity for map NameNodeRetryCache
14:51:09.322 INFO GSet - VM type = 64-bit
14:51:09.322 INFO GSet - 0.029999999329447746% max memory 3.4 GB = 1.0 MB
14:51:09.322 INFO GSet - capacity = 2^17 = 131072 entries
14:51:09.326 INFO Storage - Lock on /tmp/minicluster_storage16268522075870465194/name-0-1/in_use.lock acquired by nodename 3348@runnervmnay03
14:51:09.328 INFO Storage - Lock on /tmp/minicluster_storage16268522075870465194/name-0-2/in_use.lock acquired by nodename 3348@runnervmnay03
14:51:09.329 INFO FileJournalManager - Recovering unfinalized segments in /tmp/minicluster_storage16268522075870465194/name-0-1/current
14:51:09.329 INFO FileJournalManager - Recovering unfinalized segments in /tmp/minicluster_storage16268522075870465194/name-0-2/current
14:51:09.330 INFO FSImage - No edit log streams selected.
14:51:09.330 INFO FSImage - Planning to load image: FSImageFile(file=/tmp/minicluster_storage16268522075870465194/name-0-1/current/fsimage_0000000000000000000, cpktTxId=0000000000000000000)
14:51:09.353 INFO FSImageFormatPBINode - Loading 1 INodes.
14:51:09.353 INFO FSImageFormatPBINode - Successfully loaded 1 inodes
14:51:09.357 INFO FSImageFormatPBINode - Completed update blocks map and name cache, total waiting duration 0ms.
14:51:09.358 INFO FSImageFormatProtobuf - Loaded FSImage in 0 seconds.
14:51:09.358 INFO FSImage - Loaded image for txid 0 from /tmp/minicluster_storage16268522075870465194/name-0-1/current/fsimage_0000000000000000000
14:51:09.362 INFO FSNamesystem - Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
14:51:09.363 INFO FSEditLog - Starting log segment at 1
14:51:09.372 INFO NameCache - initialized with 0 entries 0 lookups
14:51:09.372 INFO FSNamesystem - Finished loading FSImage in 49 msecs
14:51:09.439 INFO NameNode - RPC server is binding to localhost:0
14:51:09.439 INFO NameNode - Enable NameNode state context:false
14:51:09.442 INFO CallQueueManager - Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
14:51:09.449 INFO Server - Listener at localhost:43595
14:51:09.449 INFO Server - Starting Socket Reader #1 for port 0
14:51:09.471 INFO NameNode - Clients are to use localhost:43595 to access this namenode/service.
14:51:09.473 INFO FSNamesystem - Registered FSNamesystemState, ReplicatedBlocksState and ECBlockGroupsState MBeans.
14:51:09.488 INFO LeaseManager - Number of blocks under construction: 0
14:51:09.493 INFO DatanodeAdminDefaultMonitor - Initialized the Default Decommission and Maintenance monitor
14:51:09.494 INFO BlockManager - Start MarkedDeleteBlockScrubber thread
14:51:09.495 INFO BlockManager - initializing replication queues
14:51:09.495 INFO StateChange - STATE* Leaving safe mode after 0 secs
14:51:09.495 INFO StateChange - STATE* Network topology has 0 racks and 0 datanodes
14:51:09.496 INFO StateChange - STATE* UnderReplicatedBlocks has 0 blocks
14:51:09.498 INFO BlockManager - Total number of blocks = 0
14:51:09.499 INFO BlockManager - Number of invalid blocks = 0
14:51:09.499 INFO BlockManager - Number of under-replicated blocks = 0
14:51:09.499 INFO BlockManager - Number of over-replicated blocks = 0
14:51:09.499 INFO BlockManager - Number of blocks being written = 0
14:51:09.499 INFO StateChange - STATE* Replication Queue initialization scan for invalid, over- and under-replicated blocks completed in 4 msec
14:51:09.513 INFO Server - IPC Server Responder: starting
14:51:09.513 INFO Server - IPC Server listener on 0: starting
14:51:09.516 INFO NameNode - NameNode RPC up at: localhost/127.0.0.1:43595
14:51:09.517 WARN MetricsLoggerTask - Metrics logging will not be async since the logger is not log4j
14:51:09.517 INFO FSNamesystem - Starting services required for active state
14:51:09.517 INFO FSDirectory - Initializing quota with 12 thread(s)
14:51:09.519 INFO FSDirectory - Quota initialization completed in 2 milliseconds
name space=1
storage space=0
storage types=RAM_DISK=0, SSD=0, DISK=0, ARCHIVE=0, PROVIDED=0
14:51:09.521 INFO CacheReplicationMonitor - Starting CacheReplicationMonitor with interval 30000 milliseconds
14:51:09.529 INFO MiniDFSCluster - Starting DataNode 0 with dfs.datanode.data.dir: [DISK]file:/tmp/minicluster_storage16268522075870465194/data/data1,[DISK]file:/tmp/minicluster_storage16268522075870465194/data/data2
14:51:09.542 INFO ThrottledAsyncChecker - Scheduling a check for [DISK]file:/tmp/minicluster_storage16268522075870465194/data/data1
14:51:09.550 INFO ThrottledAsyncChecker - Scheduling a check for [DISK]file:/tmp/minicluster_storage16268522075870465194/data/data2
14:51:09.563 INFO MetricsSystemImpl - DataNode metrics system started (again)
14:51:09.570 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
14:51:09.573 INFO BlockScanner - Initialized block scanner with targetBytesPerSec 1048576
14:51:09.576 INFO DataNode - Configured hostname is 127.0.0.1
14:51:09.577 INFO Util - dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
14:51:09.577 INFO DataNode - Starting DataNode with maxLockedMemory = 0
14:51:09.581 INFO DataNode - Opened streaming server at /127.0.0.1:34059
14:51:09.583 INFO DataNode - Balancing bandwidth is 104857600 bytes/s
14:51:09.583 INFO DataNode - Number threads for balancing is 100
14:51:09.587 WARN AuthenticationFilter - Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /home/runner/hadoop-http-auth-signature-secret
14:51:09.588 WARN HttpRequestLog - Jetty request log can only be enabled using Log4j
14:51:09.589 INFO HttpServer2 - Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
14:51:09.589 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode
14:51:09.590 INFO HttpServer2 - Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
14:51:09.591 INFO HttpServer2 - Jetty bound to port 37027
14:51:09.591 INFO Server - jetty-9.4.56.v20240826; built: 2024-08-26T17:15:05.868Z; git: ec6782ff5ead824dabdcf47fa98f90a4aedff401; jvm 17.0.6+10
14:51:09.592 INFO session - DefaultSessionIdManager workerName=node0
14:51:09.592 INFO session - No SessionScavenger set, using defaults
14:51:09.592 INFO session - node0 Scavenging every 660000ms
14:51:09.593 INFO ContextHandler - Started o.e.j.s.ServletContextHandler@16641839{static,/static,jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/static,AVAILABLE}
14:51:09.699 INFO ContextHandler - Started o.e.j.w.WebAppContext@39b88618{datanode,/,file:///tmp/jetty-localhost-37027-hadoop-hdfs-3_3_6-tests_jar-_-any-3774403122790836223/webapp/,AVAILABLE}{jar:file:/home/runner/.gradle/caches/modules-2/files-2.1/org.apache.hadoop/hadoop-hdfs/3.3.6/5058b645375c6a68f509e167ad6a6ada9642df09/hadoop-hdfs-3.3.6-tests.jar!/webapps/datanode}
14:51:09.699 INFO AbstractConnector - Started ServerConnector@26b3bc8f{HTTP/1.1, (http/1.1)}{localhost:37027}
14:51:09.699 INFO Server - Started @29066ms
14:51:09.705 WARN DatanodeHttpServer - Got null for restCsrfPreventionFilter - will not do any filtering.
14:51:09.706 INFO DatanodeHttpServer - Listening HTTP traffic on /127.0.0.1:43931
14:51:09.707 INFO JvmPauseMonitor - Starting JVM pause monitor
14:51:09.708 INFO DataNode - dnUserName = runner
14:51:09.708 INFO DataNode - supergroup = supergroup
14:51:09.718 INFO CallQueueManager - Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
14:51:09.719 INFO Server - Listener at localhost:35677
14:51:09.719 INFO Server - Starting Socket Reader #1 for port 0
14:51:09.723 INFO DataNode - Opened IPC server at /127.0.0.1:35677
14:51:09.745 INFO DataNode - Refresh request received for nameservices: null
14:51:09.746 INFO DataNode - Starting BPOfferServices for nameservices: <default>
14:51:09.755 INFO DataNode - Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:43595 starting to offer service
14:51:09.757 WARN MetricsLoggerTask - Metrics logging will not be async since the logger is not log4j
14:51:09.758 INFO Server - IPC Server Responder: starting
14:51:09.758 INFO Server - IPC Server listener on 0: starting
14:51:09.869 INFO DataNode - Acknowledging ACTIVE Namenode during handshakeBlock pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:43595
14:51:09.870 INFO Storage - Using 2 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=2, dataDirs=2)
14:51:09.871 INFO Storage - Lock on /tmp/minicluster_storage16268522075870465194/data/data1/in_use.lock acquired by nodename 3348@runnervmnay03
14:51:09.872 INFO Storage - Storage directory with location [DISK]file:/tmp/minicluster_storage16268522075870465194/data/data1 is not formatted for namespace 789686550. Formatting...
14:51:09.872 INFO Storage - Generated new storageID DS-c4d6e5ab-12ff-44c3-8f78-9b7127043a66 for directory /tmp/minicluster_storage16268522075870465194/data/data1
14:51:09.874 INFO Storage - Lock on /tmp/minicluster_storage16268522075870465194/data/data2/in_use.lock acquired by nodename 3348@runnervmnay03
14:51:09.874 INFO Storage - Storage directory with location [DISK]file:/tmp/minicluster_storage16268522075870465194/data/data2 is not formatted for namespace 789686550. Formatting...
14:51:09.875 INFO Storage - Generated new storageID DS-1d60ecab-7d6a-4ede-984e-134f3c383036 for directory /tmp/minicluster_storage16268522075870465194/data/data2
14:51:09.879 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
14:51:09.883 INFO MiniDFSCluster - dnInfo.length != numDataNodes
14:51:09.883 INFO MiniDFSCluster - Waiting for cluster to become active
14:51:09.895 INFO Storage - Analyzing storage directories for bpid BP-1768883704-10.1.0.125-1772635868711
14:51:09.895 INFO Storage - Locking is disabled for /tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711
14:51:09.895 INFO Storage - Block pool storage directory for location [DISK]file:/tmp/minicluster_storage16268522075870465194/data/data1 and block pool id BP-1768883704-10.1.0.125-1772635868711 is not formatted. Formatting ...
14:51:09.895 INFO Storage - Formatting block pool BP-1768883704-10.1.0.125-1772635868711 directory /tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current
14:51:09.911 INFO Storage - Analyzing storage directories for bpid BP-1768883704-10.1.0.125-1772635868711
14:51:09.911 INFO Storage - Locking is disabled for /tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711
14:51:09.911 INFO Storage - Block pool storage directory for location [DISK]file:/tmp/minicluster_storage16268522075870465194/data/data2 and block pool id BP-1768883704-10.1.0.125-1772635868711 is not formatted. Formatting ...
14:51:09.911 INFO Storage - Formatting block pool BP-1768883704-10.1.0.125-1772635868711 directory /tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current
14:51:09.913 INFO DataNode - Setting up storage: nsid=789686550;bpid=BP-1768883704-10.1.0.125-1772635868711;lv=-57;nsInfo=lv=-66;cid=testClusterID;nsid=789686550;c=1772635868711;bpid=BP-1768883704-10.1.0.125-1772635868711;dnuuid=null
14:51:09.914 INFO DataNode - Generated and persisted new Datanode UUID 3c05d188-bc2c-46c8-958b-79f7184a02a2
14:51:09.923 INFO FsDatasetImpl - The datanode lock is a read write lock
14:51:09.948 INFO FsDatasetImpl - Added new volume: DS-c4d6e5ab-12ff-44c3-8f78-9b7127043a66
14:51:09.948 INFO FsDatasetImpl - Added volume - [DISK]file:/tmp/minicluster_storage16268522075870465194/data/data1, StorageType: DISK
14:51:09.949 INFO FsDatasetImpl - Added new volume: DS-1d60ecab-7d6a-4ede-984e-134f3c383036
14:51:09.950 INFO FsDatasetImpl - Added volume - [DISK]file:/tmp/minicluster_storage16268522075870465194/data/data2, StorageType: DISK
14:51:09.952 INFO MemoryMappableBlockLoader - Initializing cache loader: MemoryMappableBlockLoader.
14:51:09.954 INFO FsDatasetImpl - Registered FSDatasetState MBean
14:51:09.957 INFO FsDatasetImpl - Adding block pool BP-1768883704-10.1.0.125-1772635868711
14:51:09.958 INFO FsDatasetImpl - Scanning block pool BP-1768883704-10.1.0.125-1772635868711 on volume /tmp/minicluster_storage16268522075870465194/data/data1...
14:51:09.958 INFO FsDatasetImpl - Scanning block pool BP-1768883704-10.1.0.125-1772635868711 on volume /tmp/minicluster_storage16268522075870465194/data/data2...
14:51:09.963 WARN FsDatasetImpl - dfsUsed file missing in /tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current, will proceed with Du for space computation calculation,
14:51:09.963 WARN FsDatasetImpl - dfsUsed file missing in /tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current, will proceed with Du for space computation calculation,
14:51:09.980 INFO FsDatasetImpl - Time taken to scan block pool BP-1768883704-10.1.0.125-1772635868711 on /tmp/minicluster_storage16268522075870465194/data/data2: 23ms
14:51:09.982 INFO FsDatasetImpl - Time taken to scan block pool BP-1768883704-10.1.0.125-1772635868711 on /tmp/minicluster_storage16268522075870465194/data/data1: 24ms
14:51:09.982 INFO FsDatasetImpl - Total time to scan all replicas for block pool BP-1768883704-10.1.0.125-1772635868711: 24ms
14:51:09.983 INFO FsDatasetImpl - Adding replicas to map for block pool BP-1768883704-10.1.0.125-1772635868711 on volume /tmp/minicluster_storage16268522075870465194/data/data1...
14:51:09.983 INFO FsDatasetImpl - Adding replicas to map for block pool BP-1768883704-10.1.0.125-1772635868711 on volume /tmp/minicluster_storage16268522075870465194/data/data2...
14:51:09.983 INFO BlockPoolSlice - Replica Cache file: /tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/replicas doesn't exist
14:51:09.983 INFO BlockPoolSlice - Replica Cache file: /tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/replicas doesn't exist
14:51:09.984 INFO FsDatasetImpl - Time to add replicas to map for block pool BP-1768883704-10.1.0.125-1772635868711 on volume /tmp/minicluster_storage16268522075870465194/data/data2: 1ms
14:51:09.984 INFO FsDatasetImpl - Time to add replicas to map for block pool BP-1768883704-10.1.0.125-1772635868711 on volume /tmp/minicluster_storage16268522075870465194/data/data1: 1ms
14:51:09.984 INFO FsDatasetImpl - Total time to add all replicas to map for block pool BP-1768883704-10.1.0.125-1772635868711: 1ms
14:51:09.984 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
14:51:09.985 INFO ThrottledAsyncChecker - Scheduling a check for /tmp/minicluster_storage16268522075870465194/data/data1
14:51:09.985 INFO MiniDFSCluster - dnInfo.length != numDataNodes
14:51:09.985 INFO MiniDFSCluster - Waiting for cluster to become active
14:51:09.989 INFO DatasetVolumeChecker - Scheduled health check for volume /tmp/minicluster_storage16268522075870465194/data/data1
14:51:09.990 INFO ThrottledAsyncChecker - Scheduling a check for /tmp/minicluster_storage16268522075870465194/data/data2
14:51:09.990 INFO DatasetVolumeChecker - Scheduled health check for volume /tmp/minicluster_storage16268522075870465194/data/data2
14:51:09.991 INFO VolumeScanner - Now scanning bpid BP-1768883704-10.1.0.125-1772635868711 on volume /tmp/minicluster_storage16268522075870465194/data/data2
14:51:09.991 INFO VolumeScanner - Now scanning bpid BP-1768883704-10.1.0.125-1772635868711 on volume /tmp/minicluster_storage16268522075870465194/data/data1
14:51:09.992 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage16268522075870465194/data/data1, DS-c4d6e5ab-12ff-44c3-8f78-9b7127043a66): finished scanning block pool BP-1768883704-10.1.0.125-1772635868711
14:51:09.992 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage16268522075870465194/data/data2, DS-1d60ecab-7d6a-4ede-984e-134f3c383036): finished scanning block pool BP-1768883704-10.1.0.125-1772635868711
14:51:09.993 WARN DirectoryScanner - dfs.datanode.directoryscan.throttle.limit.ms.per.sec set to value above 1000 ms/sec. Assuming default value of -1
14:51:09.993 INFO DirectoryScanner - Periodic Directory Tree Verification scan starting in 14091656ms with interval of 21600000ms and throttle limit of -1ms/s
14:51:09.995 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage16268522075870465194/data/data2, DS-1d60ecab-7d6a-4ede-984e-134f3c383036): no suitable block pools found to scan. Waiting 1814399996 ms.
14:51:09.995 INFO VolumeScanner - VolumeScanner(/tmp/minicluster_storage16268522075870465194/data/data1, DS-c4d6e5ab-12ff-44c3-8f78-9b7127043a66): no suitable block pools found to scan. Waiting 1814399996 ms.
14:51:09.997 INFO DataNode - Block pool BP-1768883704-10.1.0.125-1772635868711 (Datanode Uuid 3c05d188-bc2c-46c8-958b-79f7184a02a2) service to localhost/127.0.0.1:43595 beginning handshake with NN
14:51:10.019 INFO StateChange - BLOCK* registerDatanode: from DatanodeRegistration(127.0.0.1:34059, datanodeUuid=3c05d188-bc2c-46c8-958b-79f7184a02a2, infoPort=43931, infoSecurePort=0, ipcPort=35677, storageInfo=lv=-57;cid=testClusterID;nsid=789686550;c=1772635868711) storage 3c05d188-bc2c-46c8-958b-79f7184a02a2
14:51:10.020 INFO NetworkTopology - Adding a new node: /default-rack/127.0.0.1:34059
14:51:10.021 INFO BlockReportLeaseManager - Registered DN 3c05d188-bc2c-46c8-958b-79f7184a02a2 (127.0.0.1:34059).
14:51:10.024 INFO DataNode - Block pool BP-1768883704-10.1.0.125-1772635868711 (Datanode Uuid 3c05d188-bc2c-46c8-958b-79f7184a02a2) service to localhost/127.0.0.1:43595 successfully registered with NN
14:51:10.024 INFO DataNode - For namenode localhost/127.0.0.1:43595 using BLOCKREPORT_INTERVAL of 21600000msecs CACHEREPORT_INTERVAL of 10000msecs Initial delay: 0msecs; heartBeatInterval=3000
14:51:10.025 INFO DataNode - Starting IBR Task Handler.
14:51:10.033 INFO DatanodeDescriptor - Adding new storage ID DS-c4d6e5ab-12ff-44c3-8f78-9b7127043a66 for DN 127.0.0.1:34059
14:51:10.033 INFO DatanodeDescriptor - Adding new storage ID DS-1d60ecab-7d6a-4ede-984e-134f3c383036 for DN 127.0.0.1:34059
14:51:10.038 INFO DataNode - After receiving heartbeat response, updating state of namenode localhost:43595 to active
14:51:10.046 INFO BlockStateChange - BLOCK* processReport 0xfd7fe7023a0bf717 with lease ID 0xfe742c5bdfb3d67a: Processing first storage report for DS-c4d6e5ab-12ff-44c3-8f78-9b7127043a66 from datanode DatanodeRegistration(127.0.0.1:34059, datanodeUuid=3c05d188-bc2c-46c8-958b-79f7184a02a2, infoPort=43931, infoSecurePort=0, ipcPort=35677, storageInfo=lv=-57;cid=testClusterID;nsid=789686550;c=1772635868711)
14:51:10.047 INFO BlockStateChange - BLOCK* processReport 0xfd7fe7023a0bf717 with lease ID 0xfe742c5bdfb3d67a: from storage DS-c4d6e5ab-12ff-44c3-8f78-9b7127043a66 node DatanodeRegistration(127.0.0.1:34059, datanodeUuid=3c05d188-bc2c-46c8-958b-79f7184a02a2, infoPort=43931, infoSecurePort=0, ipcPort=35677, storageInfo=lv=-57;cid=testClusterID;nsid=789686550;c=1772635868711), blocks: 0, hasStaleStorage: true, processing time: 1 msecs, invalidatedBlocks: 0
14:51:10.048 INFO BlockStateChange - BLOCK* processReport 0xfd7fe7023a0bf717 with lease ID 0xfe742c5bdfb3d67a: Processing first storage report for DS-1d60ecab-7d6a-4ede-984e-134f3c383036 from datanode DatanodeRegistration(127.0.0.1:34059, datanodeUuid=3c05d188-bc2c-46c8-958b-79f7184a02a2, infoPort=43931, infoSecurePort=0, ipcPort=35677, storageInfo=lv=-57;cid=testClusterID;nsid=789686550;c=1772635868711)
14:51:10.048 INFO BlockStateChange - BLOCK* processReport 0xfd7fe7023a0bf717 with lease ID 0xfe742c5bdfb3d67a: from storage DS-1d60ecab-7d6a-4ede-984e-134f3c383036 node DatanodeRegistration(127.0.0.1:34059, datanodeUuid=3c05d188-bc2c-46c8-958b-79f7184a02a2, infoPort=43931, infoSecurePort=0, ipcPort=35677, storageInfo=lv=-57;cid=testClusterID;nsid=789686550;c=1772635868711), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
14:51:10.057 INFO DataNode - Successfully sent block report 0xfd7fe7023a0bf717 with lease ID 0xfe742c5bdfb3d67a to namenode: localhost/127.0.0.1:43595, containing 2 storage report(s), of which we sent 2. The reports had 0 total blocks and used 1 RPC(s). This took 1 msecs to generate and 17 msecs for RPC and NN processing. Got back one command: FinalizeCommand/5.
14:51:10.058 INFO DataNode - Got finalize command for block pool BP-1768883704-10.1.0.125-1772635868711
14:51:10.086 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=datanodeReport src=null dst=null perm=null proto=rpc
14:51:10.091 INFO MiniDFSCluster - Cluster is active
14:51:10.153 INFO MemoryStore - Block broadcast_34 stored as values in memory (estimated size 297.9 KiB, free 1919.7 MiB)
14:51:10.175 INFO MemoryStore - Block broadcast_34_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.7 MiB)
14:51:10.176 INFO BlockManagerInfo - Added broadcast_34_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1920.0 MiB)
14:51:10.176 INFO SparkContext - Created broadcast 34 from newAPIHadoopFile at PathSplitSource.java:96
14:51:10.224 INFO MemoryStore - Block broadcast_35 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
14:51:10.233 INFO MemoryStore - Block broadcast_35_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
14:51:10.234 INFO BlockManagerInfo - Added broadcast_35_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:10.234 INFO SparkContext - Created broadcast 35 from newAPIHadoopFile at PathSplitSource.java:96
14:51:10.287 INFO FileInputFormat - Total input files to process : 1
14:51:10.302 INFO MemoryStore - Block broadcast_36 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
14:51:10.306 INFO MemoryStore - Block broadcast_36_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
14:51:10.307 INFO BlockManagerInfo - Added broadcast_36_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.9 MiB)
14:51:10.307 INFO SparkContext - Created broadcast 36 from broadcast at ReadsSparkSink.java:133
14:51:10.318 INFO MemoryStore - Block broadcast_37 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
14:51:10.322 INFO MemoryStore - Block broadcast_37_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
14:51:10.322 INFO BlockManagerInfo - Added broadcast_37_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.9 MiB)
14:51:10.323 INFO SparkContext - Created broadcast 37 from broadcast at BamSink.java:76
14:51:10.341 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts dst=null perm=null proto=rpc
14:51:10.345 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:10.346 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:10.346 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:10.364 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:10.378 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:10.379 INFO DAGScheduler - Registering RDD 77 (mapToPair at SparkUtils.java:161) as input to shuffle 7
14:51:10.379 INFO DAGScheduler - Got job 20 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:10.380 INFO DAGScheduler - Final stage: ResultStage 30 (runJob at SparkHadoopWriter.scala:83)
14:51:10.380 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 29)
14:51:10.380 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 29)
14:51:10.380 INFO DAGScheduler - Submitting ShuffleMapStage 29 (MapPartitionsRDD[77] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:10.411 INFO MemoryStore - Block broadcast_38 stored as values in memory (estimated size 520.4 KiB, free 1918.5 MiB)
14:51:10.413 INFO MemoryStore - Block broadcast_38_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.3 MiB)
14:51:10.413 INFO BlockManagerInfo - Added broadcast_38_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.7 MiB)
14:51:10.414 INFO SparkContext - Created broadcast 38 from broadcast at DAGScheduler.scala:1580
14:51:10.414 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 29 (MapPartitionsRDD[77] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:10.414 INFO TaskSchedulerImpl - Adding task set 29.0 with 1 tasks resource profile 0
14:51:10.417 INFO TaskSetManager - Starting task 0.0 in stage 29.0 (TID 67) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:10.417 INFO Executor - Running task 0.0 in stage 29.0 (TID 67)
14:51:10.481 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:10.540 INFO Executor - Finished task 0.0 in stage 29.0 (TID 67). 1148 bytes result sent to driver
14:51:10.541 INFO TaskSetManager - Finished task 0.0 in stage 29.0 (TID 67) in 126 ms on localhost (executor driver) (1/1)
14:51:10.541 INFO TaskSchedulerImpl - Removed TaskSet 29.0, whose tasks have all completed, from pool
14:51:10.541 INFO DAGScheduler - ShuffleMapStage 29 (mapToPair at SparkUtils.java:161) finished in 0.158 s
14:51:10.542 INFO DAGScheduler - looking for newly runnable stages
14:51:10.542 INFO DAGScheduler - running: HashSet()
14:51:10.542 INFO DAGScheduler - waiting: HashSet(ResultStage 30)
14:51:10.542 INFO DAGScheduler - failed: HashSet()
14:51:10.542 INFO DAGScheduler - Submitting ResultStage 30 (MapPartitionsRDD[82] at mapToPair at BamSink.java:91), which has no missing parents
14:51:10.557 INFO MemoryStore - Block broadcast_39 stored as values in memory (estimated size 241.5 KiB, free 1918.1 MiB)
14:51:10.558 INFO MemoryStore - Block broadcast_39_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.0 MiB)
14:51:10.558 INFO BlockManagerInfo - Added broadcast_39_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.7 MiB)
14:51:10.559 INFO SparkContext - Created broadcast 39 from broadcast at DAGScheduler.scala:1580
14:51:10.559 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 30 (MapPartitionsRDD[82] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:10.559 INFO TaskSchedulerImpl - Adding task set 30.0 with 1 tasks resource profile 0
14:51:10.560 INFO TaskSetManager - Starting task 0.0 in stage 30.0 (TID 68) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:10.560 INFO Executor - Running task 0.0 in stage 30.0 (TID 68)
14:51:10.583 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:10.583 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:10.675 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:10.675 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:10.675 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:10.676 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:10.676 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:10.676 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:10.695 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:10.711 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:10.713 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:10.739 INFO StateChange - BLOCK* allocate blk_1073741825_1001, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/part-r-00000
14:51:10.770 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741825_1001 src: /127.0.0.1:33730 dest: /127.0.0.1:34059
14:51:10.794 INFO clienttrace - src: /127.0.0.1:33730, dest: /127.0.0.1:34059, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741825_1001, duration(ns): 5461091
14:51:10.794 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741825_1001, type=LAST_IN_PIPELINE terminating
14:51:10.799 INFO FSNamesystem - BLOCK* blk_1073741825_1001 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/part-r-00000
14:51:11.202 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:11.204 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:11.206 INFO StateChange - BLOCK* allocate blk_1073741826_1002, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/.part-r-00000.sbi
14:51:11.208 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741826_1002 src: /127.0.0.1:33740 dest: /127.0.0.1:34059
14:51:11.210 INFO clienttrace - src: /127.0.0.1:33740, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741826_1002, duration(ns): 876508
14:51:11.210 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741826_1002, type=LAST_IN_PIPELINE terminating
14:51:11.212 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:11.216 INFO StateChange - BLOCK* allocate blk_1073741827_1003, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/.part-r-00000.bai
14:51:11.217 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741827_1003 src: /127.0.0.1:33750 dest: /127.0.0.1:34059
14:51:11.219 INFO clienttrace - src: /127.0.0.1:33750, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741827_1003, duration(ns): 740755
14:51:11.219 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741827_1003, type=LAST_IN_PIPELINE terminating
14:51:11.220 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:11.224 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0 dst=null perm=null proto=rpc
14:51:11.228 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0 dst=null perm=null proto=rpc
14:51:11.229 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/task_202603041451108405695383951148878_0082_r_000000 dst=null perm=null proto=rpc
14:51:11.235 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/_temporary/attempt_202603041451108405695383951148878_0082_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/task_202603041451108405695383951148878_0082_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:11.236 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451108405695383951148878_0082_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/task_202603041451108405695383951148878_0082_r_000000
14:51:11.237 INFO SparkHadoopMapRedUtil - attempt_202603041451108405695383951148878_0082_r_000000_0: Committed. Elapsed time: 9 ms.
14:51:11.238 INFO Executor - Finished task 0.0 in stage 30.0 (TID 68). 1858 bytes result sent to driver
14:51:11.241 INFO TaskSetManager - Finished task 0.0 in stage 30.0 (TID 68) in 681 ms on localhost (executor driver) (1/1)
14:51:11.241 INFO TaskSchedulerImpl - Removed TaskSet 30.0, whose tasks have all completed, from pool
14:51:11.241 INFO DAGScheduler - ResultStage 30 (runJob at SparkHadoopWriter.scala:83) finished in 0.699 s
14:51:11.242 INFO DAGScheduler - Job 20 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:11.242 INFO TaskSchedulerImpl - Killing all running tasks in stage 30: Stage finished
14:51:11.242 INFO DAGScheduler - Job 20 finished: runJob at SparkHadoopWriter.scala:83, took 0.863534 s
14:51:11.244 INFO SparkHadoopWriter - Start to commit write Job job_202603041451108405695383951148878_0082.
14:51:11.247 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:11.251 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts dst=null perm=null proto=rpc
14:51:11.252 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/task_202603041451108405695383951148878_0082_r_000000 dst=null perm=null proto=rpc
14:51:11.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:11.257 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/task_202603041451108405695383951148878_0082_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.258 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:11.259 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/task_202603041451108405695383951148878_0082_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.260 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:11.261 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary/0/task_202603041451108405695383951148878_0082_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.270 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:11.273 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.274 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:11.277 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.spark-staging-82 dst=null perm=null proto=rpc
14:51:11.278 INFO SparkHadoopWriter - Write Job job_202603041451108405695383951148878_0082 committed. Elapsed time: 33 ms.
14:51:11.279 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.282 INFO StateChange - BLOCK* allocate blk_1073741828_1004, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/header
14:51:11.283 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741828_1004 src: /127.0.0.1:33760 dest: /127.0.0.1:34059
14:51:11.285 INFO clienttrace - src: /127.0.0.1:33760, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741828_1004, duration(ns): 947916
14:51:11.285 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741828_1004, type=LAST_IN_PIPELINE terminating
14:51:11.287 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:11.288 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.289 INFO StateChange - BLOCK* allocate blk_1073741829_1005, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/terminator
14:51:11.291 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741829_1005 src: /127.0.0.1:33776 dest: /127.0.0.1:34059
14:51:11.292 INFO clienttrace - src: /127.0.0.1:33776, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741829_1005, duration(ns): 763132
14:51:11.292 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741829_1005, type=LAST_IN_PIPELINE terminating
14:51:11.293 INFO FSNamesystem - BLOCK* blk_1073741829_1005 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/terminator
14:51:11.695 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:11.697 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts dst=null perm=null proto=rpc
14:51:11.701 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.702 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:11.703 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam
14:51:11.706 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.707 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:11.708 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.709 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam done
14:51:11.709 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:11.711 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.sbi
14:51:11.712 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts dst=null perm=null proto=rpc
14:51:11.714 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:11.720 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:11.750 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:11.752 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:11.754 INFO StateChange - BLOCK* allocate blk_1073741830_1006, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.sbi
14:51:11.755 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741830_1006 src: /127.0.0.1:33792 dest: /127.0.0.1:34059
14:51:11.757 INFO clienttrace - src: /127.0.0.1:33792, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741830_1006, duration(ns): 819151
14:51:11.757 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741830_1006, type=LAST_IN_PIPELINE terminating
14:51:11.759 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:11.760 INFO IndexFileMerger - Done merging .sbi files
14:51:11.761 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai
14:51:11.762 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts dst=null perm=null proto=rpc
14:51:11.763 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:11.765 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:11.766 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:11.768 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:11.770 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:11.778 INFO StateChange - BLOCK* allocate blk_1073741831_1007, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai
14:51:11.780 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741831_1007 src: /127.0.0.1:33804 dest: /127.0.0.1:34059
14:51:11.781 INFO clienttrace - src: /127.0.0.1:33804, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741831_1007, duration(ns): 788550
14:51:11.782 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741831_1007, type=LAST_IN_PIPELINE terminating
14:51:11.783 INFO FSNamesystem - BLOCK* blk_1073741831_1007 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai
14:51:12.184 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:12.185 INFO IndexFileMerger - Done merging .bai files
14:51:12.186 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.parts dst=null perm=null proto=rpc
14:51:12.196 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=null proto=rpc
14:51:12.205 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.sbi dst=null perm=null proto=rpc
14:51:12.206 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.sbi dst=null perm=null proto=rpc
14:51:12.207 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.sbi dst=null perm=null proto=rpc
14:51:12.210 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:12.210 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:12.211 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:12.212 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:12.212 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:12.214 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=null proto=rpc
14:51:12.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=null proto=rpc
14:51:12.216 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=null proto=rpc
14:51:12.218 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:12.222 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:12.223 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:12.224 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.sbi dst=null perm=null proto=rpc
14:51:12.224 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:12.225 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.sbi dst=null perm=null proto=rpc
14:51:12.226 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.sbi dst=null perm=null proto=rpc
14:51:12.227 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:12.228 INFO MemoryStore - Block broadcast_40 stored as values in memory (estimated size 320.0 B, free 1918.0 MiB)
14:51:12.229 INFO MemoryStore - Block broadcast_40_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.0 MiB)
14:51:12.229 INFO BlockManagerInfo - Added broadcast_40_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.7 MiB)
14:51:12.230 INFO SparkContext - Created broadcast 40 from broadcast at BamSource.java:104
14:51:12.233 INFO MemoryStore - Block broadcast_41 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
14:51:12.241 INFO MemoryStore - Block broadcast_41_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
14:51:12.241 INFO BlockManagerInfo - Added broadcast_41_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:12.241 INFO SparkContext - Created broadcast 41 from newAPIHadoopFile at PathSplitSource.java:96
14:51:12.267 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:12.268 INFO FileInputFormat - Total input files to process : 1
14:51:12.270 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:12.307 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:12.308 INFO DAGScheduler - Got job 21 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:12.308 INFO DAGScheduler - Final stage: ResultStage 31 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:12.308 INFO DAGScheduler - Parents of final stage: List()
14:51:12.308 INFO DAGScheduler - Missing parents: List()
14:51:12.308 INFO DAGScheduler - Submitting ResultStage 31 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:12.318 INFO MemoryStore - Block broadcast_42 stored as values in memory (estimated size 148.2 KiB, free 1917.5 MiB)
14:51:12.319 INFO MemoryStore - Block broadcast_42_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.5 MiB)
14:51:12.319 INFO BlockManagerInfo - Added broadcast_42_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.6 MiB)
14:51:12.319 INFO SparkContext - Created broadcast 42 from broadcast at DAGScheduler.scala:1580
14:51:12.320 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 31 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:12.320 INFO TaskSchedulerImpl - Adding task set 31.0 with 1 tasks resource profile 0
14:51:12.321 INFO TaskSetManager - Starting task 0.0 in stage 31.0 (TID 69) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:12.321 INFO Executor - Running task 0.0 in stage 31.0 (TID 69)
14:51:12.336 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam:0+237038
14:51:12.338 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:12.339 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:12.341 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=null proto=rpc
14:51:12.342 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=null proto=rpc
14:51:12.343 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=null proto=rpc
14:51:12.345 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:12.349 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:12.350 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:12.354 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:12.361 INFO Executor - Finished task 0.0 in stage 31.0 (TID 69). 651526 bytes result sent to driver
14:51:12.367 INFO TaskSetManager - Finished task 0.0 in stage 31.0 (TID 69) in 45 ms on localhost (executor driver) (1/1)
14:51:12.367 INFO TaskSchedulerImpl - Removed TaskSet 31.0, whose tasks have all completed, from pool
14:51:12.367 INFO DAGScheduler - ResultStage 31 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.058 s
14:51:12.367 INFO DAGScheduler - Job 21 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:12.367 INFO TaskSchedulerImpl - Killing all running tasks in stage 31: Stage finished
14:51:12.367 INFO DAGScheduler - Job 21 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.060366 s
14:51:12.394 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:12.395 INFO DAGScheduler - Got job 22 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:12.395 INFO DAGScheduler - Final stage: ResultStage 32 (count at ReadsSparkSinkUnitTest.java:185)
14:51:12.395 INFO DAGScheduler - Parents of final stage: List()
14:51:12.395 INFO DAGScheduler - Missing parents: List()
14:51:12.395 INFO DAGScheduler - Submitting ResultStage 32 (MapPartitionsRDD[70] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:12.417 INFO MemoryStore - Block broadcast_43 stored as values in memory (estimated size 426.1 KiB, free 1917.1 MiB)
14:51:12.418 INFO MemoryStore - Block broadcast_43_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
14:51:12.418 INFO BlockManagerInfo - Added broadcast_43_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.4 MiB)
14:51:12.419 INFO SparkContext - Created broadcast 43 from broadcast at DAGScheduler.scala:1580
14:51:12.419 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 32 (MapPartitionsRDD[70] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:12.419 INFO TaskSchedulerImpl - Adding task set 32.0 with 1 tasks resource profile 0
14:51:12.420 INFO TaskSetManager - Starting task 0.0 in stage 32.0 (TID 70) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:12.421 INFO Executor - Running task 0.0 in stage 32.0 (TID 70)
14:51:12.462 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:12.496 INFO Executor - Finished task 0.0 in stage 32.0 (TID 70). 989 bytes result sent to driver
14:51:12.496 INFO TaskSetManager - Finished task 0.0 in stage 32.0 (TID 70) in 76 ms on localhost (executor driver) (1/1)
14:51:12.496 INFO TaskSchedulerImpl - Removed TaskSet 32.0, whose tasks have all completed, from pool
14:51:12.497 INFO DAGScheduler - ResultStage 32 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.100 s
14:51:12.497 INFO DAGScheduler - Job 22 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:12.497 INFO TaskSchedulerImpl - Killing all running tasks in stage 32: Stage finished
14:51:12.497 INFO DAGScheduler - Job 22 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.102346 s
14:51:12.502 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:12.503 INFO DAGScheduler - Got job 23 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:12.503 INFO DAGScheduler - Final stage: ResultStage 33 (count at ReadsSparkSinkUnitTest.java:185)
14:51:12.503 INFO DAGScheduler - Parents of final stage: List()
14:51:12.503 INFO DAGScheduler - Missing parents: List()
14:51:12.503 INFO DAGScheduler - Submitting ResultStage 33 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:12.511 INFO MemoryStore - Block broadcast_44 stored as values in memory (estimated size 148.1 KiB, free 1916.8 MiB)
14:51:12.512 INFO MemoryStore - Block broadcast_44_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.7 MiB)
14:51:12.513 INFO BlockManagerInfo - Added broadcast_44_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.3 MiB)
14:51:12.513 INFO SparkContext - Created broadcast 44 from broadcast at DAGScheduler.scala:1580
14:51:12.513 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 33 (MapPartitionsRDD[88] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:12.513 INFO TaskSchedulerImpl - Adding task set 33.0 with 1 tasks resource profile 0
14:51:12.514 INFO TaskSetManager - Starting task 0.0 in stage 33.0 (TID 71) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:12.514 INFO Executor - Running task 0.0 in stage 33.0 (TID 71)
14:51:12.529 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam:0+237038
14:51:12.531 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:12.532 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam dst=null perm=null proto=rpc
14:51:12.533 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=null proto=rpc
14:51:12.534 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=null proto=rpc
14:51:12.535 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d59fbdb1-8010-45a9-b77b-727a20ececdf.bam.bai dst=null perm=null proto=rpc
14:51:12.537 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:12.540 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:12.541 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:12.544 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:12.547 INFO Executor - Finished task 0.0 in stage 33.0 (TID 71). 989 bytes result sent to driver
14:51:12.548 INFO TaskSetManager - Finished task 0.0 in stage 33.0 (TID 71) in 34 ms on localhost (executor driver) (1/1)
14:51:12.548 INFO TaskSchedulerImpl - Removed TaskSet 33.0, whose tasks have all completed, from pool
14:51:12.549 INFO DAGScheduler - ResultStage 33 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.045 s
14:51:12.549 INFO DAGScheduler - Job 23 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:12.549 INFO TaskSchedulerImpl - Killing all running tasks in stage 33: Stage finished
14:51:12.549 INFO DAGScheduler - Job 23 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.046483 s
14:51:12.555 INFO MemoryStore - Block broadcast_45 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
14:51:12.579 INFO BlockManagerInfo - Removed broadcast_44_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.4 MiB)
14:51:12.580 INFO BlockManagerInfo - Removed broadcast_42_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.5 MiB)
14:51:12.582 INFO MemoryStore - Block broadcast_45_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.8 MiB)
14:51:12.582 INFO BlockManagerInfo - Added broadcast_45_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:12.582 INFO BlockManagerInfo - Removed broadcast_39_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.5 MiB)
14:51:12.585 INFO BlockManagerInfo - Removed broadcast_40_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1919.5 MiB)
14:51:12.585 INFO SparkContext - Created broadcast 45 from newAPIHadoopFile at PathSplitSource.java:96
14:51:12.588 INFO BlockManagerInfo - Removed broadcast_34_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:12.591 INFO BlockManagerInfo - Removed broadcast_41_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:12.593 INFO BlockManagerInfo - Removed broadcast_43_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:12.595 INFO BlockManagerInfo - Removed broadcast_35_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:12.597 INFO BlockManagerInfo - Removed broadcast_36_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:12.598 INFO BlockManagerInfo - Removed broadcast_37_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:12.599 INFO BlockManagerInfo - Removed broadcast_38_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1920.0 MiB)
14:51:12.622 INFO MemoryStore - Block broadcast_46 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
14:51:12.630 INFO MemoryStore - Block broadcast_46_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
14:51:12.630 INFO BlockManagerInfo - Added broadcast_46_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:12.631 INFO SparkContext - Created broadcast 46 from newAPIHadoopFile at PathSplitSource.java:96
14:51:12.660 INFO FileInputFormat - Total input files to process : 1
14:51:12.673 INFO MemoryStore - Block broadcast_47 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
14:51:12.677 INFO MemoryStore - Block broadcast_47_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
14:51:12.677 INFO BlockManagerInfo - Added broadcast_47_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.9 MiB)
14:51:12.677 INFO SparkContext - Created broadcast 47 from broadcast at ReadsSparkSink.java:133
14:51:12.680 INFO MemoryStore - Block broadcast_48 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
14:51:12.682 INFO MemoryStore - Block broadcast_48_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
14:51:12.682 INFO BlockManagerInfo - Added broadcast_48_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.9 MiB)
14:51:12.682 INFO SparkContext - Created broadcast 48 from broadcast at BamSink.java:76
14:51:12.686 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts dst=null perm=null proto=rpc
14:51:12.686 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:12.686 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:12.686 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:12.688 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:12.696 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:12.697 INFO DAGScheduler - Registering RDD 102 (mapToPair at SparkUtils.java:161) as input to shuffle 8
14:51:12.697 INFO DAGScheduler - Got job 24 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:12.697 INFO DAGScheduler - Final stage: ResultStage 35 (runJob at SparkHadoopWriter.scala:83)
14:51:12.697 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 34)
14:51:12.697 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 34)
14:51:12.698 INFO DAGScheduler - Submitting ShuffleMapStage 34 (MapPartitionsRDD[102] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:12.722 INFO MemoryStore - Block broadcast_49 stored as values in memory (estimated size 520.4 KiB, free 1918.5 MiB)
14:51:12.724 INFO MemoryStore - Block broadcast_49_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.3 MiB)
14:51:12.724 INFO BlockManagerInfo - Added broadcast_49_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.7 MiB)
14:51:12.724 INFO SparkContext - Created broadcast 49 from broadcast at DAGScheduler.scala:1580
14:51:12.724 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 34 (MapPartitionsRDD[102] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:12.725 INFO TaskSchedulerImpl - Adding task set 34.0 with 1 tasks resource profile 0
14:51:12.726 INFO TaskSetManager - Starting task 0.0 in stage 34.0 (TID 72) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:12.726 INFO Executor - Running task 0.0 in stage 34.0 (TID 72)
14:51:12.776 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:12.806 INFO Executor - Finished task 0.0 in stage 34.0 (TID 72). 1148 bytes result sent to driver
14:51:12.807 INFO TaskSetManager - Finished task 0.0 in stage 34.0 (TID 72) in 81 ms on localhost (executor driver) (1/1)
14:51:12.807 INFO TaskSchedulerImpl - Removed TaskSet 34.0, whose tasks have all completed, from pool
14:51:12.807 INFO DAGScheduler - ShuffleMapStage 34 (mapToPair at SparkUtils.java:161) finished in 0.109 s
14:51:12.807 INFO DAGScheduler - looking for newly runnable stages
14:51:12.807 INFO DAGScheduler - running: HashSet()
14:51:12.807 INFO DAGScheduler - waiting: HashSet(ResultStage 35)
14:51:12.807 INFO DAGScheduler - failed: HashSet()
14:51:12.807 INFO DAGScheduler - Submitting ResultStage 35 (MapPartitionsRDD[107] at mapToPair at BamSink.java:91), which has no missing parents
14:51:12.817 INFO MemoryStore - Block broadcast_50 stored as values in memory (estimated size 241.5 KiB, free 1918.1 MiB)
14:51:12.818 INFO MemoryStore - Block broadcast_50_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.0 MiB)
14:51:12.818 INFO BlockManagerInfo - Added broadcast_50_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.7 MiB)
14:51:12.819 INFO SparkContext - Created broadcast 50 from broadcast at DAGScheduler.scala:1580
14:51:12.819 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 35 (MapPartitionsRDD[107] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:12.819 INFO TaskSchedulerImpl - Adding task set 35.0 with 1 tasks resource profile 0
14:51:12.820 INFO TaskSetManager - Starting task 0.0 in stage 35.0 (TID 73) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:12.821 INFO Executor - Running task 0.0 in stage 35.0 (TID 73)
14:51:12.833 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:12.833 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:12.864 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:12.864 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:12.864 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:12.864 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:12.864 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:12.864 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:12.866 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:12.868 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:12.870 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:12.877 INFO StateChange - BLOCK* allocate blk_1073741832_1008, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0/part-r-00000
14:51:12.881 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741832_1008 src: /127.0.0.1:33824 dest: /127.0.0.1:34059
14:51:12.884 INFO clienttrace - src: /127.0.0.1:33824, dest: /127.0.0.1:34059, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741832_1008, duration(ns): 1477853
14:51:12.884 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741832_1008, type=LAST_IN_PIPELINE terminating
14:51:12.886 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:12.887 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:12.888 INFO StateChange - BLOCK* allocate blk_1073741833_1009, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0/.part-r-00000.sbi
14:51:12.889 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741833_1009 src: /127.0.0.1:33840 dest: /127.0.0.1:34059
14:51:12.891 INFO clienttrace - src: /127.0.0.1:33840, dest: /127.0.0.1:34059, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741833_1009, duration(ns): 878280
14:51:12.891 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741833_1009, type=LAST_IN_PIPELINE terminating
14:51:12.893 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:12.895 INFO StateChange - BLOCK* allocate blk_1073741834_1010, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0/.part-r-00000.bai
14:51:12.897 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741834_1010 src: /127.0.0.1:33844 dest: /127.0.0.1:34059
14:51:12.898 INFO clienttrace - src: /127.0.0.1:33844, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741834_1010, duration(ns): 685647
14:51:12.899 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741834_1010, type=LAST_IN_PIPELINE terminating
14:51:12.900 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:12.901 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0 dst=null perm=null proto=rpc
14:51:12.902 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0 dst=null perm=null proto=rpc
14:51:12.903 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/task_20260304145112369924135368217136_0107_r_000000 dst=null perm=null proto=rpc
14:51:12.905 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/_temporary/attempt_20260304145112369924135368217136_0107_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/task_20260304145112369924135368217136_0107_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:12.905 INFO FileOutputCommitter - Saved output of task 'attempt_20260304145112369924135368217136_0107_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/task_20260304145112369924135368217136_0107_r_000000
14:51:12.905 INFO SparkHadoopMapRedUtil - attempt_20260304145112369924135368217136_0107_r_000000_0: Committed. Elapsed time: 3 ms.
14:51:12.906 INFO Executor - Finished task 0.0 in stage 35.0 (TID 73). 1858 bytes result sent to driver
14:51:12.908 INFO TaskSetManager - Finished task 0.0 in stage 35.0 (TID 73) in 88 ms on localhost (executor driver) (1/1)
14:51:12.908 INFO TaskSchedulerImpl - Removed TaskSet 35.0, whose tasks have all completed, from pool
14:51:12.908 INFO DAGScheduler - ResultStage 35 (runJob at SparkHadoopWriter.scala:83) finished in 0.100 s
14:51:12.908 INFO DAGScheduler - Job 24 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:12.909 INFO TaskSchedulerImpl - Killing all running tasks in stage 35: Stage finished
14:51:12.909 INFO DAGScheduler - Job 24 finished: runJob at SparkHadoopWriter.scala:83, took 0.212745 s
14:51:12.910 INFO SparkHadoopWriter - Start to commit write Job job_20260304145112369924135368217136_0107.
14:51:12.910 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:12.911 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts dst=null perm=null proto=rpc
14:51:12.912 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/task_20260304145112369924135368217136_0107_r_000000 dst=null perm=null proto=rpc
14:51:12.913 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:12.914 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/task_20260304145112369924135368217136_0107_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:12.915 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:12.916 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/task_20260304145112369924135368217136_0107_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:12.916 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:12.917 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary/0/task_20260304145112369924135368217136_0107_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:12.918 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:12.919 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:12.921 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:12.922 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.spark-staging-107 dst=null perm=null proto=rpc
14:51:12.922 INFO SparkHadoopWriter - Write Job job_20260304145112369924135368217136_0107 committed. Elapsed time: 12 ms.
14:51:12.923 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:12.925 INFO StateChange - BLOCK* allocate blk_1073741835_1011, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/header
14:51:12.927 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741835_1011 src: /127.0.0.1:33852 dest: /127.0.0.1:34059
14:51:12.928 INFO clienttrace - src: /127.0.0.1:33852, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741835_1011, duration(ns): 733998
14:51:12.929 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741835_1011, type=LAST_IN_PIPELINE terminating
14:51:12.930 INFO FSNamesystem - BLOCK* blk_1073741835_1011 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/header
14:51:13.031 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741826_1002 replica FinalizedReplica, blk_1073741826_1002, FINALIZED
  getNumBytes() = 212
  getBytesOnDisk() = 212
  getVisibleLength()= 212
  getVolume() = /tmp/minicluster_storage16268522075870465194/data/data2
  getBlockURI() = file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741826 for deletion
14:51:13.032 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741827_1003 replica FinalizedReplica, blk_1073741827_1003, FINALIZED
  getNumBytes() = 5472
  getBytesOnDisk() = 5472
  getVisibleLength()= 5472
  getVolume() = /tmp/minicluster_storage16268522075870465194/data/data1
  getBlockURI() = file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741827 for deletion
14:51:13.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741826_1002 URI file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741826
14:51:13.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741827_1003 URI file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741827
14:51:13.331 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:13.333 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:13.334 INFO StateChange - BLOCK* allocate blk_1073741836_1012, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/terminator
14:51:13.336 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741836_1012 src: /127.0.0.1:33856 dest: /127.0.0.1:34059
14:51:13.337 INFO clienttrace - src: /127.0.0.1:33856, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741836_1012, duration(ns): 777636
14:51:13.338 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741836_1012, type=LAST_IN_PIPELINE terminating
14:51:13.338 INFO FSNamesystem - BLOCK* blk_1073741836_1012 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/terminator
14:51:13.740 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:13.741 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts dst=null perm=null proto=rpc
14:51:13.743 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:13.744 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:13.745 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam
14:51:13.746 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:13.746 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:13.747 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:13.748 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam done
14:51:13.748 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:13.748 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.sbi
14:51:13.749 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts dst=null perm=null proto=rpc
14:51:13.750 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:13.752 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:13.753 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:13.756 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:13.757 INFO StateChange - BLOCK* allocate blk_1073741837_1013, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.sbi
14:51:13.759 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741837_1013 src: /127.0.0.1:33864 dest: /127.0.0.1:34059
14:51:13.761 INFO clienttrace - src: /127.0.0.1:33864, dest: /127.0.0.1:34059, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741837_1013, duration(ns): 858186
14:51:13.761 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741837_1013, type=LAST_IN_PIPELINE terminating
14:51:13.762 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:13.763 INFO IndexFileMerger - Done merging .sbi files
14:51:13.763 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai
14:51:13.764 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts dst=null perm=null proto=rpc
14:51:13.765 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:13.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:13.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:13.769 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:13.770 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:13.776 INFO StateChange - BLOCK* allocate blk_1073741838_1014, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai
14:51:13.778 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741838_1014 src: /127.0.0.1:33878 dest: /127.0.0.1:34059
14:51:13.779 INFO clienttrace - src: /127.0.0.1:33878, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741838_1014, duration(ns): 741478
14:51:13.780 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741838_1014, type=LAST_IN_PIPELINE terminating
14:51:13.780 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:13.781 INFO IndexFileMerger - Done merging .bai files
14:51:13.782 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.parts dst=null perm=null proto=rpc
14:51:13.792 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=null proto=rpc
14:51:13.801 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.sbi dst=null perm=null proto=rpc
14:51:13.802 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.sbi dst=null perm=null proto=rpc
14:51:13.803 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.sbi dst=null perm=null proto=rpc
14:51:13.805 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
14:51:13.806 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:13.807 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:13.807 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:13.808 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:13.810 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=null proto=rpc
14:51:13.810 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=null proto=rpc
14:51:13.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=null proto=rpc
14:51:13.816 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:13.817 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:13.818 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.sbi dst=null perm=null proto=rpc
14:51:13.819 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.sbi dst=null perm=null proto=rpc
14:51:13.819 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.sbi dst=null perm=null proto=rpc
14:51:13.821 INFO MemoryStore - Block broadcast_51 stored as values in memory (estimated size 13.3 KiB, free 1918.0 MiB)
14:51:13.823 INFO MemoryStore - Block broadcast_51_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.0 MiB)
14:51:13.823 INFO BlockManagerInfo - Added broadcast_51_piece0 in memory on localhost:44923 (size: 8.3 KiB, free: 1919.6 MiB)
14:51:13.823 INFO SparkContext - Created broadcast 51 from broadcast at BamSource.java:104
14:51:13.825 INFO MemoryStore - Block broadcast_52 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
14:51:13.833 INFO MemoryStore - Block broadcast_52_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
14:51:13.833 INFO BlockManagerInfo - Added broadcast_52_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:13.833 INFO SparkContext - Created broadcast 52 from newAPIHadoopFile at PathSplitSource.java:96
14:51:13.846 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:13.846 INFO FileInputFormat - Total input files to process : 1
14:51:13.847 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:13.875 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:13.875 INFO DAGScheduler - Got job 25 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:13.875 INFO DAGScheduler - Final stage: ResultStage 36 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:13.875 INFO DAGScheduler - Parents of final stage: List()
14:51:13.875 INFO DAGScheduler - Missing parents: List()
14:51:13.876 INFO DAGScheduler - Submitting ResultStage 36 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:13.883 INFO MemoryStore - Block broadcast_53 stored as values in memory (estimated size 148.2 KiB, free 1917.5 MiB)
14:51:13.884 INFO MemoryStore - Block broadcast_53_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.5 MiB)
14:51:13.885 INFO BlockManagerInfo - Added broadcast_53_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.5 MiB)
14:51:13.885 INFO SparkContext - Created broadcast 53 from broadcast at DAGScheduler.scala:1580
14:51:13.886 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 36 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:13.886 INFO TaskSchedulerImpl - Adding task set 36.0 with 1 tasks resource profile 0
14:51:13.886 INFO TaskSetManager - Starting task 0.0 in stage 36.0 (TID 74) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:13.887 INFO Executor - Running task 0.0 in stage 36.0 (TID 74)
14:51:13.904 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam:0+237038
14:51:13.906 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:13.907 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:13.908 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=null proto=rpc
14:51:13.909 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=null proto=rpc
14:51:13.909 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=null proto=rpc
14:51:13.912 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:13.915 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:13.916 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:13.918 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:13.918 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:13.923 INFO Executor - Finished task 0.0 in stage 36.0 (TID 74). 651526 bytes result sent to driver
14:51:13.925 INFO TaskSetManager - Finished task 0.0 in stage 36.0 (TID 74) in 39 ms on localhost (executor driver) (1/1)
14:51:13.926 INFO TaskSchedulerImpl - Removed TaskSet 36.0, whose tasks have all completed, from pool
14:51:13.926 INFO DAGScheduler - ResultStage 36 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.050 s
14:51:13.926 INFO DAGScheduler - Job 25 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:13.926 INFO TaskSchedulerImpl - Killing all running tasks in stage 36: Stage finished
14:51:13.926 INFO DAGScheduler - Job 25 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.051592 s
14:51:13.948 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:13.949 INFO DAGScheduler - Got job 26 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:13.949 INFO DAGScheduler - Final stage: ResultStage 37 (count at ReadsSparkSinkUnitTest.java:185)
14:51:13.949 INFO DAGScheduler - Parents of final stage: List()
14:51:13.949 INFO DAGScheduler - Missing parents: List()
14:51:13.949 INFO DAGScheduler - Submitting ResultStage 37 (MapPartitionsRDD[95] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:13.968 INFO MemoryStore - Block broadcast_54 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
14:51:13.970 INFO MemoryStore - Block broadcast_54_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
14:51:13.970 INFO BlockManagerInfo - Added broadcast_54_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.4 MiB)
14:51:13.970 INFO SparkContext - Created broadcast 54 from broadcast at DAGScheduler.scala:1580
14:51:13.971 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 37 (MapPartitionsRDD[95] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:13.971 INFO TaskSchedulerImpl - Adding task set 37.0 with 1 tasks resource profile 0
14:51:13.971 INFO TaskSetManager - Starting task 0.0 in stage 37.0 (TID 75) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:13.972 INFO Executor - Running task 0.0 in stage 37.0 (TID 75)
14:51:14.011 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:14.025 INFO Executor - Finished task 0.0 in stage 37.0 (TID 75). 989 bytes result sent to driver
14:51:14.026 INFO TaskSetManager - Finished task 0.0 in stage 37.0 (TID 75) in 55 ms on localhost (executor driver) (1/1)
14:51:14.026 INFO TaskSchedulerImpl - Removed TaskSet 37.0, whose tasks have all completed, from pool
14:51:14.026 INFO DAGScheduler - ResultStage 37 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.076 s
14:51:14.026 INFO DAGScheduler - Job 26 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:14.026 INFO TaskSchedulerImpl - Killing all running tasks in stage 37: Stage finished
14:51:14.026 INFO DAGScheduler - Job 26 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.078178 s
14:51:14.031 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:14.031 INFO DAGScheduler - Got job 27 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:14.031 INFO DAGScheduler - Final stage: ResultStage 38 (count at ReadsSparkSinkUnitTest.java:185)
14:51:14.031 INFO DAGScheduler - Parents of final stage: List()
14:51:14.031 INFO DAGScheduler - Missing parents: List()
14:51:14.032 INFO DAGScheduler - Submitting ResultStage 38 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:14.043 INFO MemoryStore - Block broadcast_55 stored as values in memory (estimated size 148.1 KiB, free 1916.7 MiB)
14:51:14.044 INFO MemoryStore - Block broadcast_55_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.7 MiB)
14:51:14.044 INFO BlockManagerInfo - Added broadcast_55_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.3 MiB)
14:51:14.045 INFO SparkContext - Created broadcast 55 from broadcast at DAGScheduler.scala:1580
14:51:14.045 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 38 (MapPartitionsRDD[113] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:14.045 INFO TaskSchedulerImpl - Adding task set 38.0 with 1 tasks resource profile 0
14:51:14.046 INFO TaskSetManager - Starting task 0.0 in stage 38.0 (TID 76) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:14.046 INFO Executor - Running task 0.0 in stage 38.0 (TID 76)
14:51:14.062 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam:0+237038
14:51:14.063 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:14.064 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam dst=null perm=null proto=rpc
14:51:14.065 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=null proto=rpc
14:51:14.066 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=null proto=rpc
14:51:14.067 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_d5756aa4-f67c-4082-8d5c-02e679f31e67.bam.bai dst=null perm=null proto=rpc
14:51:14.069 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:14.071 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:14.072 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:14.074 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:14.077 INFO Executor - Finished task 0.0 in stage 38.0 (TID 76). 989 bytes result sent to driver
14:51:14.078 INFO TaskSetManager - Finished task 0.0 in stage 38.0 (TID 76) in 32 ms on localhost (executor driver) (1/1)
14:51:14.078 INFO TaskSchedulerImpl - Removed TaskSet 38.0, whose tasks have all completed, from pool
14:51:14.078 INFO DAGScheduler - ResultStage 38 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.046 s
14:51:14.078 INFO DAGScheduler - Job 27 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:14.078 INFO TaskSchedulerImpl - Killing all running tasks in stage 38: Stage finished
14:51:14.078 INFO DAGScheduler - Job 27 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.047677 s
14:51:14.084 INFO MemoryStore - Block broadcast_56 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
14:51:14.094 INFO MemoryStore - Block broadcast_56_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:14.094 INFO BlockManagerInfo - Added broadcast_56_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:14.094 INFO SparkContext - Created broadcast 56 from newAPIHadoopFile at PathSplitSource.java:96
14:51:14.122 INFO MemoryStore - Block broadcast_57 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
14:51:14.129 INFO MemoryStore - Block broadcast_57_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
14:51:14.129 INFO BlockManagerInfo - Added broadcast_57_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.2 MiB)
14:51:14.129 INFO SparkContext - Created broadcast 57 from newAPIHadoopFile at PathSplitSource.java:96
14:51:14.165 INFO FileInputFormat - Total input files to process : 1
14:51:14.168 INFO MemoryStore - Block broadcast_58 stored as values in memory (estimated size 160.7 KiB, free 1915.9 MiB)
14:51:14.184 INFO MemoryStore - Block broadcast_58_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
14:51:14.184 INFO BlockManagerInfo - Removed broadcast_46_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:14.184 INFO BlockManagerInfo - Added broadcast_58_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:14.184 INFO SparkContext - Created broadcast 58 from broadcast at ReadsSparkSink.java:133
14:51:14.185 INFO BlockManagerInfo - Removed broadcast_54_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.4 MiB)
14:51:14.186 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:14.186 INFO BlockManagerInfo - Removed broadcast_52_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:14.187 INFO BlockManagerInfo - Removed broadcast_53_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.5 MiB)
14:51:14.187 INFO BlockManagerInfo - Removed broadcast_47_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:14.188 INFO MemoryStore - Block broadcast_59 stored as values in memory (estimated size 163.2 KiB, free 1917.3 MiB)
14:51:14.189 INFO BlockManagerInfo - Removed broadcast_57_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:14.190 INFO BlockManagerInfo - Removed broadcast_50_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.7 MiB)
14:51:14.190 INFO MemoryStore - Block broadcast_59_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
14:51:14.190 INFO BlockManagerInfo - Added broadcast_59_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.7 MiB)
14:51:14.191 INFO BlockManagerInfo - Removed broadcast_48_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.7 MiB)
14:51:14.191 INFO SparkContext - Created broadcast 59 from broadcast at BamSink.java:76
14:51:14.192 INFO BlockManagerInfo - Removed broadcast_55_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.7 MiB)
14:51:14.193 INFO BlockManagerInfo - Removed broadcast_51_piece0 on localhost:44923 in memory (size: 8.3 KiB, free: 1919.7 MiB)
14:51:14.193 INFO BlockManagerInfo - Removed broadcast_49_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.9 MiB)
14:51:14.194 INFO BlockManagerInfo - Removed broadcast_45_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.9 MiB)
14:51:14.195 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts dst=null perm=null proto=rpc
14:51:14.196 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:14.196 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:14.196 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:14.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:14.204 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:14.204 INFO DAGScheduler - Registering RDD 127 (mapToPair at SparkUtils.java:161) as input to shuffle 9
14:51:14.205 INFO DAGScheduler - Got job 28 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:14.205 INFO DAGScheduler - Final stage: ResultStage 40 (runJob at SparkHadoopWriter.scala:83)
14:51:14.205 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 39)
14:51:14.205 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 39)
14:51:14.205 INFO DAGScheduler - Submitting ShuffleMapStage 39 (MapPartitionsRDD[127] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:14.235 INFO MemoryStore - Block broadcast_60 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
14:51:14.237 INFO MemoryStore - Block broadcast_60_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
14:51:14.237 INFO BlockManagerInfo - Added broadcast_60_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.8 MiB)
14:51:14.238 INFO SparkContext - Created broadcast 60 from broadcast at DAGScheduler.scala:1580
14:51:14.238 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 39 (MapPartitionsRDD[127] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:14.238 INFO TaskSchedulerImpl - Adding task set 39.0 with 1 tasks resource profile 0
14:51:14.239 INFO TaskSetManager - Starting task 0.0 in stage 39.0 (TID 77) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:14.240 INFO Executor - Running task 0.0 in stage 39.0 (TID 77)
14:51:14.287 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:14.309 INFO Executor - Finished task 0.0 in stage 39.0 (TID 77). 1148 bytes result sent to driver
14:51:14.310 INFO TaskSetManager - Finished task 0.0 in stage 39.0 (TID 77) in 71 ms on localhost (executor driver) (1/1)
14:51:14.310 INFO TaskSchedulerImpl - Removed TaskSet 39.0, whose tasks have all completed, from pool
14:51:14.310 INFO DAGScheduler - ShuffleMapStage 39 (mapToPair at SparkUtils.java:161) finished in 0.104 s
14:51:14.310 INFO DAGScheduler - looking for newly runnable stages
14:51:14.310 INFO DAGScheduler - running: HashSet()
14:51:14.311 INFO DAGScheduler - waiting: HashSet(ResultStage 40)
14:51:14.311 INFO DAGScheduler - failed: HashSet()
14:51:14.311 INFO DAGScheduler - Submitting ResultStage 40 (MapPartitionsRDD[132] at mapToPair at BamSink.java:91), which has no missing parents
14:51:14.319 INFO MemoryStore - Block broadcast_61 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
14:51:14.320 INFO MemoryStore - Block broadcast_61_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
14:51:14.320 INFO BlockManagerInfo - Added broadcast_61_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.7 MiB)
14:51:14.320 INFO SparkContext - Created broadcast 61 from broadcast at DAGScheduler.scala:1580
14:51:14.321 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 40 (MapPartitionsRDD[132] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:14.321 INFO TaskSchedulerImpl - Adding task set 40.0 with 1 tasks resource profile 0
14:51:14.322 INFO TaskSetManager - Starting task 0.0 in stage 40.0 (TID 78) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:14.322 INFO Executor - Running task 0.0 in stage 40.0 (TID 78)
14:51:14.329 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:14.329 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:14.347 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:14.347 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:14.347 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:14.347 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:14.347 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:14.347 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:14.349 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.351 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.355 INFO StateChange - BLOCK* allocate blk_1073741839_1015, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0/part-r-00000
14:51:14.357 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741839_1015 src: /127.0.0.1:33888 dest: /127.0.0.1:34059
14:51:14.361 INFO clienttrace - src: /127.0.0.1:33888, dest: /127.0.0.1:34059, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741839_1015, duration(ns): 2948352
14:51:14.361 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741839_1015, type=LAST_IN_PIPELINE terminating
14:51:14.362 INFO FSNamesystem - BLOCK* blk_1073741839_1015 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0/part-r-00000
14:51:14.763 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:14.765 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:14.768 INFO StateChange - BLOCK* allocate blk_1073741840_1016, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0/.part-r-00000.bai
14:51:14.769 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741840_1016 src: /127.0.0.1:33892 dest: /127.0.0.1:34059
14:51:14.771 INFO clienttrace - src: /127.0.0.1:33892, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741840_1016, duration(ns): 740519
14:51:14.771 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741840_1016, type=LAST_IN_PIPELINE terminating
14:51:14.773 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:14.774 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0 dst=null perm=null proto=rpc
14:51:14.775 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0 dst=null perm=null proto=rpc
14:51:14.775 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/task_202603041451145356872550912779652_0132_r_000000 dst=null perm=null proto=rpc
14:51:14.776 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/_temporary/attempt_202603041451145356872550912779652_0132_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/task_202603041451145356872550912779652_0132_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:14.777 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451145356872550912779652_0132_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/task_202603041451145356872550912779652_0132_r_000000
14:51:14.777 INFO SparkHadoopMapRedUtil - attempt_202603041451145356872550912779652_0132_r_000000_0: Committed. Elapsed time: 2 ms.
14:51:14.778 INFO Executor - Finished task 0.0 in stage 40.0 (TID 78). 1858 bytes result sent to driver
14:51:14.779 INFO TaskSetManager - Finished task 0.0 in stage 40.0 (TID 78) in 458 ms on localhost (executor driver) (1/1)
14:51:14.779 INFO TaskSchedulerImpl - Removed TaskSet 40.0, whose tasks have all completed, from pool
14:51:14.780 INFO DAGScheduler - ResultStage 40 (runJob at SparkHadoopWriter.scala:83) finished in 0.469 s
14:51:14.780 INFO DAGScheduler - Job 28 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:14.780 INFO TaskSchedulerImpl - Killing all running tasks in stage 40: Stage finished
14:51:14.780 INFO DAGScheduler - Job 28 finished: runJob at SparkHadoopWriter.scala:83, took 0.576263 s
14:51:14.781 INFO SparkHadoopWriter - Start to commit write Job job_202603041451145356872550912779652_0132.
14:51:14.782 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:14.782 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts dst=null perm=null proto=rpc
14:51:14.783 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/task_202603041451145356872550912779652_0132_r_000000 dst=null perm=null proto=rpc
14:51:14.784 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:14.785 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/task_202603041451145356872550912779652_0132_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.785 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:14.786 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary/0/task_202603041451145356872550912779652_0132_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.787 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:14.788 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.789 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:14.790 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/.spark-staging-132 dst=null perm=null proto=rpc
14:51:14.790 INFO SparkHadoopWriter - Write Job job_202603041451145356872550912779652_0132 committed. Elapsed time: 9 ms.
14:51:14.791 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.793 INFO StateChange - BLOCK* allocate blk_1073741841_1017, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/header
14:51:14.794 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741841_1017 src: /127.0.0.1:33894 dest: /127.0.0.1:34059
14:51:14.796 INFO clienttrace - src: /127.0.0.1:33894, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741841_1017, duration(ns): 665166
14:51:14.796 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741841_1017, type=LAST_IN_PIPELINE terminating
14:51:14.797 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:14.798 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.799 INFO StateChange - BLOCK* allocate blk_1073741842_1018, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/terminator
14:51:14.800 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741842_1018 src: /127.0.0.1:33908 dest: /127.0.0.1:34059
14:51:14.802 INFO clienttrace - src: /127.0.0.1:33908, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741842_1018, duration(ns): 595595
14:51:14.802 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741842_1018, type=LAST_IN_PIPELINE terminating
14:51:14.803 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:14.804 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts dst=null perm=null proto=rpc
14:51:14.806 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.807 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:14.807 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam
14:51:14.808 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.809 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:14.810 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.810 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam done
14:51:14.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:14.811 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai
14:51:14.812 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts dst=null perm=null proto=rpc
14:51:14.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:14.815 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:14.816 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:14.818 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:14.821 INFO StateChange - BLOCK* allocate blk_1073741843_1019, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai
14:51:14.823 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741843_1019 src: /127.0.0.1:33920 dest: /127.0.0.1:34059
14:51:14.826 INFO clienttrace - src: /127.0.0.1:33920, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741843_1019, duration(ns): 759124
14:51:14.826 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741843_1019, type=LAST_IN_PIPELINE terminating
14:51:14.828 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:14.828 INFO IndexFileMerger - Done merging .bai files
14:51:14.829 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.parts dst=null perm=null proto=rpc
14:51:14.843 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:14.844 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:14.845 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:14.846 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:14.847 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:14.848 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:14.849 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:14.850 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:14.852 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:14.855 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:14.856 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:14.856 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:14.856 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.sbi dst=null perm=null proto=rpc
14:51:14.859 INFO MemoryStore - Block broadcast_62 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:14.866 INFO MemoryStore - Block broadcast_62_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:14.866 INFO BlockManagerInfo - Added broadcast_62_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:14.866 INFO SparkContext - Created broadcast 62 from newAPIHadoopFile at PathSplitSource.java:96
14:51:14.891 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:14.891 INFO FileInputFormat - Total input files to process : 1
14:51:14.892 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:14.934 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:14.934 INFO DAGScheduler - Got job 29 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:14.934 INFO DAGScheduler - Final stage: ResultStage 41 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:14.934 INFO DAGScheduler - Parents of final stage: List()
14:51:14.935 INFO DAGScheduler - Missing parents: List()
14:51:14.935 INFO DAGScheduler - Submitting ResultStage 41 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:14.953 INFO MemoryStore - Block broadcast_63 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
14:51:14.955 INFO MemoryStore - Block broadcast_63_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1917.4 MiB)
14:51:14.955 INFO BlockManagerInfo - Added broadcast_63_piece0 in memory on localhost:44923 (size: 153.7 KiB, free: 1919.5 MiB)
14:51:14.955 INFO SparkContext - Created broadcast 63 from broadcast at DAGScheduler.scala:1580
14:51:14.956 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 41 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:14.956 INFO TaskSchedulerImpl - Adding task set 41.0 with 1 tasks resource profile 0
14:51:14.957 INFO TaskSetManager - Starting task 0.0 in stage 41.0 (TID 79) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:14.957 INFO Executor - Running task 0.0 in stage 41.0 (TID 79)
14:51:15.004 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam:0+237038
14:51:15.006 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.007 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.009 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:15.010 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.010 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.012 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.012 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.013 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.015 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:15.018 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:15.019 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.020 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.021 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.028 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.029 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.030 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.031 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.034 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.035 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.036 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.037 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.038 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.039 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.040 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.041 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.042 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.043 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.044 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.046 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.048 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.049 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.050 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.051 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.052 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.053 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.054 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.055 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.056 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.059 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.061 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.063 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.064 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.065 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.066 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.067 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.068 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.069 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.070 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.071 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.073 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.075 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.077 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.079 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.081 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.082 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.084 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.087 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.088 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.089 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.090 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.091 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.092 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.093 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.093 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.094 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.095 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.096 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.097 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.098 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.104 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:15.106 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.107 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:15.112 INFO Executor - Finished task 0.0 in stage 41.0 (TID 79). 651526 bytes result sent to driver
14:51:15.115 INFO TaskSetManager - Finished task 0.0 in stage 41.0 (TID 79) in 158 ms on localhost (executor driver) (1/1)
14:51:15.115 INFO TaskSchedulerImpl - Removed TaskSet 41.0, whose tasks have all completed, from pool
14:51:15.115 INFO DAGScheduler - ResultStage 41 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.180 s
14:51:15.116 INFO DAGScheduler - Job 29 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:15.116 INFO TaskSchedulerImpl - Killing all running tasks in stage 41: Stage finished
14:51:15.116 INFO DAGScheduler - Job 29 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.182158 s
14:51:15.128 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:15.128 INFO DAGScheduler - Got job 30 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:15.128 INFO DAGScheduler - Final stage: ResultStage 42 (count at ReadsSparkSinkUnitTest.java:185)
14:51:15.128 INFO DAGScheduler - Parents of final stage: List()
14:51:15.129 INFO DAGScheduler - Missing parents: List()
14:51:15.129 INFO DAGScheduler - Submitting ResultStage 42 (MapPartitionsRDD[120] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:15.147 INFO MemoryStore - Block broadcast_64 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
14:51:15.149 INFO MemoryStore - Block broadcast_64_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
14:51:15.149 INFO BlockManagerInfo - Added broadcast_64_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.4 MiB)
14:51:15.150 INFO SparkContext - Created broadcast 64 from broadcast at DAGScheduler.scala:1580
14:51:15.150 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 42 (MapPartitionsRDD[120] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:15.150 INFO TaskSchedulerImpl - Adding task set 42.0 with 1 tasks resource profile 0
14:51:15.151 INFO TaskSetManager - Starting task 0.0 in stage 42.0 (TID 80) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:15.152 INFO Executor - Running task 0.0 in stage 42.0 (TID 80)
14:51:15.193 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:15.206 INFO Executor - Finished task 0.0 in stage 42.0 (TID 80). 989 bytes result sent to driver
14:51:15.206 INFO TaskSetManager - Finished task 0.0 in stage 42.0 (TID 80) in 55 ms on localhost (executor driver) (1/1)
14:51:15.206 INFO TaskSchedulerImpl - Removed TaskSet 42.0, whose tasks have all completed, from pool
14:51:15.207 INFO DAGScheduler - ResultStage 42 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.078 s
14:51:15.207 INFO DAGScheduler - Job 30 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:15.207 INFO TaskSchedulerImpl - Killing all running tasks in stage 42: Stage finished
14:51:15.207 INFO DAGScheduler - Job 30 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.079104 s
14:51:15.212 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:15.212 INFO DAGScheduler - Got job 31 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:15.213 INFO DAGScheduler - Final stage: ResultStage 43 (count at ReadsSparkSinkUnitTest.java:185)
14:51:15.213 INFO DAGScheduler - Parents of final stage: List()
14:51:15.213 INFO DAGScheduler - Missing parents: List()
14:51:15.213 INFO DAGScheduler - Submitting ResultStage 43 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:15.243 INFO MemoryStore - Block broadcast_65 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
14:51:15.245 INFO MemoryStore - Block broadcast_65_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
14:51:15.245 INFO BlockManagerInfo - Added broadcast_65_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.2 MiB)
14:51:15.245 INFO SparkContext - Created broadcast 65 from broadcast at DAGScheduler.scala:1580
14:51:15.246 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 43 (MapPartitionsRDD[139] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:15.246 INFO TaskSchedulerImpl - Adding task set 43.0 with 1 tasks resource profile 0
14:51:15.247 INFO TaskSetManager - Starting task 0.0 in stage 43.0 (TID 81) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:15.247 INFO Executor - Running task 0.0 in stage 43.0 (TID 81)
14:51:15.285 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam:0+237038
14:51:15.286 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.287 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.289 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:15.290 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.291 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.292 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.293 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.293 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.295 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:15.298 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:15.299 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:15.300 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.301 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.302 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.307 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.308 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.309 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.310 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.311 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.312 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.314 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.315 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.316 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.317 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.318 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.319 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.320 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.321 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.325 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.327 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.328 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.329 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.330 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.333 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.334 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.335 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.336 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.337 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.338 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.339 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.340 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.341 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.343 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.344 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.345 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.346 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.348 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.350 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.351 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.365 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.365 INFO BlockManagerInfo - Removed broadcast_58_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:15.367 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.367 INFO BlockManagerInfo - Removed broadcast_63_piece0 on localhost:44923 in memory (size: 153.7 KiB, free: 1919.4 MiB)
14:51:15.368 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.369 INFO BlockManagerInfo - Removed broadcast_64_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.5 MiB)
14:51:15.370 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.370 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.371 INFO BlockManagerInfo - Removed broadcast_59_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:15.372 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.373 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.373 INFO BlockManagerInfo - Removed broadcast_61_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.6 MiB)
14:51:15.373 INFO BlockManagerInfo - Removed broadcast_60_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.8 MiB)
14:51:15.374 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.375 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.377 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.378 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.379 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.380 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.382 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.384 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.385 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.386 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.387 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.387 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:15.388 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.388 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.389 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.390 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam dst=null perm=null proto=rpc
14:51:15.392 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.393 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.393 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_4d3233f3-5a23-42fb-9416-62f0828858e3.bam.bai dst=null perm=null proto=rpc
14:51:15.396 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:15.398 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:15.399 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:15.401 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:15.403 INFO Executor - Finished task 0.0 in stage 43.0 (TID 81). 1032 bytes result sent to driver
14:51:15.403 INFO TaskSetManager - Finished task 0.0 in stage 43.0 (TID 81) in 157 ms on localhost (executor driver) (1/1)
14:51:15.403 INFO TaskSchedulerImpl - Removed TaskSet 43.0, whose tasks have all completed, from pool
14:51:15.404 INFO DAGScheduler - ResultStage 43 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.191 s
14:51:15.404 INFO DAGScheduler - Job 31 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:15.404 INFO TaskSchedulerImpl - Killing all running tasks in stage 43: Stage finished
14:51:15.405 INFO DAGScheduler - Job 31 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.192532 s
14:51:15.409 INFO MemoryStore - Block broadcast_66 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
14:51:15.416 INFO MemoryStore - Block broadcast_66_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
14:51:15.416 INFO BlockManagerInfo - Added broadcast_66_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:15.417 INFO SparkContext - Created broadcast 66 from newAPIHadoopFile at PathSplitSource.java:96
14:51:15.443 INFO MemoryStore - Block broadcast_67 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:15.450 INFO MemoryStore - Block broadcast_67_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
14:51:15.450 INFO BlockManagerInfo - Added broadcast_67_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:15.452 INFO SparkContext - Created broadcast 67 from newAPIHadoopFile at PathSplitSource.java:96
14:51:15.475 INFO FileInputFormat - Total input files to process : 1
14:51:15.477 INFO MemoryStore - Block broadcast_68 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
14:51:15.479 INFO MemoryStore - Block broadcast_68_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
14:51:15.479 INFO BlockManagerInfo - Added broadcast_68_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:15.479 INFO SparkContext - Created broadcast 68 from broadcast at ReadsSparkSink.java:133
14:51:15.480 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:15.481 INFO MemoryStore - Block broadcast_69 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
14:51:15.482 INFO MemoryStore - Block broadcast_69_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
14:51:15.482 INFO BlockManagerInfo - Added broadcast_69_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:15.483 INFO SparkContext - Created broadcast 69 from broadcast at BamSink.java:76
14:51:15.485 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts dst=null perm=null proto=rpc
14:51:15.486 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:15.486 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:15.486 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:15.487 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:15.494 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:15.494 INFO DAGScheduler - Registering RDD 153 (mapToPair at SparkUtils.java:161) as input to shuffle 10
14:51:15.495 INFO DAGScheduler - Got job 32 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:15.495 INFO DAGScheduler - Final stage: ResultStage 45 (runJob at SparkHadoopWriter.scala:83)
14:51:15.495 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 44)
14:51:15.495 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 44)
14:51:15.495 INFO DAGScheduler - Submitting ShuffleMapStage 44 (MapPartitionsRDD[153] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:15.520 INFO MemoryStore - Block broadcast_70 stored as values in memory (estimated size 520.4 KiB, free 1917.2 MiB)
14:51:15.522 INFO MemoryStore - Block broadcast_70_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.1 MiB)
14:51:15.522 INFO BlockManagerInfo - Added broadcast_70_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.5 MiB)
14:51:15.523 INFO SparkContext - Created broadcast 70 from broadcast at DAGScheduler.scala:1580
14:51:15.523 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 44 (MapPartitionsRDD[153] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:15.523 INFO TaskSchedulerImpl - Adding task set 44.0 with 1 tasks resource profile 0
14:51:15.524 INFO TaskSetManager - Starting task 0.0 in stage 44.0 (TID 82) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:15.524 INFO Executor - Running task 0.0 in stage 44.0 (TID 82)
14:51:15.567 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:15.591 INFO Executor - Finished task 0.0 in stage 44.0 (TID 82). 1148 bytes result sent to driver
14:51:15.591 INFO TaskSetManager - Finished task 0.0 in stage 44.0 (TID 82) in 67 ms on localhost (executor driver) (1/1)
14:51:15.591 INFO TaskSchedulerImpl - Removed TaskSet 44.0, whose tasks have all completed, from pool
14:51:15.592 INFO DAGScheduler - ShuffleMapStage 44 (mapToPair at SparkUtils.java:161) finished in 0.097 s
14:51:15.592 INFO DAGScheduler - looking for newly runnable stages
14:51:15.592 INFO DAGScheduler - running: HashSet()
14:51:15.592 INFO DAGScheduler - waiting: HashSet(ResultStage 45)
14:51:15.592 INFO DAGScheduler - failed: HashSet()
14:51:15.592 INFO DAGScheduler - Submitting ResultStage 45 (MapPartitionsRDD[158] at mapToPair at BamSink.java:91), which has no missing parents
14:51:15.603 INFO MemoryStore - Block broadcast_71 stored as values in memory (estimated size 241.5 KiB, free 1916.8 MiB)
14:51:15.604 INFO MemoryStore - Block broadcast_71_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1916.8 MiB)
14:51:15.605 INFO BlockManagerInfo - Added broadcast_71_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.4 MiB)
14:51:15.605 INFO SparkContext - Created broadcast 71 from broadcast at DAGScheduler.scala:1580
14:51:15.605 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 45 (MapPartitionsRDD[158] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:15.605 INFO TaskSchedulerImpl - Adding task set 45.0 with 1 tasks resource profile 0
14:51:15.606 INFO TaskSetManager - Starting task 0.0 in stage 45.0 (TID 83) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:15.607 INFO Executor - Running task 0.0 in stage 45.0 (TID 83)
14:51:15.615 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:15.615 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:15.643 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:15.643 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:15.643 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:15.643 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:15.643 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:15.643 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:15.645 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/_temporary/attempt_202603041451159200622510376656017_0158_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.646 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/_temporary/attempt_202603041451159200622510376656017_0158_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.651 INFO StateChange - BLOCK* allocate blk_1073741844_1020, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/_temporary/attempt_202603041451159200622510376656017_0158_r_000000_0/part-r-00000
14:51:15.653 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741844_1020 src: /127.0.0.1:36318 dest: /127.0.0.1:34059
14:51:15.657 INFO clienttrace - src: /127.0.0.1:36318, dest: /127.0.0.1:34059, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741844_1020, duration(ns): 3555333
14:51:15.657 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741844_1020, type=LAST_IN_PIPELINE terminating
14:51:15.659 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/_temporary/attempt_202603041451159200622510376656017_0158_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:15.660 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/_temporary/attempt_202603041451159200622510376656017_0158_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:15.661 INFO StateChange - BLOCK* allocate blk_1073741845_1021, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/_temporary/attempt_202603041451159200622510376656017_0158_r_000000_0/.part-r-00000.sbi
14:51:15.662 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741845_1021 src: /127.0.0.1:36320 dest: /127.0.0.1:34059
14:51:15.664 INFO clienttrace - src: /127.0.0.1:36320, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741845_1021, duration(ns): 523212
14:51:15.664 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741845_1021, type=LAST_IN_PIPELINE terminating
14:51:15.666 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/_temporary/attempt_202603041451159200622510376656017_0158_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:15.667 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/_temporary/attempt_202603041451159200622510376656017_0158_r_000000_0 dst=null perm=null proto=rpc
14:51:15.668 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/_temporary/attempt_202603041451159200622510376656017_0158_r_000000_0 dst=null perm=null proto=rpc
14:51:15.669 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/task_202603041451159200622510376656017_0158_r_000000 dst=null perm=null proto=rpc
14:51:15.670 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/_temporary/attempt_202603041451159200622510376656017_0158_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/task_202603041451159200622510376656017_0158_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:15.671 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451159200622510376656017_0158_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/task_202603041451159200622510376656017_0158_r_000000
14:51:15.671 INFO SparkHadoopMapRedUtil - attempt_202603041451159200622510376656017_0158_r_000000_0: Committed. Elapsed time: 2 ms.
14:51:15.672 INFO Executor - Finished task 0.0 in stage 45.0 (TID 83). 1858 bytes result sent to driver
14:51:15.673 INFO TaskSetManager - Finished task 0.0 in stage 45.0 (TID 83) in 67 ms on localhost (executor driver) (1/1)
14:51:15.673 INFO TaskSchedulerImpl - Removed TaskSet 45.0, whose tasks have all completed, from pool
14:51:15.673 INFO DAGScheduler - ResultStage 45 (runJob at SparkHadoopWriter.scala:83) finished in 0.080 s
14:51:15.673 INFO DAGScheduler - Job 32 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:15.673 INFO TaskSchedulerImpl - Killing all running tasks in stage 45: Stage finished
14:51:15.674 INFO DAGScheduler - Job 32 finished: runJob at SparkHadoopWriter.scala:83, took 0.180215 s
14:51:15.675 INFO SparkHadoopWriter - Start to commit write Job job_202603041451159200622510376656017_0158.
14:51:15.676 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:15.677 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts dst=null perm=null proto=rpc
14:51:15.678 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/task_202603041451159200622510376656017_0158_r_000000 dst=null perm=null proto=rpc
14:51:15.678 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:15.680 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/task_202603041451159200622510376656017_0158_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.680 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:15.681 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary/0/task_202603041451159200622510376656017_0158_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.682 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:15.683 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.684 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:15.685 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/.spark-staging-158 dst=null perm=null proto=rpc
14:51:15.685 INFO SparkHadoopWriter - Write Job job_202603041451159200622510376656017_0158 committed. Elapsed time: 9 ms.
14:51:15.686 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.688 INFO StateChange - BLOCK* allocate blk_1073741846_1022, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/header
14:51:15.689 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741846_1022 src: /127.0.0.1:36336 dest: /127.0.0.1:34059
14:51:15.691 INFO clienttrace - src: /127.0.0.1:36336, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741846_1022, duration(ns): 562406
14:51:15.691 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741846_1022, type=LAST_IN_PIPELINE terminating
14:51:15.692 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:15.693 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.694 INFO StateChange - BLOCK* allocate blk_1073741847_1023, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/terminator
14:51:15.695 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741847_1023 src: /127.0.0.1:36352 dest: /127.0.0.1:34059
14:51:15.696 INFO clienttrace - src: /127.0.0.1:36352, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741847_1023, duration(ns): 485201
14:51:15.697 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741847_1023, type=LAST_IN_PIPELINE terminating
14:51:15.697 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:15.698 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts dst=null perm=null proto=rpc
14:51:15.699 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.700 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:15.700 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam
14:51:15.701 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.702 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.702 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.703 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam done
14:51:15.703 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.703 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.sbi
14:51:15.704 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts dst=null perm=null proto=rpc
14:51:15.705 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:15.706 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:15.706 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:15.708 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:15.708 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:15.709 INFO StateChange - BLOCK* allocate blk_1073741848_1024, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.sbi
14:51:15.710 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741848_1024 src: /127.0.0.1:36354 dest: /127.0.0.1:34059
14:51:15.712 INFO clienttrace - src: /127.0.0.1:36354, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741848_1024, duration(ns): 555385
14:51:15.712 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741848_1024, type=LAST_IN_PIPELINE terminating
14:51:15.713 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:15.713 INFO IndexFileMerger - Done merging .sbi files
14:51:15.714 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.parts dst=null perm=null proto=rpc
14:51:15.724 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.sbi dst=null perm=null proto=rpc
14:51:15.725 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.sbi dst=null perm=null proto=rpc
14:51:15.726 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.sbi dst=null perm=null proto=rpc
14:51:15.727 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:15.727 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.728 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.729 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.729 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.731 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.bai dst=null perm=null proto=rpc
14:51:15.731 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bai dst=null perm=null proto=rpc
14:51:15.733 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:15.734 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.734 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.sbi dst=null perm=null proto=rpc
14:51:15.735 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.sbi dst=null perm=null proto=rpc
14:51:15.736 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.sbi dst=null perm=null proto=rpc
14:51:15.737 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:15.737 INFO MemoryStore - Block broadcast_72 stored as values in memory (estimated size 320.0 B, free 1916.8 MiB)
14:51:15.738 INFO MemoryStore - Block broadcast_72_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.8 MiB)
14:51:15.738 INFO BlockManagerInfo - Added broadcast_72_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.4 MiB)
14:51:15.739 INFO SparkContext - Created broadcast 72 from broadcast at BamSource.java:104
14:51:15.741 INFO MemoryStore - Block broadcast_73 stored as values in memory (estimated size 297.9 KiB, free 1916.5 MiB)
14:51:15.752 INFO MemoryStore - Block broadcast_73_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:15.753 INFO BlockManagerInfo - Added broadcast_73_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:15.753 INFO SparkContext - Created broadcast 73 from newAPIHadoopFile at PathSplitSource.java:96
14:51:15.764 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.765 INFO FileInputFormat - Total input files to process : 1
14:51:15.765 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.781 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:15.782 INFO DAGScheduler - Got job 33 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:15.782 INFO DAGScheduler - Final stage: ResultStage 46 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:15.782 INFO DAGScheduler - Parents of final stage: List()
14:51:15.782 INFO DAGScheduler - Missing parents: List()
14:51:15.782 INFO DAGScheduler - Submitting ResultStage 46 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:15.789 INFO MemoryStore - Block broadcast_74 stored as values in memory (estimated size 148.2 KiB, free 1916.3 MiB)
14:51:15.790 INFO MemoryStore - Block broadcast_74_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.2 MiB)
14:51:15.790 INFO BlockManagerInfo - Added broadcast_74_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.3 MiB)
14:51:15.790 INFO SparkContext - Created broadcast 74 from broadcast at DAGScheduler.scala:1580
14:51:15.790 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 46 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:15.791 INFO TaskSchedulerImpl - Adding task set 46.0 with 1 tasks resource profile 0
14:51:15.791 INFO TaskSetManager - Starting task 0.0 in stage 46.0 (TID 84) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:15.792 INFO Executor - Running task 0.0 in stage 46.0 (TID 84)
14:51:15.806 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam:0+237038
14:51:15.807 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.808 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.809 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.bai dst=null perm=null proto=rpc
14:51:15.810 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bai dst=null perm=null proto=rpc
14:51:15.815 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.821 INFO Executor - Finished task 0.0 in stage 46.0 (TID 84). 651526 bytes result sent to driver
14:51:15.824 INFO TaskSetManager - Finished task 0.0 in stage 46.0 (TID 84) in 33 ms on localhost (executor driver) (1/1)
14:51:15.824 INFO TaskSchedulerImpl - Removed TaskSet 46.0, whose tasks have all completed, from pool
14:51:15.825 INFO DAGScheduler - ResultStage 46 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.042 s
14:51:15.825 INFO DAGScheduler - Job 33 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:15.825 INFO TaskSchedulerImpl - Killing all running tasks in stage 46: Stage finished
14:51:15.825 INFO DAGScheduler - Job 33 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.043764 s
14:51:15.842 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:15.843 INFO DAGScheduler - Got job 34 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:15.843 INFO DAGScheduler - Final stage: ResultStage 47 (count at ReadsSparkSinkUnitTest.java:185)
14:51:15.843 INFO DAGScheduler - Parents of final stage: List()
14:51:15.843 INFO DAGScheduler - Missing parents: List()
14:51:15.844 INFO DAGScheduler - Submitting ResultStage 47 (MapPartitionsRDD[146] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:15.868 INFO MemoryStore - Block broadcast_75 stored as values in memory (estimated size 426.1 KiB, free 1915.8 MiB)
14:51:15.870 INFO MemoryStore - Block broadcast_75_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
14:51:15.870 INFO BlockManagerInfo - Added broadcast_75_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.2 MiB)
14:51:15.870 INFO SparkContext - Created broadcast 75 from broadcast at DAGScheduler.scala:1580
14:51:15.870 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 47 (MapPartitionsRDD[146] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:15.871 INFO TaskSchedulerImpl - Adding task set 47.0 with 1 tasks resource profile 0
14:51:15.871 INFO TaskSetManager - Starting task 0.0 in stage 47.0 (TID 85) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:15.872 INFO Executor - Running task 0.0 in stage 47.0 (TID 85)
14:51:15.914 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:15.926 INFO Executor - Finished task 0.0 in stage 47.0 (TID 85). 989 bytes result sent to driver
14:51:15.927 INFO TaskSetManager - Finished task 0.0 in stage 47.0 (TID 85) in 56 ms on localhost (executor driver) (1/1)
14:51:15.927 INFO TaskSchedulerImpl - Removed TaskSet 47.0, whose tasks have all completed, from pool
14:51:15.927 INFO DAGScheduler - ResultStage 47 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.083 s
14:51:15.927 INFO DAGScheduler - Job 34 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:15.927 INFO TaskSchedulerImpl - Killing all running tasks in stage 47: Stage finished
14:51:15.927 INFO DAGScheduler - Job 34 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.085006 s
14:51:15.932 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:15.933 INFO DAGScheduler - Got job 35 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:15.933 INFO DAGScheduler - Final stage: ResultStage 48 (count at ReadsSparkSinkUnitTest.java:185)
14:51:15.933 INFO DAGScheduler - Parents of final stage: List()
14:51:15.933 INFO DAGScheduler - Missing parents: List()
14:51:15.933 INFO DAGScheduler - Submitting ResultStage 48 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:15.940 INFO MemoryStore - Block broadcast_76 stored as values in memory (estimated size 148.1 KiB, free 1915.5 MiB)
14:51:15.941 INFO MemoryStore - Block broadcast_76_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.5 MiB)
14:51:15.941 INFO BlockManagerInfo - Added broadcast_76_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.1 MiB)
14:51:15.942 INFO SparkContext - Created broadcast 76 from broadcast at DAGScheduler.scala:1580
14:51:15.942 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 48 (MapPartitionsRDD[164] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:15.942 INFO TaskSchedulerImpl - Adding task set 48.0 with 1 tasks resource profile 0
14:51:15.943 INFO TaskSetManager - Starting task 0.0 in stage 48.0 (TID 86) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:15.943 INFO Executor - Running task 0.0 in stage 48.0 (TID 86)
14:51:15.958 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam:0+237038
14:51:15.959 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.960 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam dst=null perm=null proto=rpc
14:51:15.961 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bam.bai dst=null perm=null proto=rpc
14:51:15.962 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_b452aa5e-9d11-482c-8b9e-a53a9d62ebd6.bai dst=null perm=null proto=rpc
14:51:15.963 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:15.966 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:15.967 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:15.969 INFO Executor - Finished task 0.0 in stage 48.0 (TID 86). 989 bytes result sent to driver
14:51:15.969 INFO TaskSetManager - Finished task 0.0 in stage 48.0 (TID 86) in 26 ms on localhost (executor driver) (1/1)
14:51:15.969 INFO TaskSchedulerImpl - Removed TaskSet 48.0, whose tasks have all completed, from pool
14:51:15.970 INFO DAGScheduler - ResultStage 48 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.037 s
14:51:15.970 INFO DAGScheduler - Job 35 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:15.970 INFO TaskSchedulerImpl - Killing all running tasks in stage 48: Stage finished
14:51:15.970 INFO DAGScheduler - Job 35 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.037808 s
14:51:15.974 INFO MemoryStore - Block broadcast_77 stored as values in memory (estimated size 297.9 KiB, free 1915.2 MiB)
14:51:15.980 INFO MemoryStore - Block broadcast_77_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.1 MiB)
14:51:15.980 INFO BlockManagerInfo - Added broadcast_77_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.1 MiB)
14:51:15.981 INFO SparkContext - Created broadcast 77 from newAPIHadoopFile at PathSplitSource.java:96
14:51:16.005 INFO MemoryStore - Block broadcast_78 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
14:51:16.012 INFO MemoryStore - Block broadcast_78_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1914.8 MiB)
14:51:16.012 INFO BlockManagerInfo - Added broadcast_78_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.0 MiB)
14:51:16.012 INFO SparkContext - Created broadcast 78 from newAPIHadoopFile at PathSplitSource.java:96
14:51:16.028 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741840_1016 replica FinalizedReplica, blk_1073741840_1016, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741840 for deletion
14:51:16.028 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741833_1009 replica FinalizedReplica, blk_1073741833_1009, FINALIZED
  getNumBytes()     = 13492
  getBytesOnDisk()  = 13492
  getVisibleLength()= 13492
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741833 for deletion
14:51:16.028 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741834_1010 replica FinalizedReplica, blk_1073741834_1010, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741834 for deletion
14:51:16.028 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741840_1016 URI file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741840
14:51:16.028 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741834_1010 URI file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741834
14:51:16.029 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741833_1009 URI file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741833
14:51:16.042 INFO FileInputFormat - Total input files to process : 1
14:51:16.044 INFO MemoryStore - Block broadcast_79 stored as values in memory (estimated size 160.7 KiB, free 1914.6 MiB)
14:51:16.045 INFO MemoryStore - Block broadcast_79_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1914.6 MiB)
14:51:16.045 INFO BlockManagerInfo - Added broadcast_79_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.0 MiB)
14:51:16.045 INFO SparkContext - Created broadcast 79 from broadcast at ReadsSparkSink.java:133
14:51:16.046 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:16.046 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:16.047 INFO MemoryStore - Block broadcast_80 stored as values in memory (estimated size 163.2 KiB, free 1914.5 MiB)
14:51:16.048 INFO MemoryStore - Block broadcast_80_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1914.4 MiB)
14:51:16.048 INFO BlockManagerInfo - Added broadcast_80_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.0 MiB)
14:51:16.049 INFO SparkContext - Created broadcast 80 from broadcast at BamSink.java:76
14:51:16.051 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts dst=null perm=null proto=rpc
14:51:16.052 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:16.052 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:16.052 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:16.053 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:16.059 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:16.060 INFO DAGScheduler - Registering RDD 178 (mapToPair at SparkUtils.java:161) as input to shuffle 11
14:51:16.060 INFO DAGScheduler - Got job 36 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:16.060 INFO DAGScheduler - Final stage: ResultStage 50 (runJob at SparkHadoopWriter.scala:83)
14:51:16.060 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 49)
14:51:16.060 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 49)
14:51:16.061 INFO DAGScheduler - Submitting ShuffleMapStage 49 (MapPartitionsRDD[178] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:16.073 INFO BlockManagerInfo - Removed broadcast_71_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.1 MiB)
14:51:16.074 INFO BlockManagerInfo - Removed broadcast_65_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.2 MiB)
14:51:16.075 INFO BlockManagerInfo - Removed broadcast_69_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:16.075 INFO BlockManagerInfo - Removed broadcast_75_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.4 MiB)
14:51:16.076 INFO BlockManagerInfo - Removed broadcast_67_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:16.077 INFO BlockManagerInfo - Removed broadcast_76_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.5 MiB)
14:51:16.077 INFO BlockManagerInfo - Removed broadcast_74_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.5 MiB)
14:51:16.078 INFO BlockManagerInfo - Removed broadcast_72_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1919.5 MiB)
14:51:16.079 INFO BlockManagerInfo - Removed broadcast_62_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:16.080 INFO BlockManagerInfo - Removed broadcast_66_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:16.081 INFO BlockManagerInfo - Removed broadcast_70_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.8 MiB)
14:51:16.082 INFO BlockManagerInfo - Removed broadcast_73_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:16.082 INFO BlockManagerInfo - Removed broadcast_68_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:16.083 INFO BlockManagerInfo - Removed broadcast_56_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.9 MiB)
14:51:16.084 INFO BlockManagerInfo - Removed broadcast_78_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.9 MiB)
14:51:16.095 INFO MemoryStore - Block broadcast_81 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
14:51:16.097 INFO MemoryStore - Block broadcast_81_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
14:51:16.097 INFO BlockManagerInfo - Added broadcast_81_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.8 MiB)
14:51:16.098 INFO SparkContext - Created broadcast 81 from broadcast at DAGScheduler.scala:1580
14:51:16.098 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 49 (MapPartitionsRDD[178] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:16.098 INFO TaskSchedulerImpl - Adding task set 49.0 with 1 tasks resource profile 0
14:51:16.099 INFO TaskSetManager - Starting task 0.0 in stage 49.0 (TID 87) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:16.100 INFO Executor - Running task 0.0 in stage 49.0 (TID 87)
14:51:16.147 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:16.169 INFO Executor - Finished task 0.0 in stage 49.0 (TID 87). 1148 bytes result sent to driver
14:51:16.169 INFO TaskSetManager - Finished task 0.0 in stage 49.0 (TID 87) in 70 ms on localhost (executor driver) (1/1)
14:51:16.170 INFO TaskSchedulerImpl - Removed TaskSet 49.0, whose tasks have all completed, from pool
14:51:16.170 INFO DAGScheduler - ShuffleMapStage 49 (mapToPair at SparkUtils.java:161) finished in 0.109 s
14:51:16.170 INFO DAGScheduler - looking for newly runnable stages
14:51:16.170 INFO DAGScheduler - running: HashSet()
14:51:16.170 INFO DAGScheduler - waiting: HashSet(ResultStage 50)
14:51:16.170 INFO DAGScheduler - failed: HashSet()
14:51:16.170 INFO DAGScheduler - Submitting ResultStage 50 (MapPartitionsRDD[183] at mapToPair at BamSink.java:91), which has no missing parents
14:51:16.182 INFO MemoryStore - Block broadcast_82 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
14:51:16.183 INFO MemoryStore - Block broadcast_82_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
14:51:16.183 INFO BlockManagerInfo - Added broadcast_82_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.7 MiB)
14:51:16.183 INFO SparkContext - Created broadcast 82 from broadcast at DAGScheduler.scala:1580
14:51:16.184 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 50 (MapPartitionsRDD[183] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:16.184 INFO TaskSchedulerImpl - Adding task set 50.0 with 1 tasks resource profile 0
14:51:16.185 INFO TaskSetManager - Starting task 0.0 in stage 50.0 (TID 88) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:16.185 INFO Executor - Running task 0.0 in stage 50.0 (TID 88)
14:51:16.191 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:16.191 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:16.207 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:16.207 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:16.207 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:16.207 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:16.208 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:16.208 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:16.209 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/_temporary/attempt_202603041451164975581314761497995_0183_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:16.213 INFO StateChange - BLOCK* allocate blk_1073741849_1025, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/_temporary/attempt_202603041451164975581314761497995_0183_r_000000_0/part-r-00000
14:51:16.215 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741849_1025 src: /127.0.0.1:36362 dest: /127.0.0.1:34059
14:51:16.218 INFO clienttrace - src: /127.0.0.1:36362, dest: /127.0.0.1:34059, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741849_1025, duration(ns): 1724191
14:51:16.218 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741849_1025, type=LAST_IN_PIPELINE terminating
14:51:16.219 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/_temporary/attempt_202603041451164975581314761497995_0183_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:16.220 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/_temporary/attempt_202603041451164975581314761497995_0183_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:16.221 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/_temporary/attempt_202603041451164975581314761497995_0183_r_000000_0 dst=null perm=null proto=rpc
14:51:16.222 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/_temporary/attempt_202603041451164975581314761497995_0183_r_000000_0 dst=null perm=null proto=rpc
14:51:16.223 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/task_202603041451164975581314761497995_0183_r_000000 dst=null perm=null proto=rpc
14:51:16.223 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/_temporary/attempt_202603041451164975581314761497995_0183_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/task_202603041451164975581314761497995_0183_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:16.224 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451164975581314761497995_0183_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/task_202603041451164975581314761497995_0183_r_000000
14:51:16.224 INFO SparkHadoopMapRedUtil - attempt_202603041451164975581314761497995_0183_r_000000_0: Committed. Elapsed time: 2 ms.
14:51:16.225 INFO Executor - Finished task 0.0 in stage 50.0 (TID 88). 1858 bytes result sent to driver
14:51:16.225 INFO TaskSetManager - Finished task 0.0 in stage 50.0 (TID 88) in 40 ms on localhost (executor driver) (1/1)
14:51:16.225 INFO TaskSchedulerImpl - Removed TaskSet 50.0, whose tasks have all completed, from pool
14:51:16.226 INFO DAGScheduler - ResultStage 50 (runJob at SparkHadoopWriter.scala:83) finished in 0.055 s
14:51:16.226 INFO DAGScheduler - Job 36 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:16.226 INFO TaskSchedulerImpl - Killing all running tasks in stage 50: Stage finished
14:51:16.226 INFO DAGScheduler - Job 36 finished: runJob at SparkHadoopWriter.scala:83, took 0.166671 s
14:51:16.227 INFO SparkHadoopWriter - Start to commit write Job job_202603041451164975581314761497995_0183.
14:51:16.228 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:16.228 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts dst=null perm=null proto=rpc
14:51:16.229 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/task_202603041451164975581314761497995_0183_r_000000 dst=null perm=null proto=rpc
14:51:16.230 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:16.230 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary/0/task_202603041451164975581314761497995_0183_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:16.231 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:16.232 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:16.233 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:16.234 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/.spark-staging-183 dst=null perm=null proto=rpc
14:51:16.234 INFO SparkHadoopWriter - Write Job job_202603041451164975581314761497995_0183 committed. Elapsed time: 7 ms.
14:51:16.235 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:16.237 INFO StateChange - BLOCK* allocate blk_1073741850_1026, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/header
14:51:16.238 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741850_1026 src: /127.0.0.1:36368 dest: /127.0.0.1:34059
14:51:16.240 INFO clienttrace - src: /127.0.0.1:36368, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741850_1026, duration(ns): 629633
14:51:16.240 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741850_1026, type=LAST_IN_PIPELINE terminating
14:51:16.241 INFO FSNamesystem - BLOCK* blk_1073741850_1026 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/header
14:51:16.642 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:16.643 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:16.645 INFO StateChange - BLOCK* allocate blk_1073741851_1027, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/terminator
14:51:16.646 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741851_1027 src: /127.0.0.1:36370 dest: /127.0.0.1:34059
14:51:16.648 INFO clienttrace - src: /127.0.0.1:36370, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741851_1027, duration(ns): 660321
14:51:16.648 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741851_1027, type=LAST_IN_PIPELINE terminating
14:51:16.649 INFO FSNamesystem - BLOCK* blk_1073741851_1027 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/terminator
14:51:17.050 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:17.051 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts dst=null perm=null proto=rpc
14:51:17.053 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:17.054 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:17.054 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam
14:51:17.055 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:17.055 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.056 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:17.057 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam done
14:51:17.057 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.058 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.parts dst=null perm=null proto=rpc
14:51:17.059 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.059 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.060 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.060 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.061 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.bai dst=null perm=null proto=rpc
14:51:17.062 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bai dst=null perm=null proto=rpc
14:51:17.064 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:17.066 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.066 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.sbi dst=null perm=null proto=rpc
14:51:17.068 INFO MemoryStore - Block broadcast_83 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:17.080 INFO MemoryStore - Block broadcast_83_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:17.080 INFO BlockManagerInfo - Added broadcast_83_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:17.080 INFO SparkContext - Created broadcast 83 from newAPIHadoopFile at PathSplitSource.java:96
14:51:17.103 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.103 INFO FileInputFormat - Total input files to process : 1
14:51:17.104 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.142 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:17.143 INFO DAGScheduler - Got job 37 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:17.143 INFO DAGScheduler - Final stage: ResultStage 51 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:17.143 INFO DAGScheduler - Parents of final stage: List()
14:51:17.143 INFO DAGScheduler - Missing parents: List()
14:51:17.143 INFO DAGScheduler - Submitting ResultStage 51 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:17.161 INFO MemoryStore - Block broadcast_84 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
14:51:17.162 INFO MemoryStore - Block broadcast_84_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1917.4 MiB)
14:51:17.163 INFO BlockManagerInfo - Added broadcast_84_piece0 in memory on localhost:44923 (size: 153.7 KiB, free: 1919.5 MiB)
14:51:17.163 INFO SparkContext - Created broadcast 84 from broadcast at DAGScheduler.scala:1580
14:51:17.163 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 51 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:17.163 INFO TaskSchedulerImpl - Adding task set 51.0 with 1 tasks resource profile 0
14:51:17.164 INFO TaskSetManager - Starting task 0.0 in stage 51.0 (TID 89) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:17.165 INFO Executor - Running task 0.0 in stage 51.0 (TID 89)
14:51:17.203 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam:0+237038
14:51:17.204 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.205 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.207 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:17.208 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.208 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.209 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.bai dst=null perm=null proto=rpc
14:51:17.210 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bai dst=null perm=null proto=rpc
14:51:17.212 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:17.214 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.217 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:17.221 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.222 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.223 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.224 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.226 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.227 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.228 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.229 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.230 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.231 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.232 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.233 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.233 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.234 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.235 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.236 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.238 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.239 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.240 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.241 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.243 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.245 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.246 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.247 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.248 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.250 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.252 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.253 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.254 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.255 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.257 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.258 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.259 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.260 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.262 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.262 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.263 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.264 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.265 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.267 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.268 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.269 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.270 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.271 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.272 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.273 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.274 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.275 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.275 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.276 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.277 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.278 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:17.279 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.279 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.280 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.281 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.bai dst=null perm=null proto=rpc
14:51:17.282 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bai dst=null perm=null proto=rpc
14:51:17.284 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:17.287 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.287 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:17.290 INFO Executor - Finished task 0.0 in stage 51.0 (TID 89). 651526 bytes result sent to driver
14:51:17.294 INFO TaskSetManager - Finished task 0.0 in stage 51.0 (TID 89) in 130 ms on localhost (executor driver) (1/1)
14:51:17.294 INFO TaskSchedulerImpl - Removed TaskSet 51.0, whose tasks have all completed, from pool
14:51:17.294 INFO DAGScheduler - ResultStage 51 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.151 s
14:51:17.294 INFO DAGScheduler - Job 37 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:17.294 INFO TaskSchedulerImpl - Killing all running tasks in stage 51: Stage finished
14:51:17.295 INFO DAGScheduler - Job 37 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.152429 s
14:51:17.306 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:17.307 INFO DAGScheduler - Got job 38 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:17.307 INFO DAGScheduler - Final stage: ResultStage 52 (count at ReadsSparkSinkUnitTest.java:185)
14:51:17.307 INFO DAGScheduler - Parents of final stage: List()
14:51:17.307 INFO DAGScheduler - Missing parents: List()
14:51:17.307 INFO DAGScheduler - Submitting ResultStage 52 (MapPartitionsRDD[171] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:17.325 INFO MemoryStore - Block broadcast_85 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
14:51:17.326 INFO MemoryStore - Block broadcast_85_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
14:51:17.327 INFO BlockManagerInfo - Added broadcast_85_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.4 MiB)
14:51:17.327 INFO SparkContext - Created broadcast 85 from broadcast at DAGScheduler.scala:1580
14:51:17.327 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 52 (MapPartitionsRDD[171] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:17.327 INFO TaskSchedulerImpl - Adding task set 52.0 with 1 tasks resource profile 0
14:51:17.328 INFO TaskSetManager - Starting task 0.0 in stage 52.0 (TID 90) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:17.328 INFO Executor - Running task 0.0 in stage 52.0 (TID 90)
14:51:17.364 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:17.377 INFO Executor - Finished task 0.0 in stage 52.0 (TID 90). 989 bytes result sent to driver
14:51:17.378 INFO TaskSetManager - Finished task 0.0 in stage 52.0 (TID 90) in 50 ms on localhost (executor driver) (1/1)
14:51:17.378 INFO TaskSchedulerImpl - Removed TaskSet 52.0, whose tasks have all completed, from pool
14:51:17.378 INFO DAGScheduler - ResultStage 52 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.070 s
14:51:17.379 INFO DAGScheduler - Job 38 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:17.379 INFO TaskSchedulerImpl - Killing all running tasks in stage 52: Stage finished
14:51:17.379 INFO DAGScheduler - Job 38 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.072494 s
14:51:17.382 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:17.383 INFO DAGScheduler - Got job 39 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:17.383 INFO DAGScheduler - Final stage: ResultStage 53 (count at ReadsSparkSinkUnitTest.java:185)
14:51:17.383 INFO DAGScheduler - Parents of final stage: List()
14:51:17.383 INFO DAGScheduler - Missing parents: List()
14:51:17.383 INFO DAGScheduler - Submitting ResultStage 53 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:17.401 INFO MemoryStore - Block broadcast_86 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
14:51:17.402 INFO MemoryStore - Block broadcast_86_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
14:51:17.403 INFO BlockManagerInfo - Added broadcast_86_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.2 MiB)
14:51:17.403 INFO SparkContext - Created broadcast 86 from broadcast at DAGScheduler.scala:1580
14:51:17.403 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 53 (MapPartitionsRDD[190] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:17.403 INFO TaskSchedulerImpl - Adding task set 53.0 with 1 tasks resource profile 0
14:51:17.404 INFO TaskSetManager - Starting task 0.0 in stage 53.0 (TID 91) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:17.404 INFO Executor - Running task 0.0 in stage 53.0 (TID 91)
14:51:17.440 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam:0+237038
14:51:17.441 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.442 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.443 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:17.444 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.444 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.446 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.bai dst=null perm=null proto=rpc
14:51:17.446 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bai dst=null perm=null proto=rpc
14:51:17.448 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:17.450 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.451 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.451 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:17.452 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:17.457 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
[ ... WARN DFSUtil "Unexpected value for data transfer ... duration=0" repeated 54 more times between 14:51:17.458 and 14:51:17.512, alternating bytes=1632 and bytes=233106 ... ]
14:51:17.512 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.513 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam dst=null perm=null proto=rpc
14:51:17.514 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bam.bai dst=null perm=null proto=rpc
14:51:17.514 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_5b55bbc6-94da-4902-8f0b-9b58b8bcf005.bai dst=null perm=null proto=rpc
14:51:17.516 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:17.519 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:17.521 INFO Executor - Finished task 0.0 in stage 53.0 (TID 91). 989 bytes result sent to driver
14:51:17.521 INFO TaskSetManager - Finished task 0.0 in stage 53.0 (TID 91) in 117 ms on localhost (executor driver) (1/1)
14:51:17.522 INFO TaskSchedulerImpl - Removed TaskSet 53.0, whose tasks have all completed, from pool
14:51:17.522 INFO DAGScheduler - ResultStage 53 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.138 s
14:51:17.522 INFO DAGScheduler - Job 39 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:17.522 INFO TaskSchedulerImpl - Killing all running tasks in stage 53: Stage finished
14:51:17.522 INFO DAGScheduler - Job 39 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.139480 s
14:51:17.527 INFO MemoryStore - Block broadcast_87 stored as values in memory (estimated size 298.0 KiB, free 1916.0 MiB)
14:51:17.533 INFO MemoryStore - Block broadcast_87_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.0 MiB)
14:51:17.533 INFO BlockManagerInfo - Added broadcast_87_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.2 MiB)
14:51:17.534 INFO SparkContext - Created broadcast 87 from newAPIHadoopFile at PathSplitSource.java:96
14:51:17.559 INFO MemoryStore - Block broadcast_88 stored as values in memory (estimated size 298.0 KiB, free 1915.7 MiB)
14:51:17.565 INFO MemoryStore - Block broadcast_88_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1915.6 MiB)
14:51:17.565 INFO BlockManagerInfo - Added broadcast_88_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.1 MiB)
14:51:17.566 INFO SparkContext - Created broadcast 88 from newAPIHadoopFile at PathSplitSource.java:96
14:51:17.582 INFO BlockManagerInfo - Removed broadcast_83_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.2 MiB)
14:51:17.584 INFO BlockManagerInfo - Removed broadcast_86_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.3 MiB)
14:51:17.585 INFO BlockManagerInfo - Removed broadcast_84_piece0 on localhost:44923 in memory (size: 153.7 KiB, free: 1919.5 MiB)
14:51:17.586 INFO BlockManagerInfo - Removed broadcast_85_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.6 MiB)
14:51:17.587 INFO BlockManagerInfo - Removed broadcast_81_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.8 MiB)
14:51:17.588 INFO BlockManagerInfo - Removed broadcast_77_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:17.589 INFO BlockManagerInfo - Removed broadcast_80_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:17.589 INFO BlockManagerInfo - Removed broadcast_82_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.9 MiB)
14:51:17.590 INFO BlockManagerInfo - Removed broadcast_79_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.9 MiB)
14:51:17.605 INFO FileInputFormat - Total input files to process : 1
14:51:17.608 INFO MemoryStore - Block broadcast_89 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
14:51:17.609 INFO MemoryStore - Block broadcast_89_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
14:51:17.609 INFO BlockManagerInfo - Added broadcast_89_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.9 MiB)
14:51:17.610 INFO SparkContext - Created broadcast 89 from broadcast at ReadsSparkSink.java:133
14:51:17.611 INFO MemoryStore - Block broadcast_90 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
14:51:17.612 INFO MemoryStore - Block broadcast_90_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
14:51:17.613 INFO BlockManagerInfo - Added broadcast_90_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.9 MiB)
14:51:17.613 INFO SparkContext - Created broadcast 90 from broadcast at BamSink.java:76
14:51:17.616 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts dst=null perm=null proto=rpc
14:51:17.616 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:17.616 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:17.616 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:17.618 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:17.624 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:17.625 INFO DAGScheduler - Registering RDD 204 (mapToPair at SparkUtils.java:161) as input to shuffle 12
14:51:17.625 INFO DAGScheduler - Got job 40 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:17.626 INFO DAGScheduler - Final stage: ResultStage 55 (runJob at SparkHadoopWriter.scala:83)
14:51:17.626 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 54)
14:51:17.626 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 54)
14:51:17.626 INFO DAGScheduler - Submitting ShuffleMapStage 54 (MapPartitionsRDD[204] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:17.655 INFO MemoryStore - Block broadcast_91 stored as values in memory (estimated size 520.4 KiB, free 1918.5 MiB)
14:51:17.657 INFO MemoryStore - Block broadcast_91_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.3 MiB)
14:51:17.657 INFO BlockManagerInfo - Added broadcast_91_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.7 MiB)
14:51:17.657 INFO SparkContext - Created broadcast 91 from broadcast at DAGScheduler.scala:1580
14:51:17.658 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 54 (MapPartitionsRDD[204] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:17.658 INFO TaskSchedulerImpl - Adding task set 54.0 with 1 tasks resource profile 0
14:51:17.659 INFO TaskSetManager - Starting task 0.0 in stage 54.0 (TID 92) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
14:51:17.659 INFO Executor - Running task 0.0 in stage 54.0 (TID 92)
14:51:17.699 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
14:51:17.720 INFO Executor - Finished task 0.0 in stage 54.0 (TID 92). 1148 bytes result sent to driver
14:51:17.721 INFO TaskSetManager - Finished task 0.0 in stage 54.0 (TID 92) in 63 ms on localhost (executor driver) (1/1)
14:51:17.721 INFO TaskSchedulerImpl - Removed TaskSet 54.0, whose tasks have all completed, from pool
14:51:17.721 INFO DAGScheduler - ShuffleMapStage 54 (mapToPair at SparkUtils.java:161) finished in 0.095 s
14:51:17.721 INFO DAGScheduler - looking for newly runnable stages
14:51:17.721 INFO DAGScheduler - running: HashSet()
14:51:17.721 INFO DAGScheduler - waiting: HashSet(ResultStage 55)
14:51:17.721 INFO DAGScheduler - failed: HashSet()
14:51:17.722 INFO DAGScheduler - Submitting ResultStage 55 (MapPartitionsRDD[209] at mapToPair at BamSink.java:91), which has no missing parents
14:51:17.731 INFO MemoryStore - Block broadcast_92 stored as values in memory (estimated size 241.5 KiB, free 1918.1 MiB)
14:51:17.732 INFO MemoryStore - Block broadcast_92_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.0 MiB)
14:51:17.732 INFO BlockManagerInfo - Added broadcast_92_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.7 MiB)
14:51:17.732 INFO SparkContext - Created broadcast 92 from broadcast at DAGScheduler.scala:1580
14:51:17.733 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 55 (MapPartitionsRDD[209] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:17.733 INFO TaskSchedulerImpl - Adding task set 55.0 with 1 tasks resource profile 0
14:51:17.734 INFO TaskSetManager - Starting task 0.0 in stage 55.0 (TID 93) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:17.734 INFO Executor - Running task 0.0 in stage 55.0 (TID 93)
14:51:17.739 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:17.740 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:17.757 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:17.757 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:17.757 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:17.757 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:17.757 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:17.757 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:17.759 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:17.760 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:17.762 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:17.765 INFO StateChange - BLOCK* allocate blk_1073741852_1028, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/part-r-00000
14:51:17.766 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741852_1028 src: /127.0.0.1:37012 dest: /127.0.0.1:34059
14:51:17.770 INFO clienttrace - src: /127.0.0.1:37012, dest: /127.0.0.1:34059, bytes: 229774, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741852_1028, duration(ns): 2498807
14:51:17.770 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741852_1028, type=LAST_IN_PIPELINE terminating
14:51:17.771 INFO FSNamesystem - BLOCK* blk_1073741852_1028 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/part-r-00000
14:51:18.172 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:18.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:18.174 INFO StateChange - BLOCK* allocate blk_1073741853_1029, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/.part-r-00000.sbi
14:51:18.175 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741853_1029 src: /127.0.0.1:37026 dest: /127.0.0.1:34059
14:51:18.177 INFO clienttrace - src: /127.0.0.1:37026, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741853_1029, duration(ns): 596442
14:51:18.177 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741853_1029, type=LAST_IN_PIPELINE terminating
14:51:18.178 INFO FSNamesystem - BLOCK* blk_1073741853_1029 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/.part-r-00000.sbi
14:51:18.579 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:18.582 INFO StateChange - BLOCK* allocate blk_1073741854_1030, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/.part-r-00000.bai
14:51:18.583 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741854_1030 src: /127.0.0.1:37036 dest: /127.0.0.1:34059
14:51:18.584 INFO clienttrace - src: /127.0.0.1:37036, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741854_1030, duration(ns): 524135
14:51:18.584 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741854_1030, type=LAST_IN_PIPELINE terminating
14:51:18.585 INFO FSNamesystem - BLOCK* blk_1073741854_1030 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/.part-r-00000.bai
14:51:18.986 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:18.987 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0 dst=null perm=null proto=rpc
14:51:18.988 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0 dst=null perm=null proto=rpc
14:51:18.989 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/task_202603041451172122125325142555848_0209_r_000000 dst=null perm=null proto=rpc
14:51:18.990 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/_temporary/attempt_202603041451172122125325142555848_0209_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/task_202603041451172122125325142555848_0209_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:18.990 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451172122125325142555848_0209_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/task_202603041451172122125325142555848_0209_r_000000
14:51:18.990 INFO SparkHadoopMapRedUtil - attempt_202603041451172122125325142555848_0209_r_000000_0: Committed. Elapsed time: 2 ms.
14:51:18.991 INFO Executor - Finished task 0.0 in stage 55.0 (TID 93). 1858 bytes result sent to driver
14:51:18.992 INFO TaskSetManager - Finished task 0.0 in stage 55.0 (TID 93) in 1258 ms on localhost (executor driver) (1/1)
14:51:18.992 INFO TaskSchedulerImpl - Removed TaskSet 55.0, whose tasks have all completed, from pool
14:51:18.992 INFO DAGScheduler - ResultStage 55 (runJob at SparkHadoopWriter.scala:83) finished in 1.270 s
14:51:18.992 INFO DAGScheduler - Job 40 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:18.992 INFO TaskSchedulerImpl - Killing all running tasks in stage 55: Stage finished
14:51:18.992 INFO DAGScheduler - Job 40 finished: runJob at SparkHadoopWriter.scala:83, took 1.367884 s
14:51:18.993 INFO SparkHadoopWriter - Start to commit write Job job_202603041451172122125325142555848_0209.
14:51:18.994 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:18.995 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts dst=null perm=null proto=rpc
14:51:18.995 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/task_202603041451172122125325142555848_0209_r_000000 dst=null perm=null proto=rpc
14:51:18.996 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:18.997 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/task_202603041451172122125325142555848_0209_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:18.997 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:18.998 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/task_202603041451172122125325142555848_0209_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:18.998 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:18.999 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary/0/task_202603041451172122125325142555848_0209_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:19.000 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:19.001 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:19.002 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:19.003 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.spark-staging-209 dst=null perm=null proto=rpc
14:51:19.003 INFO SparkHadoopWriter - Write Job job_202603041451172122125325142555848_0209 committed. Elapsed time: 9 ms.
14:51:19.004 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:19.006 INFO StateChange - BLOCK* allocate blk_1073741855_1031, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/header
14:51:19.007 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741855_1031 src: /127.0.0.1:37040 dest: /127.0.0.1:34059
14:51:19.009 INFO clienttrace - src: /127.0.0.1:37040, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741855_1031, duration(ns): 619521
14:51:19.009 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741855_1031, type=LAST_IN_PIPELINE terminating
14:51:19.010 INFO FSNamesystem - BLOCK* blk_1073741855_1031 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/header
14:51:19.028 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741845_1021 replica FinalizedReplica, blk_1073741845_1021, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741845 for deletion
14:51:19.028 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741845_1021 URI file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741845
14:51:19.411 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:19.412 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:19.414 INFO StateChange - BLOCK* allocate blk_1073741856_1032, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/terminator
14:51:19.415 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741856_1032 src: /127.0.0.1:37048 dest: /127.0.0.1:34059
14:51:19.416 INFO clienttrace - src: /127.0.0.1:37048, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741856_1032, duration(ns): 473192
14:51:19.416 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741856_1032, type=LAST_IN_PIPELINE terminating
14:51:19.417 INFO FSNamesystem - BLOCK* blk_1073741856_1032 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/terminator
14:51:19.818 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:19.819 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts dst=null perm=null proto=rpc
14:51:19.820 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:19.821 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:19.821 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam
14:51:19.822 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:19.822 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:19.823 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:19.823 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam done
14:51:19.824 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:19.824 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.sbi
14:51:19.824 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts dst=null perm=null proto=rpc
14:51:19.825 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:19.827 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:19.827 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:19.829 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:19.830 INFO StateChange - BLOCK* allocate blk_1073741857_1033, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.sbi
14:51:19.831 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741857_1033 src: /127.0.0.1:37050 dest: /127.0.0.1:34059
14:51:19.833 INFO clienttrace - src: /127.0.0.1:37050, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741857_1033, duration(ns): 593529
14:51:19.833 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741857_1033, type=LAST_IN_PIPELINE terminating
14:51:19.834 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:19.834 INFO IndexFileMerger - Done merging .sbi files
14:51:19.834 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai
14:51:19.835 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts dst=null perm=null proto=rpc
14:51:19.836 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:19.837 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:19.837 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:19.839 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:19.840 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:19.842 INFO StateChange - BLOCK* allocate blk_1073741858_1034, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai
14:51:19.843 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741858_1034 src: /127.0.0.1:37066 dest: /127.0.0.1:34059
14:51:19.844 INFO clienttrace - src: /127.0.0.1:37066, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741858_1034, duration(ns): 529908
14:51:19.845 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741858_1034, type=LAST_IN_PIPELINE terminating
14:51:19.845 INFO FSNamesystem - BLOCK* blk_1073741858_1034 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai
14:51:20.247 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:20.247 INFO IndexFileMerger - Done merging .bai files
14:51:20.248 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.parts dst=null perm=null proto=rpc
14:51:20.258 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=null proto=rpc
14:51:20.267 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.sbi dst=null perm=null proto=rpc
14:51:20.268 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.sbi dst=null perm=null proto=rpc
14:51:20.269 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.sbi dst=null perm=null proto=rpc
14:51:20.270 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:20.271 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:20.272 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:20.272 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:20.273 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:20.274 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=null proto=rpc
14:51:20.275 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=null proto=rpc
14:51:20.275 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=null proto=rpc
14:51:20.277 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:20.280 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:20.280 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:20.281 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
14:51:20.281 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.sbi dst=null perm=null proto=rpc
14:51:20.282 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.sbi dst=null perm=null proto=rpc
14:51:20.283 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.sbi dst=null perm=null proto=rpc
14:51:20.285 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:20.285 INFO MemoryStore - Block broadcast_93 stored as values in memory (estimated size 320.0 B, free 1918.0 MiB)
14:51:20.286 INFO MemoryStore - Block broadcast_93_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.0 MiB)
14:51:20.286 INFO BlockManagerInfo - Added broadcast_93_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.7 MiB)
14:51:20.286 INFO SparkContext - Created broadcast 93 from broadcast at BamSource.java:104
14:51:20.288 INFO MemoryStore - Block broadcast_94 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
14:51:20.294 INFO MemoryStore - Block broadcast_94_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
14:51:20.295 INFO BlockManagerInfo - Added broadcast_94_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:20.295 INFO SparkContext - Created broadcast 94 from newAPIHadoopFile at PathSplitSource.java:96
14:51:20.306 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:20.307 INFO FileInputFormat - Total input files to process : 1
14:51:20.307 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:20.323 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:20.324 INFO DAGScheduler - Got job 41 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:20.324 INFO DAGScheduler - Final stage: ResultStage 56 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:20.324 INFO DAGScheduler - Parents of final stage: List()
14:51:20.324 INFO DAGScheduler - Missing parents: List()
14:51:20.324 INFO DAGScheduler - Submitting ResultStage 56 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:20.330 INFO MemoryStore - Block broadcast_95 stored as values in memory (estimated size 148.2 KiB, free 1917.5 MiB)
14:51:20.331 INFO MemoryStore - Block broadcast_95_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.5 MiB)
14:51:20.332 INFO BlockManagerInfo - Added broadcast_95_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.6 MiB)
14:51:20.332 INFO SparkContext - Created broadcast 95 from broadcast at DAGScheduler.scala:1580
14:51:20.332 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 56 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:20.332 INFO TaskSchedulerImpl - Adding task set 56.0 with 1 tasks resource profile 0
14:51:20.333 INFO TaskSetManager - Starting task 0.0 in stage 56.0 (TID 94) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:20.333 INFO Executor - Running task 0.0 in stage 56.0 (TID 94)
14:51:20.348 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam:0+235514
14:51:20.349 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:20.350 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:20.352 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=null proto=rpc
14:51:20.353 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=null proto=rpc
14:51:20.353 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=null proto=rpc
14:51:20.355 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:20.358 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:20.359 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
14:51:20.360 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:20.363 INFO Executor - Finished task 0.0 in stage 56.0 (TID 94). 650184 bytes result sent to driver
14:51:20.365 INFO TaskSetManager - Finished task 0.0 in stage 56.0 (TID 94) in 32 ms on localhost (executor driver) (1/1)
14:51:20.365 INFO TaskSchedulerImpl - Removed TaskSet 56.0, whose tasks have all completed, from pool
14:51:20.365 INFO DAGScheduler - ResultStage 56 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.041 s
14:51:20.366 INFO DAGScheduler - Job 41 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:20.366 INFO TaskSchedulerImpl - Killing all running tasks in stage 56: Stage finished
14:51:20.366 INFO DAGScheduler - Job 41 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.042626 s
14:51:20.376 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:20.376 INFO DAGScheduler - Got job 42 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:20.376 INFO DAGScheduler - Final stage: ResultStage 57 (count at ReadsSparkSinkUnitTest.java:185)
14:51:20.376 INFO DAGScheduler - Parents of final stage: List()
14:51:20.377 INFO DAGScheduler - Missing parents: List()
14:51:20.377 INFO DAGScheduler - Submitting ResultStage 57 (MapPartitionsRDD[197] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:20.395 INFO MemoryStore - Block broadcast_96 stored as values in memory (estimated size 426.1 KiB, free 1917.1 MiB)
14:51:20.396 INFO MemoryStore - Block broadcast_96_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
14:51:20.397 INFO BlockManagerInfo - Added broadcast_96_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.4 MiB)
14:51:20.397 INFO SparkContext - Created broadcast 96 from broadcast at DAGScheduler.scala:1580
14:51:20.397 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 57 (MapPartitionsRDD[197] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:20.397 INFO TaskSchedulerImpl - Adding task set 57.0 with 1 tasks resource profile 0
14:51:20.398 INFO TaskSetManager - Starting task 0.0 in stage 57.0 (TID 95) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
14:51:20.399 INFO Executor - Running task 0.0 in stage 57.0 (TID 95)
14:51:20.436 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
14:51:20.449 INFO Executor - Finished task 0.0 in stage 57.0 (TID 95). 989 bytes result sent to driver
14:51:20.449 INFO TaskSetManager - Finished task 0.0 in stage 57.0 (TID 95) in 51 ms on localhost (executor driver) (1/1)
14:51:20.450 INFO TaskSchedulerImpl - Removed TaskSet 57.0, whose tasks have all completed, from pool
14:51:20.450 INFO DAGScheduler - ResultStage 57 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.073 s
14:51:20.450 INFO DAGScheduler - Job 42 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:20.450 INFO TaskSchedulerImpl - Killing all running tasks in stage 57: Stage finished
14:51:20.450 INFO DAGScheduler - Job 42 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.074122 s
14:51:20.454 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:20.454 INFO DAGScheduler - Got job 43 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:20.454 INFO DAGScheduler - Final stage: ResultStage 58 (count at ReadsSparkSinkUnitTest.java:185)
14:51:20.454 INFO DAGScheduler - Parents of final stage: List()
14:51:20.454 INFO DAGScheduler - Missing parents: List()
14:51:20.455 INFO DAGScheduler - Submitting ResultStage 58 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:20.461 INFO MemoryStore - Block broadcast_97 stored as values in memory (estimated size 148.1 KiB, free 1916.8 MiB)
14:51:20.462 INFO MemoryStore - Block broadcast_97_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.7 MiB)
14:51:20.462 INFO BlockManagerInfo - Added broadcast_97_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.3 MiB)
14:51:20.462 INFO SparkContext - Created broadcast 97 from broadcast at DAGScheduler.scala:1580
14:51:20.463 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 58 (MapPartitionsRDD[215] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:20.463 INFO TaskSchedulerImpl - Adding task set 58.0 with 1 tasks resource profile 0
14:51:20.463 INFO TaskSetManager - Starting task 0.0 in stage 58.0 (TID 96) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:20.464 INFO Executor - Running task 0.0 in stage 58.0 (TID 96)
14:51:20.477 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam:0+235514
14:51:20.478 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:20.479 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam dst=null perm=null proto=rpc
14:51:20.480 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=null proto=rpc
14:51:20.481 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=null proto=rpc
14:51:20.482 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_0ab91b94-e02b-4351-a112-17bd1d634f90.bam.bai dst=null perm=null proto=rpc
14:51:20.484 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:20.486 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:20.487 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:20.489 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
14:51:20.489 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:20.491 INFO Executor - Finished task 0.0 in stage 58.0 (TID 96). 989 bytes result sent to driver
14:51:20.492 INFO TaskSetManager - Finished task 0.0 in stage 58.0 (TID 96) in 29 ms on localhost (executor driver) (1/1)
14:51:20.492 INFO TaskSchedulerImpl - Removed TaskSet 58.0, whose tasks have all completed, from pool
14:51:20.492 INFO DAGScheduler - ResultStage 58 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.037 s
14:51:20.492 INFO DAGScheduler - Job 43 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:20.492 INFO TaskSchedulerImpl - Killing all running tasks in stage 58: Stage finished
14:51:20.492 INFO DAGScheduler - Job 43 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.038355 s
14:51:20.497 INFO MemoryStore - Block broadcast_98 stored as values in memory (estimated size 298.0 KiB, free 1916.4 MiB)
14:51:20.504 INFO MemoryStore - Block broadcast_98_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:20.504 INFO BlockManagerInfo - Added broadcast_98_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:20.505 INFO SparkContext - Created broadcast 98 from newAPIHadoopFile at PathSplitSource.java:96
14:51:20.533 INFO MemoryStore - Block broadcast_99 stored as values in memory (estimated size 298.0 KiB, free 1916.1 MiB)
14:51:20.544 INFO MemoryStore - Block broadcast_99_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
14:51:20.545 INFO BlockManagerInfo - Added broadcast_99_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:20.545 INFO SparkContext - Created broadcast 99 from newAPIHadoopFile at PathSplitSource.java:96
14:51:20.569 INFO FileInputFormat - Total input files to process : 1
14:51:20.570 INFO MemoryStore - Block broadcast_100 stored as values in memory (estimated size 19.6 KiB, free 1916.0 MiB)
14:51:20.571 INFO MemoryStore - Block broadcast_100_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.0 MiB)
14:51:20.571 INFO BlockManagerInfo - Added broadcast_100_piece0 in memory on localhost:44923 (size: 1890.0 B, free: 1919.2 MiB)
14:51:20.571 INFO SparkContext - Created broadcast 100 from broadcast at ReadsSparkSink.java:133
14:51:20.572 INFO MemoryStore - Block broadcast_101 stored as values in memory (estimated size 20.0 KiB, free 1916.0 MiB)
14:51:20.573 INFO MemoryStore - Block broadcast_101_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.0 MiB)
14:51:20.573 INFO BlockManagerInfo - Added broadcast_101_piece0 in memory on localhost:44923 (size: 1890.0 B, free: 1919.2 MiB)
14:51:20.573 INFO SparkContext - Created broadcast 101 from broadcast at BamSink.java:76
14:51:20.576 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts dst=null perm=null proto=rpc
14:51:20.576 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:20.576 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:20.576 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:20.577 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:20.583 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:20.584 INFO DAGScheduler - Registering RDD 229 (mapToPair at SparkUtils.java:161) as input to shuffle 13
14:51:20.584 INFO DAGScheduler - Got job 44 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:20.584 INFO DAGScheduler - Final stage: ResultStage 60 (runJob at SparkHadoopWriter.scala:83)
14:51:20.584 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 59)
14:51:20.584 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 59)
14:51:20.584 INFO DAGScheduler - Submitting ShuffleMapStage 59 (MapPartitionsRDD[229] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:20.602 INFO MemoryStore - Block broadcast_102 stored as values in memory (estimated size 434.3 KiB, free 1915.6 MiB)
14:51:20.604 INFO MemoryStore - Block broadcast_102_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1915.4 MiB)
14:51:20.604 INFO BlockManagerInfo - Added broadcast_102_piece0 in memory on localhost:44923 (size: 157.6 KiB, free: 1919.1 MiB)
14:51:20.605 INFO SparkContext - Created broadcast 102 from broadcast at DAGScheduler.scala:1580
14:51:20.605 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 59 (MapPartitionsRDD[229] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:20.605 INFO TaskSchedulerImpl - Adding task set 59.0 with 1 tasks resource profile 0
14:51:20.606 INFO TaskSetManager - Starting task 0.0 in stage 59.0 (TID 97) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
14:51:20.606 INFO Executor - Running task 0.0 in stage 59.0 (TID 97)
14:51:20.641 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
14:51:20.665 INFO BlockManagerInfo - Removed broadcast_93_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1919.1 MiB)
14:51:20.666 INFO BlockManagerInfo - Removed broadcast_87_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.1 MiB)
14:51:20.667 INFO BlockManagerInfo - Removed broadcast_97_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.2 MiB)
14:51:20.668 INFO BlockManagerInfo - Removed broadcast_94_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.2 MiB)
14:51:20.669 INFO Executor - Finished task 0.0 in stage 59.0 (TID 97). 1191 bytes result sent to driver
14:51:20.669 INFO BlockManagerInfo - Removed broadcast_96_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.4 MiB)
14:51:20.669 INFO TaskSetManager - Finished task 0.0 in stage 59.0 (TID 97) in 63 ms on localhost (executor driver) (1/1)
14:51:20.669 INFO TaskSchedulerImpl - Removed TaskSet 59.0, whose tasks have all completed, from pool
14:51:20.670 INFO DAGScheduler - ShuffleMapStage 59 (mapToPair at SparkUtils.java:161) finished in 0.085 s
14:51:20.670 INFO DAGScheduler - looking for newly runnable stages
14:51:20.670 INFO DAGScheduler - running: HashSet()
14:51:20.670 INFO DAGScheduler - waiting: HashSet(ResultStage 60)
14:51:20.670 INFO DAGScheduler - failed: HashSet()
14:51:20.670 INFO DAGScheduler - Submitting ResultStage 60 (MapPartitionsRDD[234] at mapToPair at BamSink.java:91), which has no missing parents
14:51:20.671 INFO BlockManagerInfo - Removed broadcast_90_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.4 MiB)
14:51:20.673 INFO BlockManagerInfo - Removed broadcast_88_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.5 MiB)
14:51:20.673 INFO BlockManagerInfo - Removed broadcast_89_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:20.674 INFO BlockManagerInfo - Removed broadcast_95_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.5 MiB)
14:51:20.675 INFO BlockManagerInfo - Removed broadcast_91_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.7 MiB)
14:51:20.675 INFO BlockManagerInfo - Removed broadcast_99_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:20.676 INFO BlockManagerInfo - Removed broadcast_92_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.8 MiB)
14:51:20.679 INFO MemoryStore - Block broadcast_103 stored as values in memory (estimated size 155.4 KiB, free 1918.9 MiB)
14:51:20.680 INFO MemoryStore - Block broadcast_103_piece0 stored as bytes in memory (estimated size 58.6 KiB, free 1918.8 MiB)
14:51:20.680 INFO BlockManagerInfo - Added broadcast_103_piece0 in memory on localhost:44923 (size: 58.6 KiB, free: 1919.7 MiB)
14:51:20.681 INFO SparkContext - Created broadcast 103 from broadcast at DAGScheduler.scala:1580
14:51:20.681 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 60 (MapPartitionsRDD[234] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:20.681 INFO TaskSchedulerImpl - Adding task set 60.0 with 1 tasks resource profile 0
14:51:20.682 INFO TaskSetManager - Starting task 0.0 in stage 60.0 (TID 98) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:20.683 INFO Executor - Running task 0.0 in stage 60.0 (TID 98)
14:51:20.689 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:20.690 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:20.705 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:20.705 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:20.705 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:20.705 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:20.705 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:20.705 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:20.707 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:20.708 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:20.709 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:20.713 INFO StateChange - BLOCK* allocate blk_1073741859_1035, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/part-r-00000
14:51:20.714 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741859_1035 src: /127.0.0.1:37112 dest: /127.0.0.1:34059
14:51:20.717 INFO clienttrace - src: /127.0.0.1:37112, dest: /127.0.0.1:34059, bytes: 235299, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741859_1035, duration(ns): 2450266
14:51:20.718 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741859_1035, type=LAST_IN_PIPELINE terminating
14:51:20.719 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:20.720 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:20.721 INFO StateChange - BLOCK* allocate blk_1073741860_1036, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/.part-r-00000.sbi
14:51:20.722 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741860_1036 src: /127.0.0.1:37128 dest: /127.0.0.1:34059
14:51:20.723 INFO clienttrace - src: /127.0.0.1:37128, dest: /127.0.0.1:34059, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741860_1036, duration(ns): 515677
14:51:20.724 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741860_1036, type=LAST_IN_PIPELINE terminating
14:51:20.724 INFO FSNamesystem - BLOCK* blk_1073741860_1036 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/.part-r-00000.sbi
14:51:21.125 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:21.127 INFO StateChange - BLOCK* allocate blk_1073741861_1037, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/.part-r-00000.bai
14:51:21.128 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741861_1037 src: /127.0.0.1:37138 dest: /127.0.0.1:34059
14:51:21.129 INFO clienttrace - src: /127.0.0.1:37138, dest: /127.0.0.1:34059, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741861_1037, duration(ns): 508094
14:51:21.129 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741861_1037, type=LAST_IN_PIPELINE terminating
14:51:21.130 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:21.131 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0 dst=null perm=null proto=rpc
14:51:21.132 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0 dst=null perm=null proto=rpc
14:51:21.133 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/task_202603041451201832022261747192535_0234_r_000000 dst=null perm=null proto=rpc
14:51:21.133 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/_temporary/attempt_202603041451201832022261747192535_0234_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/task_202603041451201832022261747192535_0234_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:21.133 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451201832022261747192535_0234_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/task_202603041451201832022261747192535_0234_r_000000
14:51:21.134 INFO SparkHadoopMapRedUtil - attempt_202603041451201832022261747192535_0234_r_000000_0: Committed. Elapsed time: 1 ms.
14:51:21.134 INFO Executor - Finished task 0.0 in stage 60.0 (TID 98). 1858 bytes result sent to driver
14:51:21.135 INFO TaskSetManager - Finished task 0.0 in stage 60.0 (TID 98) in 453 ms on localhost (executor driver) (1/1)
14:51:21.135 INFO TaskSchedulerImpl - Removed TaskSet 60.0, whose tasks have all completed, from pool
14:51:21.136 INFO DAGScheduler - ResultStage 60 (runJob at SparkHadoopWriter.scala:83) finished in 0.466 s
14:51:21.136 INFO DAGScheduler - Job 44 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:21.136 INFO TaskSchedulerImpl - Killing all running tasks in stage 60: Stage finished
14:51:21.136 INFO DAGScheduler - Job 44 finished: runJob at SparkHadoopWriter.scala:83, took 0.552798 s
14:51:21.137 INFO SparkHadoopWriter - Start to commit write Job job_202603041451201832022261747192535_0234.
14:51:21.137 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:21.138 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts dst=null perm=null proto=rpc
14:51:21.139 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/task_202603041451201832022261747192535_0234_r_000000 dst=null perm=null proto=rpc
14:51:21.139 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:21.140 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/task_202603041451201832022261747192535_0234_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:21.141 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:21.142 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/task_202603041451201832022261747192535_0234_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:21.142 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:21.143 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary/0/task_202603041451201832022261747192535_0234_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:21.144 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:21.145 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:21.145 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:21.146 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.spark-staging-234 dst=null perm=null proto=rpc
14:51:21.147 INFO SparkHadoopWriter - Write Job job_202603041451201832022261747192535_0234 committed. Elapsed time: 9 ms.
14:51:21.147 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:21.149 INFO StateChange - BLOCK* allocate blk_1073741862_1038, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/header
14:51:21.150 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741862_1038 src: /127.0.0.1:37154 dest: /127.0.0.1:34059
14:51:21.151 INFO clienttrace - src: /127.0.0.1:37154, dest: /127.0.0.1:34059, bytes: 1190, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741862_1038, duration(ns): 549497
14:51:21.151 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741862_1038, type=LAST_IN_PIPELINE terminating
14:51:21.152 INFO FSNamesystem - BLOCK* blk_1073741862_1038 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/header
14:51:21.554 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:21.555 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:21.556 INFO StateChange - BLOCK* allocate blk_1073741863_1039, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/terminator
14:51:21.558 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741863_1039 src: /127.0.0.1:37166 dest: /127.0.0.1:34059
14:51:21.559 INFO clienttrace - src: /127.0.0.1:37166, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741863_1039, duration(ns): 636226
14:51:21.559 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741863_1039, type=LAST_IN_PIPELINE terminating
14:51:21.560 INFO FSNamesystem - BLOCK* blk_1073741863_1039 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/terminator
14:51:21.961 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:21.962 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts dst=null perm=null proto=rpc
14:51:21.963 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:21.964 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:21.965 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam
14:51:21.965 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:21.966 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:21.966 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:21.967 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam done
14:51:21.967 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:21.967 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi
14:51:21.968 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts dst=null perm=null proto=rpc
14:51:21.969 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:21.970 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:21.970 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:21.972 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
14:51:21.972 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:21.973 INFO StateChange - BLOCK* allocate blk_1073741864_1040, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi
14:51:21.975 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741864_1040 src: /127.0.0.1:37182 dest: /127.0.0.1:34059
14:51:21.976 INFO clienttrace - src: /127.0.0.1:37182, dest: /127.0.0.1:34059, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741864_1040, duration(ns): 550769
14:51:21.976 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741864_1040, type=LAST_IN_PIPELINE terminating
14:51:21.977 INFO FSNamesystem - BLOCK* blk_1073741864_1040 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi
14:51:22.028 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741853_1029 replica FinalizedReplica, blk_1073741853_1029, FINALIZED
getNumBytes() = 212
getBytesOnDisk() = 212
getVisibleLength()= 212
getVolume() = /tmp/minicluster_storage16268522075870465194/data/data1
getBlockURI() = file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741853 for deletion
14:51:22.028 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741854_1030 replica FinalizedReplica, blk_1073741854_1030, FINALIZED
getNumBytes() = 5472
getBytesOnDisk() = 5472
getVisibleLength()= 5472
getVolume() = /tmp/minicluster_storage16268522075870465194/data/data2
getBlockURI() = file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741854 for deletion
14:51:22.028 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741853_1029 URI file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741853
14:51:22.028 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741854_1030 URI file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741854
14:51:22.378 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:22.378 INFO IndexFileMerger - Done merging .sbi files
14:51:22.378 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai
14:51:22.379 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts dst=null perm=null proto=rpc
14:51:22.380 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:22.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:22.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:22.383 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:22.386 INFO StateChange - BLOCK* allocate blk_1073741865_1041, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai
14:51:22.387 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741865_1041 src: /127.0.0.1:37194 dest: /127.0.0.1:34059
14:51:22.388 INFO clienttrace - src: /127.0.0.1:37194, dest: /127.0.0.1:34059, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741865_1041, duration(ns): 548716
14:51:22.388 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741865_1041, type=LAST_IN_PIPELINE terminating
14:51:22.389 INFO FSNamesystem - BLOCK* blk_1073741865_1041 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai
14:51:22.790 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:22.790 INFO IndexFileMerger - Done merging .bai files
14:51:22.791 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.parts dst=null perm=null proto=rpc
14:51:22.801 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=null proto=rpc
14:51:22.810 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi dst=null perm=null proto=rpc
14:51:22.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi dst=null perm=null proto=rpc
14:51:22.811 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi dst=null perm=null proto=rpc
14:51:22.813 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
14:51:22.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:22.814 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:22.814 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:22.815 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:22.816 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=null proto=rpc
14:51:22.817 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=null proto=rpc
14:51:22.817 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=null proto=rpc
14:51:22.819 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
14:51:22.821 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:22.822 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
14:51:22.822 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi dst=null perm=null proto=rpc
14:51:22.823 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi dst=null perm=null proto=rpc
14:51:22.823 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.sbi dst=null perm=null proto=rpc
14:51:22.824 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
14:51:22.825 INFO MemoryStore - Block broadcast_104 stored as values in memory (estimated size 312.0 B, free 1918.8 MiB)
14:51:22.826 INFO MemoryStore - Block broadcast_104_piece0 stored as bytes in memory (estimated size 231.0 B, free 1918.8 MiB)
14:51:22.826 INFO BlockManagerInfo - Added broadcast_104_piece0 in memory on localhost:44923 (size: 231.0 B, free: 1919.7 MiB)
14:51:22.826 INFO SparkContext - Created broadcast 104 from broadcast at BamSource.java:104
14:51:22.828 INFO MemoryStore - Block broadcast_105 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
14:51:22.839 INFO MemoryStore - Block broadcast_105_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.5 MiB)
14:51:22.839 INFO BlockManagerInfo - Added broadcast_105_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:22.840 INFO SparkContext - Created broadcast 105 from newAPIHadoopFile at PathSplitSource.java:96
14:51:22.856 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:22.857 INFO FileInputFormat - Total input files to process : 1
14:51:22.857 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:22.878 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:22.879 INFO DAGScheduler - Got job 45 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:22.879 INFO DAGScheduler - Final stage: ResultStage 61 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:22.879 INFO DAGScheduler - Parents of final stage: List()
14:51:22.879 INFO DAGScheduler - Missing parents: List()
14:51:22.879 INFO DAGScheduler - Submitting ResultStage 61 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:22.886 INFO MemoryStore - Block broadcast_106 stored as values in memory (estimated size 148.2 KiB, free 1918.3 MiB)
14:51:22.887 INFO MemoryStore - Block broadcast_106_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.3 MiB)
14:51:22.887 INFO BlockManagerInfo - Added broadcast_106_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.6 MiB)
14:51:22.887 INFO SparkContext - Created broadcast 106 from broadcast at DAGScheduler.scala:1580
14:51:22.888 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 61 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:22.888 INFO TaskSchedulerImpl - Adding task set 61.0 with 1 tasks resource profile 0
14:51:22.888 INFO TaskSetManager - Starting task 0.0 in stage 61.0 (TID 99) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:22.889 INFO Executor - Running task 0.0 in stage 61.0 (TID 99)
14:51:22.902 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam:0+236517
14:51:22.903 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:22.904 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:22.905 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=null proto=rpc
14:51:22.906 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=null proto=rpc
14:51:22.906 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=null proto=rpc
14:51:22.908 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
14:51:22.910 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:22.910 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:22.911 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
14:51:22.912 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:22.915 INFO Executor - Finished task 0.0 in stage 61.0 (TID 99). 749470 bytes result sent to driver
14:51:22.916 INFO TaskSetManager - Finished task 0.0 in stage 61.0 (TID 99) in 28 ms on localhost (executor driver) (1/1)
14:51:22.916 INFO TaskSchedulerImpl - Removed TaskSet 61.0, whose tasks have all completed, from pool
14:51:22.917 INFO DAGScheduler - ResultStage 61 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.037 s
14:51:22.917 INFO DAGScheduler - Job 45 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:22.917 INFO TaskSchedulerImpl - Killing all running tasks in stage 61: Stage finished
14:51:22.917 INFO DAGScheduler - Job 45 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.038846 s
14:51:22.932 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:22.932 INFO DAGScheduler - Got job 46 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:22.933 INFO DAGScheduler - Final stage: ResultStage 62 (count at ReadsSparkSinkUnitTest.java:185)
14:51:22.933 INFO DAGScheduler - Parents of final stage: List()
14:51:22.933 INFO DAGScheduler - Missing parents: List()
14:51:22.933 INFO DAGScheduler - Submitting ResultStage 62 (MapPartitionsRDD[222] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:22.950 INFO MemoryStore - Block broadcast_107 stored as values in memory (estimated size 426.1 KiB, free 1917.9 MiB)
14:51:22.952 INFO MemoryStore - Block broadcast_107_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.7 MiB)
14:51:22.952 INFO BlockManagerInfo - Added broadcast_107_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:22.952 INFO SparkContext - Created broadcast 107 from broadcast at DAGScheduler.scala:1580
14:51:22.953 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 62 (MapPartitionsRDD[222] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:22.953 INFO TaskSchedulerImpl - Adding task set 62.0 with 1 tasks resource profile 0
14:51:22.954 INFO TaskSetManager - Starting task 0.0 in stage 62.0 (TID 100) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
14:51:22.954 INFO Executor - Running task 0.0 in stage 62.0 (TID 100)
14:51:22.992 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
14:51:23.004 INFO Executor - Finished task 0.0 in stage 62.0 (TID 100). 989 bytes result sent to driver
14:51:23.005 INFO TaskSetManager - Finished task 0.0 in stage 62.0 (TID 100) in 52 ms on localhost (executor driver) (1/1)
14:51:23.005 INFO TaskSchedulerImpl - Removed TaskSet 62.0, whose tasks have all completed, from pool
14:51:23.005 INFO DAGScheduler - ResultStage 62 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.072 s
14:51:23.005 INFO DAGScheduler - Job 46 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:23.006 INFO TaskSchedulerImpl - Killing all running tasks in stage 62: Stage finished
14:51:23.006 INFO DAGScheduler - Job 46 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.073751 s
14:51:23.009 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:23.010 INFO DAGScheduler - Got job 47 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:23.010 INFO DAGScheduler - Final stage: ResultStage 63 (count at ReadsSparkSinkUnitTest.java:185)
14:51:23.010 INFO DAGScheduler - Parents of final stage: List()
14:51:23.010 INFO DAGScheduler - Missing parents: List()
14:51:23.010 INFO DAGScheduler - Submitting ResultStage 63 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:23.017 INFO MemoryStore - Block broadcast_108 stored as values in memory (estimated size 148.1 KiB, free 1917.6 MiB)
14:51:23.018 INFO MemoryStore - Block broadcast_108_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.5 MiB)
14:51:23.018 INFO BlockManagerInfo - Added broadcast_108_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.4 MiB)
14:51:23.018 INFO SparkContext - Created broadcast 108 from broadcast at DAGScheduler.scala:1580
14:51:23.019 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 63 (MapPartitionsRDD[240] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:23.019 INFO TaskSchedulerImpl - Adding task set 63.0 with 1 tasks resource profile 0
14:51:23.020 INFO TaskSetManager - Starting task 0.0 in stage 63.0 (TID 101) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:23.020 INFO Executor - Running task 0.0 in stage 63.0 (TID 101)
14:51:23.033 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam:0+236517
14:51:23.034 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:23.035 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam dst=null perm=null proto=rpc
14:51:23.035 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=null proto=rpc
14:51:23.036 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=null proto=rpc
14:51:23.037 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_bf12d752-09b6-421b-bd3d-c420126cb019.bam.bai dst=null perm=null proto=rpc
14:51:23.038 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
14:51:23.040 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:23.040 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:23.042 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
14:51:23.042 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:23.044 INFO Executor - Finished task 0.0 in stage 63.0 (TID 101). 989 bytes result sent to driver
14:51:23.045 INFO TaskSetManager - Finished task 0.0 in stage 63.0 (TID 101) in 26 ms on localhost (executor driver) (1/1)
14:51:23.045 INFO TaskSchedulerImpl - Removed TaskSet 63.0, whose tasks have all completed, from pool
14:51:23.045 INFO DAGScheduler - ResultStage 63 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.034 s
14:51:23.046 INFO DAGScheduler - Job 47 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:23.046 INFO TaskSchedulerImpl - Killing all running tasks in stage 63: Stage finished
14:51:23.046 INFO DAGScheduler - Job 47 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.036546 s
14:51:23.055 INFO MemoryStore - Block broadcast_109 stored as values in memory (estimated size 576.0 B, free 1917.5 MiB)
14:51:23.059 INFO MemoryStore - Block broadcast_109_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.5 MiB)
14:51:23.059 INFO BlockManagerInfo - Added broadcast_109_piece0 in memory on localhost:44923 (size: 228.0 B, free: 1919.4 MiB)
14:51:23.059 INFO SparkContext - Created broadcast 109 from broadcast at CramSource.java:114
14:51:23.061 INFO MemoryStore - Block broadcast_110 stored as values in memory (estimated size 297.9 KiB, free 1917.2 MiB)
14:51:23.068 INFO MemoryStore - Block broadcast_110_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.2 MiB)
14:51:23.068 INFO BlockManagerInfo - Added broadcast_110_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:23.068 INFO SparkContext - Created broadcast 110 from newAPIHadoopFile at PathSplitSource.java:96
14:51:23.088 INFO MemoryStore - Block broadcast_111 stored as values in memory (estimated size 576.0 B, free 1917.2 MiB)
14:51:23.088 INFO MemoryStore - Block broadcast_111_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.2 MiB)
14:51:23.089 INFO BlockManagerInfo - Added broadcast_111_piece0 in memory on localhost:44923 (size: 228.0 B, free: 1919.4 MiB)
14:51:23.089 INFO SparkContext - Created broadcast 111 from broadcast at CramSource.java:114
14:51:23.091 INFO MemoryStore - Block broadcast_112 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
14:51:23.097 INFO MemoryStore - Block broadcast_112_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.8 MiB)
14:51:23.097 INFO BlockManagerInfo - Added broadcast_112_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:23.098 INFO SparkContext - Created broadcast 112 from newAPIHadoopFile at PathSplitSource.java:96
14:51:23.116 INFO FileInputFormat - Total input files to process : 1
14:51:23.118 INFO MemoryStore - Block broadcast_113 stored as values in memory (estimated size 6.0 KiB, free 1916.8 MiB)
14:51:23.119 INFO MemoryStore - Block broadcast_113_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1916.8 MiB)
14:51:23.119 INFO BlockManagerInfo - Added broadcast_113_piece0 in memory on localhost:44923 (size: 1473.0 B, free: 1919.3 MiB)
14:51:23.119 INFO SparkContext - Created broadcast 113 from broadcast at ReadsSparkSink.java:133
14:51:23.121 INFO MemoryStore - Block broadcast_114 stored as values in memory (estimated size 6.2 KiB, free 1916.8 MiB)
14:51:23.121 INFO MemoryStore - Block broadcast_114_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1916.8 MiB)
14:51:23.121 INFO BlockManagerInfo - Added broadcast_114_piece0 in memory on localhost:44923 (size: 1473.0 B, free: 1919.3 MiB)
14:51:23.122 INFO SparkContext - Created broadcast 114 from broadcast at CramSink.java:76
14:51:23.128 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts dst=null perm=null proto=rpc
14:51:23.129 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:23.129 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:23.129 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:23.130 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:23.136 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:23.137 INFO DAGScheduler - Registering RDD 252 (mapToPair at SparkUtils.java:161) as input to shuffle 14
14:51:23.138 INFO DAGScheduler - Got job 48 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:23.138 INFO DAGScheduler - Final stage: ResultStage 65 (runJob at SparkHadoopWriter.scala:83)
14:51:23.138 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 64)
14:51:23.138 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 64)
14:51:23.138 INFO DAGScheduler - Submitting ShuffleMapStage 64 (MapPartitionsRDD[252] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:23.159 INFO MemoryStore - Block broadcast_115 stored as values in memory (estimated size 292.8 KiB, free 1916.5 MiB)
14:51:23.160 INFO MemoryStore - Block broadcast_115_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1916.4 MiB)
14:51:23.160 INFO BlockManagerInfo - Added broadcast_115_piece0 in memory on localhost:44923 (size: 107.3 KiB, free: 1919.2 MiB)
14:51:23.160 INFO SparkContext - Created broadcast 115 from broadcast at DAGScheduler.scala:1580
14:51:23.161 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 64 (MapPartitionsRDD[252] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:23.161 INFO TaskSchedulerImpl - Adding task set 64.0 with 1 tasks resource profile 0
14:51:23.162 INFO TaskSetManager - Starting task 0.0 in stage 64.0 (TID 102) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
14:51:23.162 INFO Executor - Running task 0.0 in stage 64.0 (TID 102)
14:51:23.187 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
14:51:23.215 INFO Executor - Finished task 0.0 in stage 64.0 (TID 102). 1148 bytes result sent to driver
14:51:23.216 INFO TaskSetManager - Finished task 0.0 in stage 64.0 (TID 102) in 54 ms on localhost (executor driver) (1/1)
14:51:23.216 INFO TaskSchedulerImpl - Removed TaskSet 64.0, whose tasks have all completed, from pool
14:51:23.216 INFO DAGScheduler - ShuffleMapStage 64 (mapToPair at SparkUtils.java:161) finished in 0.078 s
14:51:23.216 INFO DAGScheduler - looking for newly runnable stages
14:51:23.216 INFO DAGScheduler - running: HashSet()
14:51:23.216 INFO DAGScheduler - waiting: HashSet(ResultStage 65)
14:51:23.216 INFO DAGScheduler - failed: HashSet()
14:51:23.216 INFO DAGScheduler - Submitting ResultStage 65 (MapPartitionsRDD[257] at mapToPair at CramSink.java:89), which has no missing parents
14:51:23.223 INFO MemoryStore - Block broadcast_116 stored as values in memory (estimated size 153.3 KiB, free 1916.3 MiB)
14:51:23.224 INFO MemoryStore - Block broadcast_116_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1916.2 MiB)
14:51:23.224 INFO BlockManagerInfo - Added broadcast_116_piece0 in memory on localhost:44923 (size: 58.1 KiB, free: 1919.2 MiB)
14:51:23.225 INFO SparkContext - Created broadcast 116 from broadcast at DAGScheduler.scala:1580
14:51:23.225 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 65 (MapPartitionsRDD[257] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
14:51:23.225 INFO TaskSchedulerImpl - Adding task set 65.0 with 1 tasks resource profile 0
14:51:23.226 INFO TaskSetManager - Starting task 0.0 in stage 65.0 (TID 103) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:23.227 INFO Executor - Running task 0.0 in stage 65.0 (TID 103)
14:51:23.234 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:23.234 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:23.243 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:23.243 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:23.243 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:23.243 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:23.243 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:23.243 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:23.246 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/_temporary/attempt_202603041451235902408217677748548_0257_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:23.361 INFO StateChange - BLOCK* allocate blk_1073741866_1042, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/_temporary/attempt_202603041451235902408217677748548_0257_r_000000_0/part-r-00000
14:51:23.362 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741866_1042 src: /127.0.0.1:37210 dest: /127.0.0.1:34059
14:51:23.364 INFO clienttrace - src: /127.0.0.1:37210, dest: /127.0.0.1:34059, bytes: 42659, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741866_1042, duration(ns): 670275
14:51:23.364 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741866_1042, type=LAST_IN_PIPELINE terminating
14:51:23.364 INFO FSNamesystem - BLOCK* blk_1073741866_1042 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/_temporary/attempt_202603041451235902408217677748548_0257_r_000000_0/part-r-00000
14:51:23.765 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/_temporary/attempt_202603041451235902408217677748548_0257_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:23.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/_temporary/attempt_202603041451235902408217677748548_0257_r_000000_0 dst=null perm=null proto=rpc
14:51:23.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/_temporary/attempt_202603041451235902408217677748548_0257_r_000000_0 dst=null perm=null proto=rpc
14:51:23.768 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/task_202603041451235902408217677748548_0257_r_000000 dst=null perm=null proto=rpc
14:51:23.769 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/_temporary/attempt_202603041451235902408217677748548_0257_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/task_202603041451235902408217677748548_0257_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:23.769 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451235902408217677748548_0257_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/task_202603041451235902408217677748548_0257_r_000000
14:51:23.769 INFO SparkHadoopMapRedUtil - attempt_202603041451235902408217677748548_0257_r_000000_0: Committed. Elapsed time: 2 ms.
14:51:23.777 INFO Executor - Finished task 0.0 in stage 65.0 (TID 103). 1944 bytes result sent to driver
14:51:23.778 INFO BlockManagerInfo - Removed broadcast_105_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.2 MiB)
14:51:23.778 INFO TaskSetManager - Finished task 0.0 in stage 65.0 (TID 103) in 552 ms on localhost (executor driver) (1/1)
14:51:23.778 INFO TaskSchedulerImpl - Removed TaskSet 65.0, whose tasks have all completed, from pool
14:51:23.779 INFO DAGScheduler - ResultStage 65 (runJob at SparkHadoopWriter.scala:83) finished in 0.562 s
14:51:23.779 INFO DAGScheduler - Job 48 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:23.779 INFO TaskSchedulerImpl - Killing all running tasks in stage 65: Stage finished
14:51:23.779 INFO DAGScheduler - Job 48 finished: runJob at SparkHadoopWriter.scala:83, took 0.642428 s
14:51:23.779 INFO BlockManagerInfo - Removed broadcast_100_piece0 on localhost:44923 in memory (size: 1890.0 B, free: 1919.2 MiB)
14:51:23.780 INFO SparkHadoopWriter - Start to commit write Job job_202603041451235902408217677748548_0257.
14:51:23.780 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:23.781 INFO BlockManagerInfo - Removed broadcast_112_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:23.781 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts dst=null perm=null proto=rpc
14:51:23.781 INFO BlockManagerInfo - Removed broadcast_102_piece0 on localhost:44923 in memory (size: 157.6 KiB, free: 1919.4 MiB)
14:51:23.782 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/task_202603041451235902408217677748548_0257_r_000000 dst=null perm=null proto=rpc
14:51:23.782 INFO BlockManagerInfo - Removed broadcast_111_piece0 on localhost:44923 in memory (size: 228.0 B, free: 1919.4 MiB)
14:51:23.783 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:23.783 INFO BlockManagerInfo - Removed broadcast_107_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.6 MiB)
14:51:23.784 INFO BlockManagerInfo - Removed broadcast_98_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:23.784 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary/0/task_202603041451235902408217677748548_0257_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:23.784 INFO BlockManagerInfo - Removed broadcast_108_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.7 MiB)
14:51:23.785 INFO BlockManagerInfo - Removed broadcast_103_piece0 on localhost:44923 in memory (size: 58.6 KiB, free: 1919.7 MiB)
14:51:23.785 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_temporary dst=null perm=null proto=rpc
14:51:23.786 INFO BlockManagerInfo - Removed broadcast_106_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.8 MiB)
14:51:23.786 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:23.787 INFO BlockManagerInfo - Removed broadcast_104_piece0 on localhost:44923 in memory (size: 231.0 B, free: 1919.8 MiB)
14:51:23.788 INFO BlockManagerInfo - Removed broadcast_101_piece0 on localhost:44923 in memory (size: 1890.0 B, free: 1919.8 MiB)
14:51:23.788 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:23.790 INFO BlockManagerInfo - Removed broadcast_115_piece0 on localhost:44923 in memory (size: 107.3 KiB, free: 1919.9 MiB)
14:51:23.790 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/.spark-staging-257 dst=null perm=null proto=rpc
14:51:23.790 INFO SparkHadoopWriter - Write Job job_202603041451235902408217677748548_0257 committed. Elapsed time: 10 ms.
14:51:23.791 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:23.794 INFO StateChange - BLOCK* allocate blk_1073741867_1043, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/header
14:51:23.795 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741867_1043 src: /127.0.0.1:37218 dest: /127.0.0.1:34059
14:51:23.797 INFO clienttrace - src: /127.0.0.1:37218, dest: /127.0.0.1:34059, bytes: 1016, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741867_1043, duration(ns): 583213
14:51:23.797 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741867_1043, type=LAST_IN_PIPELINE terminating
14:51:23.798 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:23.799 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:23.800 INFO StateChange - BLOCK* allocate blk_1073741868_1044, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/terminator
14:51:23.801 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741868_1044 src: /127.0.0.1:37232 dest: /127.0.0.1:34059
14:51:23.803 INFO clienttrace - src: /127.0.0.1:37232, dest: /127.0.0.1:34059, bytes: 38, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741868_1044, duration(ns): 593232
14:51:23.803 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741868_1044, type=LAST_IN_PIPELINE terminating
14:51:23.804 INFO FSNamesystem - BLOCK* blk_1073741868_1044 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/terminator
14:51:24.205 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:24.206 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts dst=null perm=null proto=rpc
14:51:24.208 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:24.209 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:24.209 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram
14:51:24.210 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/header, /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:24.210 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.211 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts/output dst=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:24.211 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram done
14:51:24.211 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.parts dst=null perm=null proto=rpc
14:51:24.212 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.212 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.213 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.214 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.crai dst=null perm=null proto=rpc
14:51:24.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.crai dst=null perm=null proto=rpc
14:51:24.218 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
14:51:24.219 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
14:51:24.219 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
14:51:24.220 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.221 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.221 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.crai dst=null perm=null proto=rpc
14:51:24.222 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.crai dst=null perm=null proto=rpc
14:51:24.222 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.222 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.224 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
14:51:24.224 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
14:51:24.225 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
14:51:24.225 INFO MemoryStore - Block broadcast_117 stored as values in memory (estimated size 528.0 B, free 1919.4 MiB)
14:51:24.226 INFO MemoryStore - Block broadcast_117_piece0 stored as bytes in memory (estimated size 187.0 B, free 1919.4 MiB)
14:51:24.226 INFO BlockManagerInfo - Added broadcast_117_piece0 in memory on localhost:44923 (size: 187.0 B, free: 1919.9 MiB)
14:51:24.227 INFO SparkContext - Created broadcast 117 from broadcast at CramSource.java:114
14:51:24.228 INFO MemoryStore - Block broadcast_118 stored as values in memory (estimated size 297.9 KiB, free 1919.1 MiB)
14:51:24.234 INFO MemoryStore - Block broadcast_118_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.1 MiB)
14:51:24.235 INFO BlockManagerInfo - Added broadcast_118_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.8 MiB)
14:51:24.235 INFO SparkContext - Created broadcast 118 from newAPIHadoopFile at PathSplitSource.java:96
14:51:24.252 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.253 INFO FileInputFormat - Total input files to process : 1
14:51:24.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.281 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:24.282 INFO DAGScheduler - Got job 49 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:24.282 INFO DAGScheduler - Final stage: ResultStage 66 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:24.282 INFO DAGScheduler - Parents of final stage: List()
14:51:24.282 INFO DAGScheduler - Missing parents: List()
14:51:24.282 INFO DAGScheduler - Submitting ResultStage 66 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:24.295 INFO MemoryStore - Block broadcast_119 stored as values in memory (estimated size 286.8 KiB, free 1918.8 MiB)
14:51:24.296 INFO MemoryStore - Block broadcast_119_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.7 MiB)
14:51:24.296 INFO BlockManagerInfo - Added broadcast_119_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.7 MiB)
14:51:24.296 INFO SparkContext - Created broadcast 119 from broadcast at DAGScheduler.scala:1580
14:51:24.297 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 66 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:24.297 INFO TaskSchedulerImpl - Adding task set 66.0 with 1 tasks resource profile 0
14:51:24.298 INFO TaskSetManager - Starting task 0.0 in stage 66.0 (TID 104) (localhost, executor driver, partition 0, ANY, 7853 bytes)
14:51:24.298 INFO Executor - Running task 0.0 in stage 66.0 (TID 104)
14:51:24.327 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram:0+43713
14:51:24.328 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.329 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.330 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.crai dst=null perm=null proto=rpc
14:51:24.331 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.crai dst=null perm=null proto=rpc
14:51:24.334 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
14:51:24.334 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
14:51:24.335 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
14:51:24.407 INFO Executor - Finished task 0.0 in stage 66.0 (TID 104). 154101 bytes result sent to driver
14:51:24.408 INFO TaskSetManager - Finished task 0.0 in stage 66.0 (TID 104) in 111 ms on localhost (executor driver) (1/1)
14:51:24.408 INFO TaskSchedulerImpl - Removed TaskSet 66.0, whose tasks have all completed, from pool
14:51:24.409 INFO DAGScheduler - ResultStage 66 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.126 s
14:51:24.409 INFO DAGScheduler - Job 49 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:24.409 INFO TaskSchedulerImpl - Killing all running tasks in stage 66: Stage finished
14:51:24.409 INFO DAGScheduler - Job 49 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.127795 s
14:51:24.416 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:24.416 INFO DAGScheduler - Got job 50 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:24.416 INFO DAGScheduler - Final stage: ResultStage 67 (count at ReadsSparkSinkUnitTest.java:185)
14:51:24.416 INFO DAGScheduler - Parents of final stage: List()
14:51:24.417 INFO DAGScheduler - Missing parents: List()
14:51:24.417 INFO DAGScheduler - Submitting ResultStage 67 (MapPartitionsRDD[246] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:24.429 INFO MemoryStore - Block broadcast_120 stored as values in memory (estimated size 286.8 KiB, free 1918.4 MiB)
14:51:24.431 INFO MemoryStore - Block broadcast_120_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.3 MiB)
14:51:24.431 INFO BlockManagerInfo - Added broadcast_120_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.6 MiB)
14:51:24.432 INFO SparkContext - Created broadcast 120 from broadcast at DAGScheduler.scala:1580
14:51:24.432 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 67 (MapPartitionsRDD[246] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:24.432 INFO TaskSchedulerImpl - Adding task set 67.0 with 1 tasks resource profile 0
14:51:24.433 INFO TaskSetManager - Starting task 0.0 in stage 67.0 (TID 105) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
14:51:24.433 INFO Executor - Running task 0.0 in stage 67.0 (TID 105)
14:51:24.460 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
14:51:24.474 INFO Executor - Finished task 0.0 in stage 67.0 (TID 105). 989 bytes result sent to driver
14:51:24.475 INFO TaskSetManager - Finished task 0.0 in stage 67.0 (TID 105) in 43 ms on localhost (executor driver) (1/1)
14:51:24.475 INFO TaskSchedulerImpl - Removed TaskSet 67.0, whose tasks have all completed, from pool
14:51:24.475 INFO DAGScheduler - ResultStage 67 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.058 s
14:51:24.475 INFO DAGScheduler - Job 50 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:24.475 INFO TaskSchedulerImpl - Killing all running tasks in stage 67: Stage finished
14:51:24.475 INFO DAGScheduler - Job 50 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.059250 s
14:51:24.479 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:24.479 INFO DAGScheduler - Got job 51 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:24.479 INFO DAGScheduler - Final stage: ResultStage 68 (count at ReadsSparkSinkUnitTest.java:185)
14:51:24.479 INFO DAGScheduler - Parents of final stage: List()
14:51:24.479 INFO DAGScheduler - Missing parents: List()
14:51:24.480 INFO DAGScheduler - Submitting ResultStage 68 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:24.497 INFO MemoryStore - Block broadcast_121 stored as values in memory (estimated size 286.8 KiB, free 1918.1 MiB)
14:51:24.499 INFO MemoryStore - Block broadcast_121_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.0 MiB)
14:51:24.499 INFO BlockManagerInfo - Added broadcast_121_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.5 MiB)
14:51:24.499 INFO SparkContext - Created broadcast 121 from broadcast at DAGScheduler.scala:1580
14:51:24.500 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 68 (MapPartitionsRDD[263] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:24.500 INFO TaskSchedulerImpl - Adding task set 68.0 with 1 tasks resource profile 0
14:51:24.500 INFO TaskSetManager - Starting task 0.0 in stage 68.0 (TID 106) (localhost, executor driver, partition 0, ANY, 7853 bytes)
14:51:24.501 INFO Executor - Running task 0.0 in stage 68.0 (TID 106)
14:51:24.525 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram:0+43713
14:51:24.526 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.527 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram dst=null perm=null proto=rpc
14:51:24.528 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.cram.crai dst=null perm=null proto=rpc
14:51:24.529 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_d294ad7a-9b81-4fde-ad12-8fc27a90336f.crai dst=null perm=null proto=rpc
14:51:24.531 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
14:51:24.532 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
14:51:24.532 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
14:51:24.555 INFO Executor - Finished task 0.0 in stage 68.0 (TID 106). 989 bytes result sent to driver
14:51:24.555 INFO TaskSetManager - Finished task 0.0 in stage 68.0 (TID 106) in 55 ms on localhost (executor driver) (1/1)
14:51:24.555 INFO TaskSchedulerImpl - Removed TaskSet 68.0, whose tasks have all completed, from pool
14:51:24.556 INFO DAGScheduler - ResultStage 68 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.076 s
14:51:24.556 INFO DAGScheduler - Job 51 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:24.556 INFO TaskSchedulerImpl - Killing all running tasks in stage 68: Stage finished
14:51:24.556 INFO DAGScheduler - Job 51 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.077254 s
14:51:24.561 INFO MemoryStore - Block broadcast_122 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
14:51:24.572 INFO MemoryStore - Block broadcast_122_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.6 MiB)
14:51:24.572 INFO BlockManagerInfo - Added broadcast_122_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:24.573 INFO SparkContext - Created broadcast 122 from newAPIHadoopFile at PathSplitSource.java:96
14:51:24.604 INFO MemoryStore - Block broadcast_123 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
14:51:24.610 INFO MemoryStore - Block broadcast_123_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
14:51:24.610 INFO BlockManagerInfo - Added broadcast_123_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:24.611 INFO SparkContext - Created broadcast 123 from newAPIHadoopFile at PathSplitSource.java:96
14:51:24.633 INFO FileInputFormat - Total input files to process : 1
14:51:24.636 INFO MemoryStore - Block broadcast_124 stored as values in memory (estimated size 160.7 KiB, free 1917.1 MiB)
14:51:24.638 INFO MemoryStore - Block broadcast_124_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.1 MiB)
14:51:24.638 INFO BlockManagerInfo - Added broadcast_124_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:24.638 INFO SparkContext - Created broadcast 124 from broadcast at ReadsSparkSink.java:133
14:51:24.652 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts dst=null perm=null proto=rpc
14:51:24.653 INFO deprecation - mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
14:51:24.654 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
14:51:24.654 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:24.654 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:24.655 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:24.663 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:24.664 INFO DAGScheduler - Registering RDD 277 (mapToPair at SparkUtils.java:161) as input to shuffle 15
14:51:24.664 INFO DAGScheduler - Got job 52 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:24.664 INFO DAGScheduler - Final stage: ResultStage 70 (runJob at SparkHadoopWriter.scala:83)
14:51:24.664 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 69)
14:51:24.665 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 69)
14:51:24.665 INFO DAGScheduler - Submitting ShuffleMapStage 69 (MapPartitionsRDD[277] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:24.684 INFO MemoryStore - Block broadcast_125 stored as values in memory (estimated size 520.4 KiB, free 1916.6 MiB)
14:51:24.686 INFO MemoryStore - Block broadcast_125_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.4 MiB)
14:51:24.686 INFO BlockManagerInfo - Added broadcast_125_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.3 MiB)
14:51:24.686 INFO SparkContext - Created broadcast 125 from broadcast at DAGScheduler.scala:1580
14:51:24.686 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 69 (MapPartitionsRDD[277] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:24.686 INFO TaskSchedulerImpl - Adding task set 69.0 with 1 tasks resource profile 0
14:51:24.687 INFO TaskSetManager - Starting task 0.0 in stage 69.0 (TID 107) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:24.688 INFO Executor - Running task 0.0 in stage 69.0 (TID 107)
14:51:24.725 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:24.745 INFO Executor - Finished task 0.0 in stage 69.0 (TID 107). 1148 bytes result sent to driver
14:51:24.746 INFO TaskSetManager - Finished task 0.0 in stage 69.0 (TID 107) in 59 ms on localhost (executor driver) (1/1)
14:51:24.746 INFO TaskSchedulerImpl - Removed TaskSet 69.0, whose tasks have all completed, from pool
14:51:24.746 INFO DAGScheduler - ShuffleMapStage 69 (mapToPair at SparkUtils.java:161) finished in 0.081 s
14:51:24.746 INFO DAGScheduler - looking for newly runnable stages
14:51:24.746 INFO DAGScheduler - running: HashSet()
14:51:24.746 INFO DAGScheduler - waiting: HashSet(ResultStage 70)
14:51:24.746 INFO DAGScheduler - failed: HashSet()
14:51:24.747 INFO DAGScheduler - Submitting ResultStage 70 (MapPartitionsRDD[283] at saveAsTextFile at SamSink.java:65), which has no missing parents
14:51:24.754 INFO MemoryStore - Block broadcast_126 stored as values in memory (estimated size 241.1 KiB, free 1916.2 MiB)
14:51:24.755 INFO MemoryStore - Block broadcast_126_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.1 MiB)
14:51:24.755 INFO BlockManagerInfo - Added broadcast_126_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.2 MiB)
14:51:24.756 INFO SparkContext - Created broadcast 126 from broadcast at DAGScheduler.scala:1580
14:51:24.756 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 70 (MapPartitionsRDD[283] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
14:51:24.756 INFO TaskSchedulerImpl - Adding task set 70.0 with 1 tasks resource profile 0
14:51:24.757 INFO TaskSetManager - Starting task 0.0 in stage 70.0 (TID 108) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:24.757 INFO Executor - Running task 0.0 in stage 70.0 (TID 108)
14:51:24.764 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:24.764 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:24.779 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
14:51:24.779 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:24.779 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:24.781 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/_temporary/attempt_202603041451249101639025582394826_0283_m_000000_0/part-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:24.784 INFO StateChange - BLOCK* allocate blk_1073741869_1045, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/_temporary/attempt_202603041451249101639025582394826_0283_m_000000_0/part-00000
14:51:24.786 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741869_1045 src: /127.0.0.1:37238 dest: /127.0.0.1:34059
14:51:24.793 INFO clienttrace - src: /127.0.0.1:37238, dest: /127.0.0.1:34059, bytes: 761729, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741869_1045, duration(ns): 6332862
14:51:24.793 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741869_1045, type=LAST_IN_PIPELINE terminating
14:51:24.794 INFO FSNamesystem - BLOCK* blk_1073741869_1045 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/_temporary/attempt_202603041451249101639025582394826_0283_m_000000_0/part-00000
14:51:25.027 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741860_1036 replica FinalizedReplica, blk_1073741860_1036, FINALIZED
getNumBytes() = 204
getBytesOnDisk() = 204
getVisibleLength()= 204
getVolume() = /tmp/minicluster_storage16268522075870465194/data/data2
getBlockURI() = file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741860 for deletion
14:51:25.027 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741861_1037 replica FinalizedReplica, blk_1073741861_1037, FINALIZED
getNumBytes() = 592
getBytesOnDisk() = 592
getVisibleLength()= 592
getVolume() = /tmp/minicluster_storage16268522075870465194/data/data1
getBlockURI() = file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741861 for deletion
14:51:25.027 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741860_1036 URI file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741860
14:51:25.027 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741861_1037 URI file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741861
14:51:25.195 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/_temporary/attempt_202603041451249101639025582394826_0283_m_000000_0/part-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:25.196 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/_temporary/attempt_202603041451249101639025582394826_0283_m_000000_0 dst=null perm=null proto=rpc
14:51:25.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/_temporary/attempt_202603041451249101639025582394826_0283_m_000000_0 dst=null perm=null proto=rpc
14:51:25.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/task_202603041451249101639025582394826_0283_m_000000 dst=null perm=null proto=rpc
14:51:25.198 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/_temporary/attempt_202603041451249101639025582394826_0283_m_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/task_202603041451249101639025582394826_0283_m_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:25.198 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451249101639025582394826_0283_m_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/task_202603041451249101639025582394826_0283_m_000000
14:51:25.198 INFO SparkHadoopMapRedUtil - attempt_202603041451249101639025582394826_0283_m_000000_0: Committed. Elapsed time: 1 ms.
14:51:25.199 INFO Executor - Finished task 0.0 in stage 70.0 (TID 108). 1858 bytes result sent to driver
14:51:25.200 INFO TaskSetManager - Finished task 0.0 in stage 70.0 (TID 108) in 443 ms on localhost (executor driver) (1/1)
14:51:25.200 INFO TaskSchedulerImpl - Removed TaskSet 70.0, whose tasks have all completed, from pool
14:51:25.200 INFO DAGScheduler - ResultStage 70 (runJob at SparkHadoopWriter.scala:83) finished in 0.453 s
14:51:25.200 INFO DAGScheduler - Job 52 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:25.200 INFO TaskSchedulerImpl - Killing all running tasks in stage 70: Stage finished
14:51:25.200 INFO DAGScheduler - Job 52 finished: runJob at SparkHadoopWriter.scala:83, took 0.537026 s
14:51:25.201 INFO SparkHadoopWriter - Start to commit write Job job_202603041451249101639025582394826_0283.
14:51:25.202 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:25.203 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts dst=null perm=null proto=rpc
14:51:25.203 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/task_202603041451249101639025582394826_0283_m_000000 dst=null perm=null proto=rpc
14:51:25.204 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/part-00000 dst=null perm=null proto=rpc
14:51:25.205 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary/0/task_202603041451249101639025582394826_0283_m_000000/part-00000 dst=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/part-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:25.205 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_temporary dst=null perm=null proto=rpc
14:51:25.206 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:25.207 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:25.208 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/.spark-staging-283 dst=null perm=null proto=rpc
14:51:25.208 INFO SparkHadoopWriter - Write Job job_202603041451249101639025582394826_0283 committed. Elapsed time: 7 ms.
14:51:25.209 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:25.212 INFO StateChange - BLOCK* allocate blk_1073741870_1046, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/header
14:51:25.213 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741870_1046 src: /127.0.0.1:49284 dest: /127.0.0.1:34059
14:51:25.215 INFO clienttrace - src: /127.0.0.1:49284, dest: /127.0.0.1:34059, bytes: 85829, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741870_1046, duration(ns): 711062
14:51:25.215 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741870_1046, type=LAST_IN_PIPELINE terminating
14:51:25.216 INFO FSNamesystem - BLOCK* blk_1073741870_1046 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/header
14:51:25.617 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:25.618 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts dst=null perm=null proto=rpc
14:51:25.619 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:25.620 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:25.620 INFO HadoopFileSystemWrapper - Concatenating 2 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam
14:51:25.621 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/header, /user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/part-00000] dst=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:25.621 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
14:51:25.622 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:25.622 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam done
14:51:25.622 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam.parts dst=null perm=null proto=rpc
14:51:25.623 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
14:51:25.623 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
14:51:25.624 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
14:51:25.624 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
WARNING 2026-03-04 14:51:25 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
14:51:25.627 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
14:51:25.628 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
14:51:25.629 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
14:51:25.629 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
WARNING 2026-03-04 14:51:25 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
14:51:25.632 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
14:51:25.633 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
14:51:25.634 INFO MemoryStore - Block broadcast_127 stored as values in memory (estimated size 160.7 KiB, free 1916.0 MiB)
14:51:25.635 INFO MemoryStore - Block broadcast_127_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
14:51:25.636 INFO BlockManagerInfo - Added broadcast_127_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.2 MiB)
14:51:25.636 INFO SparkContext - Created broadcast 127 from broadcast at SamSource.java:78
14:51:25.638 INFO MemoryStore - Block broadcast_128 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
14:51:25.649 INFO MemoryStore - Block broadcast_128_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.6 MiB)
14:51:25.649 INFO BlockManagerInfo - Added broadcast_128_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.1 MiB)
14:51:25.649 INFO SparkContext - Created broadcast 128 from newAPIHadoopFile at SamSource.java:108
14:51:25.659 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
14:51:25.659 INFO FileInputFormat - Total input files to process : 1
14:51:25.660 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
14:51:25.673 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:25.674 INFO DAGScheduler - Got job 53 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:25.674 INFO DAGScheduler - Final stage: ResultStage 71 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:25.674 INFO DAGScheduler - Parents of final stage: List()
14:51:25.674 INFO DAGScheduler - Missing parents: List()
14:51:25.674 INFO DAGScheduler - Submitting ResultStage 71 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:25.675 INFO MemoryStore - Block broadcast_129 stored as values in memory (estimated size 7.5 KiB, free 1915.6 MiB)
14:51:25.675 INFO MemoryStore - Block broadcast_129_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1915.6 MiB)
14:51:25.676 INFO BlockManagerInfo - Added broadcast_129_piece0 in memory on localhost:44923 (size: 3.8 KiB, free: 1919.1 MiB)
14:51:25.676 INFO SparkContext - Created broadcast 129 from broadcast at DAGScheduler.scala:1580
14:51:25.676 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 71 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:25.676 INFO TaskSchedulerImpl - Adding task set 71.0 with 1 tasks resource profile 0
14:51:25.677 INFO TaskSetManager - Starting task 0.0 in stage 71.0 (TID 109) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:25.677 INFO Executor - Running task 0.0 in stage 71.0 (TID 109)
14:51:25.679 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam:0+847558
14:51:25.684 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
14:51:25.692 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
14:51:25.733 INFO Executor - Finished task 0.0 in stage 71.0 (TID 109). 651526 bytes result sent to driver
14:51:25.735 INFO TaskSetManager - Finished task 0.0 in stage 71.0 (TID 109) in 58 ms on localhost (executor driver) (1/1)
14:51:25.735 INFO TaskSchedulerImpl - Removed TaskSet 71.0, whose tasks have all completed, from pool
14:51:25.736 INFO DAGScheduler - ResultStage 71 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.062 s
14:51:25.736 INFO DAGScheduler - Job 53 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:25.736 INFO TaskSchedulerImpl - Killing all running tasks in stage 71: Stage finished
14:51:25.736 INFO DAGScheduler - Job 53 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.062839 s
14:51:25.746 INFO BlockManagerInfo - Removed broadcast_120_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.2 MiB)
14:51:25.748 INFO BlockManagerInfo - Removed broadcast_119_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.3 MiB)
14:51:25.748 INFO BlockManagerInfo - Removed broadcast_114_piece0 on localhost:44923 in memory (size: 1473.0 B, free: 1919.3 MiB)
14:51:25.749 INFO BlockManagerInfo - Removed broadcast_125_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.5 MiB)
14:51:25.750 INFO BlockManagerInfo - Removed broadcast_121_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.6 MiB)
14:51:25.750 INFO BlockManagerInfo - Removed broadcast_126_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.7 MiB)
14:51:25.751 INFO BlockManagerInfo - Removed broadcast_116_piece0 on localhost:44923 in memory (size: 58.1 KiB, free: 1919.7 MiB)
14:51:25.752 INFO BlockManagerInfo - Removed broadcast_123_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:25.752 INFO BlockManagerInfo - Removed broadcast_124_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:25.753 INFO BlockManagerInfo - Removed broadcast_109_piece0 on localhost:44923 in memory (size: 228.0 B, free: 1919.8 MiB)
14:51:25.754 INFO BlockManagerInfo - Removed broadcast_129_piece0 on localhost:44923 in memory (size: 3.8 KiB, free: 1919.8 MiB)
14:51:25.755 INFO BlockManagerInfo - Removed broadcast_110_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:25.755 INFO BlockManagerInfo - Removed broadcast_117_piece0 on localhost:44923 in memory (size: 187.0 B, free: 1919.8 MiB)
14:51:25.756 INFO BlockManagerInfo - Removed broadcast_113_piece0 on localhost:44923 in memory (size: 1473.0 B, free: 1919.8 MiB)
14:51:25.758 INFO BlockManagerInfo - Removed broadcast_118_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.9 MiB)
14:51:25.758 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:25.759 INFO DAGScheduler - Got job 54 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:25.759 INFO DAGScheduler - Final stage: ResultStage 72 (count at ReadsSparkSinkUnitTest.java:185)
14:51:25.759 INFO DAGScheduler - Parents of final stage: List()
14:51:25.759 INFO DAGScheduler - Missing parents: List()
14:51:25.759 INFO DAGScheduler - Submitting ResultStage 72 (MapPartitionsRDD[270] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:25.777 INFO MemoryStore - Block broadcast_130 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
14:51:25.778 INFO MemoryStore - Block broadcast_130_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
14:51:25.779 INFO BlockManagerInfo - Added broadcast_130_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.7 MiB)
14:51:25.779 INFO SparkContext - Created broadcast 130 from broadcast at DAGScheduler.scala:1580
14:51:25.779 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 72 (MapPartitionsRDD[270] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:25.779 INFO TaskSchedulerImpl - Adding task set 72.0 with 1 tasks resource profile 0
14:51:25.780 INFO TaskSetManager - Starting task 0.0 in stage 72.0 (TID 110) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:25.781 INFO Executor - Running task 0.0 in stage 72.0 (TID 110)
14:51:25.828 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:25.839 INFO Executor - Finished task 0.0 in stage 72.0 (TID 110). 989 bytes result sent to driver
14:51:25.840 INFO TaskSetManager - Finished task 0.0 in stage 72.0 (TID 110) in 60 ms on localhost (executor driver) (1/1)
14:51:25.840 INFO TaskSchedulerImpl - Removed TaskSet 72.0, whose tasks have all completed, from pool
14:51:25.840 INFO DAGScheduler - ResultStage 72 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.081 s
14:51:25.840 INFO DAGScheduler - Job 54 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:25.840 INFO TaskSchedulerImpl - Killing all running tasks in stage 72: Stage finished
14:51:25.840 INFO DAGScheduler - Job 54 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.081722 s
14:51:25.844 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:25.844 INFO DAGScheduler - Got job 55 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:25.844 INFO DAGScheduler - Final stage: ResultStage 73 (count at ReadsSparkSinkUnitTest.java:185)
14:51:25.844 INFO DAGScheduler - Parents of final stage: List()
14:51:25.844 INFO DAGScheduler - Missing parents: List()
14:51:25.844 INFO DAGScheduler - Submitting ResultStage 73 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:25.845 INFO MemoryStore - Block broadcast_131 stored as values in memory (estimated size 7.4 KiB, free 1918.6 MiB)
14:51:25.846 INFO MemoryStore - Block broadcast_131_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.6 MiB)
14:51:25.846 INFO BlockManagerInfo - Added broadcast_131_piece0 in memory on localhost:44923 (size: 3.8 KiB, free: 1919.7 MiB)
14:51:25.847 INFO SparkContext - Created broadcast 131 from broadcast at DAGScheduler.scala:1580
14:51:25.847 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 73 (MapPartitionsRDD[288] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:25.847 INFO TaskSchedulerImpl - Adding task set 73.0 with 1 tasks resource profile 0
14:51:25.848 INFO TaskSetManager - Starting task 0.0 in stage 73.0 (TID 111) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:25.848 INFO Executor - Running task 0.0 in stage 73.0 (TID 111)
14:51:25.850 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam:0+847558
14:51:25.852 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_e7e1f1d2-4ed6-4e09-bfce-3a6bf9ce613e.sam dst=null perm=null proto=rpc
14:51:25.853 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
14:51:25.863 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
14:51:25.865 INFO Executor - Finished task 0.0 in stage 73.0 (TID 111). 989 bytes result sent to driver
14:51:25.865 INFO TaskSetManager - Finished task 0.0 in stage 73.0 (TID 111) in 18 ms on localhost (executor driver) (1/1)
14:51:25.865 INFO TaskSchedulerImpl - Removed TaskSet 73.0, whose tasks have all completed, from pool
14:51:25.866 INFO DAGScheduler - ResultStage 73 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.021 s
14:51:25.866 INFO DAGScheduler - Job 55 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:25.866 INFO TaskSchedulerImpl - Killing all running tasks in stage 73: Stage finished
14:51:25.866 INFO DAGScheduler - Job 55 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.022171 s
14:51:25.870 INFO MemoryStore - Block broadcast_132 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
14:51:25.876 INFO MemoryStore - Block broadcast_132_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
14:51:25.877 INFO BlockManagerInfo - Added broadcast_132_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:25.877 INFO SparkContext - Created broadcast 132 from newAPIHadoopFile at PathSplitSource.java:96
14:51:25.907 INFO MemoryStore - Block broadcast_133 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
14:51:25.913 INFO MemoryStore - Block broadcast_133_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
14:51:25.914 INFO BlockManagerInfo - Added broadcast_133_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:25.914 INFO SparkContext - Created broadcast 133 from newAPIHadoopFile at PathSplitSource.java:96
14:51:25.937 INFO MemoryStore - Block broadcast_134 stored as values in memory (estimated size 160.7 KiB, free 1917.7 MiB)
14:51:25.938 INFO MemoryStore - Block broadcast_134_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
14:51:25.938 INFO BlockManagerInfo - Added broadcast_134_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:25.938 INFO SparkContext - Created broadcast 134 from broadcast at ReadsSparkSink.java:133
14:51:25.941 INFO MemoryStore - Block broadcast_135 stored as values in memory (estimated size 163.2 KiB, free 1917.6 MiB)
14:51:25.942 INFO MemoryStore - Block broadcast_135_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.6 MiB)
14:51:25.942 INFO BlockManagerInfo - Added broadcast_135_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:25.943 INFO SparkContext - Created broadcast 135 from broadcast at AnySamSinkMultiple.java:80
14:51:25.947 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:25.948 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:25.948 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:25.965 INFO FileInputFormat - Total input files to process : 1
14:51:25.974 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:25.975 INFO DAGScheduler - Registering RDD 296 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 16
14:51:25.975 INFO DAGScheduler - Got job 56 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
14:51:25.975 INFO DAGScheduler - Final stage: ResultStage 75 (runJob at SparkHadoopWriter.scala:83)
14:51:25.975 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 74)
14:51:25.975 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 74)
14:51:25.976 INFO DAGScheduler - Submitting ShuffleMapStage 74 (MapPartitionsRDD[296] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:25.996 INFO MemoryStore - Block broadcast_136 stored as values in memory (estimated size 427.7 KiB, free 1917.1 MiB)
14:51:25.997 INFO MemoryStore - Block broadcast_136_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1917.0 MiB)
14:51:25.998 INFO BlockManagerInfo - Added broadcast_136_piece0 in memory on localhost:44923 (size: 154.6 KiB, free: 1919.5 MiB)
14:51:25.998 INFO SparkContext - Created broadcast 136 from broadcast at DAGScheduler.scala:1580
14:51:25.998 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 74 (MapPartitionsRDD[296] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
14:51:25.998 INFO TaskSchedulerImpl - Adding task set 74.0 with 1 tasks resource profile 0
14:51:25.999 INFO TaskSetManager - Starting task 0.0 in stage 74.0 (TID 112) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:25.999 INFO Executor - Running task 0.0 in stage 74.0 (TID 112)
14:51:26.034 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:26.064 INFO Executor - Finished task 0.0 in stage 74.0 (TID 112). 1149 bytes result sent to driver
14:51:26.064 INFO TaskSetManager - Finished task 0.0 in stage 74.0 (TID 112) in 65 ms on localhost (executor driver) (1/1)
14:51:26.064 INFO TaskSchedulerImpl - Removed TaskSet 74.0, whose tasks have all completed, from pool
14:51:26.065 INFO DAGScheduler - ShuffleMapStage 74 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.089 s
14:51:26.065 INFO DAGScheduler - looking for newly runnable stages
14:51:26.065 INFO DAGScheduler - running: HashSet()
14:51:26.065 INFO DAGScheduler - waiting: HashSet(ResultStage 75)
14:51:26.065 INFO DAGScheduler - failed: HashSet()
14:51:26.065 INFO DAGScheduler - Submitting ResultStage 75 (MapPartitionsRDD[308] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
14:51:26.073 INFO MemoryStore - Block broadcast_137 stored as values in memory (estimated size 150.2 KiB, free 1916.8 MiB)
14:51:26.073 INFO MemoryStore - Block broadcast_137_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1916.8 MiB)
14:51:26.074 INFO BlockManagerInfo - Added broadcast_137_piece0 in memory on localhost:44923 (size: 56.2 KiB, free: 1919.4 MiB)
14:51:26.074 INFO SparkContext - Created broadcast 137 from broadcast at DAGScheduler.scala:1580
14:51:26.074 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 75 (MapPartitionsRDD[308] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
14:51:26.074 INFO TaskSchedulerImpl - Adding task set 75.0 with 2 tasks resource profile 0
14:51:26.075 INFO TaskSetManager - Starting task 0.0 in stage 75.0 (TID 113) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:26.075 INFO TaskSetManager - Starting task 1.0 in stage 75.0 (TID 114) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:26.076 INFO Executor - Running task 1.0 in stage 75.0 (TID 114)
14:51:26.076 INFO Executor - Running task 0.0 in stage 75.0 (TID 113)
14:51:26.083 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.083 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.083 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.083 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.083 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.083 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.084 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.084 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.084 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.084 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.084 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.084 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.095 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.095 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.103 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.103 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.107 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451251594255251621806110_0308_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest19070785983991903442.bam/_temporary/0/task_202603041451251594255251621806110_0308_r_000001
14:51:26.107 INFO SparkHadoopMapRedUtil - attempt_202603041451251594255251621806110_0308_r_000001_0: Committed. Elapsed time: 0 ms.
14:51:26.108 INFO Executor - Finished task 1.0 in stage 75.0 (TID 114). 1729 bytes result sent to driver
14:51:26.108 INFO TaskSetManager - Finished task 1.0 in stage 75.0 (TID 114) in 33 ms on localhost (executor driver) (1/2)
14:51:26.111 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451251594255251621806110_0308_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest19070785983991903442.bam/_temporary/0/task_202603041451251594255251621806110_0308_r_000000
14:51:26.111 INFO SparkHadoopMapRedUtil - attempt_202603041451251594255251621806110_0308_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:26.112 INFO Executor - Finished task 0.0 in stage 75.0 (TID 113). 1729 bytes result sent to driver
14:51:26.112 INFO TaskSetManager - Finished task 0.0 in stage 75.0 (TID 113) in 37 ms on localhost (executor driver) (2/2)
14:51:26.112 INFO TaskSchedulerImpl - Removed TaskSet 75.0, whose tasks have all completed, from pool
14:51:26.113 INFO DAGScheduler - ResultStage 75 (runJob at SparkHadoopWriter.scala:83) finished in 0.047 s
14:51:26.113 INFO DAGScheduler - Job 56 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:26.113 INFO TaskSchedulerImpl - Killing all running tasks in stage 75: Stage finished
14:51:26.113 INFO DAGScheduler - Job 56 finished: runJob at SparkHadoopWriter.scala:83, took 0.138629 s
14:51:26.114 INFO SparkHadoopWriter - Start to commit write Job job_202603041451251594255251621806110_0308.
14:51:26.121 INFO SparkHadoopWriter - Write Job job_202603041451251594255251621806110_0308 committed. Elapsed time: 6 ms.
14:51:26.125 INFO MemoryStore - Block broadcast_138 stored as values in memory (estimated size 297.9 KiB, free 1916.5 MiB)
14:51:26.135 INFO MemoryStore - Block broadcast_138_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.5 MiB)
14:51:26.135 INFO BlockManagerInfo - Added broadcast_138_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:26.136 INFO SparkContext - Created broadcast 138 from newAPIHadoopFile at PathSplitSource.java:96
14:51:26.166 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:26.167 INFO DAGScheduler - Got job 57 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:26.167 INFO DAGScheduler - Final stage: ResultStage 77 (count at ReadsSparkSinkUnitTest.java:222)
14:51:26.167 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 76)
14:51:26.167 INFO DAGScheduler - Missing parents: List()
14:51:26.167 INFO DAGScheduler - Submitting ResultStage 77 (MapPartitionsRDD[299] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:26.168 INFO MemoryStore - Block broadcast_139 stored as values in memory (estimated size 6.3 KiB, free 1916.4 MiB)
14:51:26.169 INFO MemoryStore - Block broadcast_139_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.4 MiB)
14:51:26.169 INFO BlockManagerInfo - Added broadcast_139_piece0 in memory on localhost:44923 (size: 3.4 KiB, free: 1919.4 MiB)
14:51:26.169 INFO SparkContext - Created broadcast 139 from broadcast at DAGScheduler.scala:1580
14:51:26.169 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 77 (MapPartitionsRDD[299] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
14:51:26.169 INFO TaskSchedulerImpl - Adding task set 77.0 with 2 tasks resource profile 0
14:51:26.170 INFO TaskSetManager - Starting task 0.0 in stage 77.0 (TID 115) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:26.170 INFO TaskSetManager - Starting task 1.0 in stage 77.0 (TID 116) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:26.171 INFO Executor - Running task 0.0 in stage 77.0 (TID 115)
14:51:26.171 INFO Executor - Running task 1.0 in stage 77.0 (TID 116)
14:51:26.173 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.173 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.173 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.173 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.178 INFO Executor - Finished task 0.0 in stage 77.0 (TID 115). 1634 bytes result sent to driver
14:51:26.178 INFO Executor - Finished task 1.0 in stage 77.0 (TID 116). 1591 bytes result sent to driver
14:51:26.178 INFO TaskSetManager - Finished task 0.0 in stage 77.0 (TID 115) in 8 ms on localhost (executor driver) (1/2)
14:51:26.179 INFO TaskSetManager - Finished task 1.0 in stage 77.0 (TID 116) in 9 ms on localhost (executor driver) (2/2)
14:51:26.179 INFO TaskSchedulerImpl - Removed TaskSet 77.0, whose tasks have all completed, from pool
14:51:26.179 INFO DAGScheduler - ResultStage 77 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.011 s
14:51:26.179 INFO DAGScheduler - Job 57 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:26.179 INFO TaskSchedulerImpl - Killing all running tasks in stage 77: Stage finished
14:51:26.179 INFO DAGScheduler - Job 57 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.013126 s
14:51:26.195 INFO FileInputFormat - Total input files to process : 2
14:51:26.200 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:26.200 INFO DAGScheduler - Got job 58 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:26.200 INFO DAGScheduler - Final stage: ResultStage 78 (count at ReadsSparkSinkUnitTest.java:222)
14:51:26.201 INFO DAGScheduler - Parents of final stage: List()
14:51:26.201 INFO DAGScheduler - Missing parents: List()
14:51:26.201 INFO DAGScheduler - Submitting ResultStage 78 (MapPartitionsRDD[315] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:26.220 INFO MemoryStore - Block broadcast_140 stored as values in memory (estimated size 426.1 KiB, free 1916.0 MiB)
14:51:26.222 INFO MemoryStore - Block broadcast_140_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.9 MiB)
14:51:26.222 INFO BlockManagerInfo - Added broadcast_140_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.2 MiB)
14:51:26.222 INFO SparkContext - Created broadcast 140 from broadcast at DAGScheduler.scala:1580
14:51:26.223 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 78 (MapPartitionsRDD[315] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
14:51:26.223 INFO TaskSchedulerImpl - Adding task set 78.0 with 2 tasks resource profile 0
14:51:26.223 INFO TaskSetManager - Starting task 0.0 in stage 78.0 (TID 117) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
14:51:26.224 INFO TaskSetManager - Starting task 1.0 in stage 78.0 (TID 118) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
14:51:26.224 INFO Executor - Running task 0.0 in stage 78.0 (TID 117)
14:51:26.224 INFO Executor - Running task 1.0 in stage 78.0 (TID 118)
14:51:26.256 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest19070785983991903442.bam/part-r-00001.bam:0+129330
14:51:26.266 INFO Executor - Finished task 1.0 in stage 78.0 (TID 118). 989 bytes result sent to driver
14:51:26.267 INFO TaskSetManager - Finished task 1.0 in stage 78.0 (TID 118) in 43 ms on localhost (executor driver) (1/2)
14:51:26.272 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest19070785983991903442.bam/part-r-00000.bam:0+132492
14:51:26.286 INFO Executor - Finished task 0.0 in stage 78.0 (TID 117). 989 bytes result sent to driver
14:51:26.286 INFO TaskSetManager - Finished task 0.0 in stage 78.0 (TID 117) in 63 ms on localhost (executor driver) (2/2)
14:51:26.286 INFO TaskSchedulerImpl - Removed TaskSet 78.0, whose tasks have all completed, from pool
14:51:26.286 INFO DAGScheduler - ResultStage 78 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.085 s
14:51:26.287 INFO DAGScheduler - Job 58 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:26.287 INFO TaskSchedulerImpl - Killing all running tasks in stage 78: Stage finished
14:51:26.287 INFO DAGScheduler - Job 58 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.087292 s
14:51:26.291 INFO MemoryStore - Block broadcast_141 stored as values in memory (estimated size 297.9 KiB, free 1915.6 MiB)
14:51:26.297 INFO MemoryStore - Block broadcast_141_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.5 MiB)
14:51:26.297 INFO BlockManagerInfo - Added broadcast_141_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.2 MiB)
14:51:26.298 INFO SparkContext - Created broadcast 141 from newAPIHadoopFile at PathSplitSource.java:96
14:51:26.325 INFO MemoryStore - Block broadcast_142 stored as values in memory (estimated size 297.9 KiB, free 1915.2 MiB)
14:51:26.331 INFO MemoryStore - Block broadcast_142_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.2 MiB)
14:51:26.331 INFO BlockManagerInfo - Added broadcast_142_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.1 MiB)
14:51:26.332 INFO SparkContext - Created broadcast 142 from newAPIHadoopFile at PathSplitSource.java:96
14:51:26.358 INFO MemoryStore - Block broadcast_143 stored as values in memory (estimated size 160.7 KiB, free 1915.0 MiB)
14:51:26.370 INFO MemoryStore - Block broadcast_143_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.0 MiB)
14:51:26.370 INFO BlockManagerInfo - Added broadcast_143_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.1 MiB)
14:51:26.370 INFO SparkContext - Created broadcast 143 from broadcast at ReadsSparkSink.java:133
14:51:26.370 INFO BlockManagerInfo - Removed broadcast_122_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.2 MiB)
14:51:26.371 INFO BlockManagerInfo - Removed broadcast_131_piece0 on localhost:44923 in memory (size: 3.8 KiB, free: 1919.2 MiB)
14:51:26.372 INFO BlockManagerInfo - Removed broadcast_127_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:26.372 INFO BlockManagerInfo - Removed broadcast_130_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.3 MiB)
14:51:26.373 INFO MemoryStore - Block broadcast_144 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
14:51:26.373 INFO BlockManagerInfo - Removed broadcast_135_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.3 MiB)
14:51:26.374 INFO BlockManagerInfo - Removed broadcast_132_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:26.374 INFO MemoryStore - Block broadcast_144_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.5 MiB)
14:51:26.374 INFO BlockManagerInfo - Removed broadcast_136_piece0 on localhost:44923 in memory (size: 154.6 KiB, free: 1919.5 MiB)
14:51:26.375 INFO BlockManagerInfo - Added broadcast_144_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.5 MiB)
14:51:26.375 INFO BlockManagerInfo - Removed broadcast_140_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:26.375 INFO SparkContext - Created broadcast 144 from broadcast at AnySamSinkMultiple.java:80
14:51:26.375 INFO BlockManagerInfo - Removed broadcast_128_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:26.376 INFO BlockManagerInfo - Removed broadcast_142_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:26.377 INFO BlockManagerInfo - Removed broadcast_137_piece0 on localhost:44923 in memory (size: 56.2 KiB, free: 1919.8 MiB)
14:51:26.377 INFO BlockManagerInfo - Removed broadcast_138_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.9 MiB)
14:51:26.378 INFO BlockManagerInfo - Removed broadcast_139_piece0 on localhost:44923 in memory (size: 3.4 KiB, free: 1919.9 MiB)
14:51:26.379 INFO BlockManagerInfo - Removed broadcast_133_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.9 MiB)
14:51:26.379 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.379 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.379 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.379 INFO BlockManagerInfo - Removed broadcast_134_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.9 MiB)
14:51:26.396 INFO FileInputFormat - Total input files to process : 1
14:51:26.406 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:26.406 INFO DAGScheduler - Registering RDD 323 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 17
14:51:26.407 INFO DAGScheduler - Got job 59 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
14:51:26.407 INFO DAGScheduler - Final stage: ResultStage 80 (runJob at SparkHadoopWriter.scala:83)
14:51:26.407 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 79)
14:51:26.407 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 79)
14:51:26.407 INFO DAGScheduler - Submitting ShuffleMapStage 79 (MapPartitionsRDD[323] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:26.428 INFO MemoryStore - Block broadcast_145 stored as values in memory (estimated size 427.7 KiB, free 1918.9 MiB)
14:51:26.429 INFO MemoryStore - Block broadcast_145_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.8 MiB)
14:51:26.429 INFO BlockManagerInfo - Added broadcast_145_piece0 in memory on localhost:44923 (size: 154.6 KiB, free: 1919.8 MiB)
14:51:26.430 INFO SparkContext - Created broadcast 145 from broadcast at DAGScheduler.scala:1580
14:51:26.430 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 79 (MapPartitionsRDD[323] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
14:51:26.430 INFO TaskSchedulerImpl - Adding task set 79.0 with 1 tasks resource profile 0
14:51:26.431 INFO TaskSetManager - Starting task 0.0 in stage 79.0 (TID 119) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:26.431 INFO Executor - Running task 0.0 in stage 79.0 (TID 119)
14:51:26.465 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:26.487 INFO Executor - Finished task 0.0 in stage 79.0 (TID 119). 1149 bytes result sent to driver
14:51:26.487 INFO TaskSetManager - Finished task 0.0 in stage 79.0 (TID 119) in 56 ms on localhost (executor driver) (1/1)
14:51:26.487 INFO TaskSchedulerImpl - Removed TaskSet 79.0, whose tasks have all completed, from pool
14:51:26.488 INFO DAGScheduler - ShuffleMapStage 79 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.080 s
14:51:26.488 INFO DAGScheduler - looking for newly runnable stages
14:51:26.488 INFO DAGScheduler - running: HashSet()
14:51:26.488 INFO DAGScheduler - waiting: HashSet(ResultStage 80)
14:51:26.488 INFO DAGScheduler - failed: HashSet()
14:51:26.488 INFO DAGScheduler - Submitting ResultStage 80 (MapPartitionsRDD[335] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
14:51:26.495 INFO MemoryStore - Block broadcast_146 stored as values in memory (estimated size 150.2 KiB, free 1918.6 MiB)
14:51:26.496 INFO MemoryStore - Block broadcast_146_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1918.6 MiB)
14:51:26.496 INFO BlockManagerInfo - Added broadcast_146_piece0 in memory on localhost:44923 (size: 56.2 KiB, free: 1919.7 MiB)
14:51:26.496 INFO SparkContext - Created broadcast 146 from broadcast at DAGScheduler.scala:1580
14:51:26.497 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 80 (MapPartitionsRDD[335] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
14:51:26.497 INFO TaskSchedulerImpl - Adding task set 80.0 with 2 tasks resource profile 0
14:51:26.497 INFO TaskSetManager - Starting task 0.0 in stage 80.0 (TID 120) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:26.498 INFO TaskSetManager - Starting task 1.0 in stage 80.0 (TID 121) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:26.498 INFO Executor - Running task 0.0 in stage 80.0 (TID 120)
14:51:26.498 INFO Executor - Running task 1.0 in stage 80.0 (TID 121)
14:51:26.505 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.505 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.506 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.506 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.506 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.506 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.506 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.506 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.506 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.506 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.506 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.506 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.518 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.518 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.522 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.522 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.528 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451267335296760343501923_0335_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest14894733613397570801.bam/_temporary/0/task_202603041451267335296760343501923_0335_r_000000
14:51:26.528 INFO SparkHadoopMapRedUtil - attempt_202603041451267335296760343501923_0335_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:26.529 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451267335296760343501923_0335_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest14894733613397570801.bam/_temporary/0/task_202603041451267335296760343501923_0335_r_000001
14:51:26.529 INFO SparkHadoopMapRedUtil - attempt_202603041451267335296760343501923_0335_r_000001_0: Committed. Elapsed time: 0 ms.
14:51:26.529 INFO Executor - Finished task 0.0 in stage 80.0 (TID 120). 1729 bytes result sent to driver
14:51:26.530 INFO Executor - Finished task 1.0 in stage 80.0 (TID 121). 1729 bytes result sent to driver
14:51:26.530 INFO TaskSetManager - Finished task 0.0 in stage 80.0 (TID 120) in 33 ms on localhost (executor driver) (1/2)
14:51:26.530 INFO TaskSetManager - Finished task 1.0 in stage 80.0 (TID 121) in 32 ms on localhost (executor driver) (2/2)
14:51:26.531 INFO TaskSchedulerImpl - Removed TaskSet 80.0, whose tasks have all completed, from pool
14:51:26.531 INFO DAGScheduler - ResultStage 80 (runJob at SparkHadoopWriter.scala:83) finished in 0.043 s
14:51:26.531 INFO DAGScheduler - Job 59 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:26.531 INFO TaskSchedulerImpl - Killing all running tasks in stage 80: Stage finished
14:51:26.531 INFO DAGScheduler - Job 59 finished: runJob at SparkHadoopWriter.scala:83, took 0.125268 s
14:51:26.532 INFO SparkHadoopWriter - Start to commit write Job job_202603041451267335296760343501923_0335.
14:51:26.538 INFO SparkHadoopWriter - Write Job job_202603041451267335296760343501923_0335 committed. Elapsed time: 6 ms.
14:51:26.542 INFO MemoryStore - Block broadcast_147 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
14:51:26.553 INFO MemoryStore - Block broadcast_147_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
14:51:26.553 INFO BlockManagerInfo - Added broadcast_147_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:26.553 INFO SparkContext - Created broadcast 147 from newAPIHadoopFile at PathSplitSource.java:96
14:51:26.580 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:26.580 INFO DAGScheduler - Got job 60 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:26.580 INFO DAGScheduler - Final stage: ResultStage 82 (count at ReadsSparkSinkUnitTest.java:222)
14:51:26.580 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 81)
14:51:26.580 INFO DAGScheduler - Missing parents: List()
14:51:26.581 INFO DAGScheduler - Submitting ResultStage 82 (MapPartitionsRDD[326] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:26.582 INFO MemoryStore - Block broadcast_148 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
14:51:26.582 INFO MemoryStore - Block broadcast_148_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
14:51:26.582 INFO BlockManagerInfo - Added broadcast_148_piece0 in memory on localhost:44923 (size: 3.4 KiB, free: 1919.7 MiB)
14:51:26.583 INFO SparkContext - Created broadcast 148 from broadcast at DAGScheduler.scala:1580
14:51:26.583 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 82 (MapPartitionsRDD[326] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
14:51:26.583 INFO TaskSchedulerImpl - Adding task set 82.0 with 2 tasks resource profile 0
14:51:26.584 INFO TaskSetManager - Starting task 0.0 in stage 82.0 (TID 122) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:26.584 INFO TaskSetManager - Starting task 1.0 in stage 82.0 (TID 123) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:26.584 INFO Executor - Running task 1.0 in stage 82.0 (TID 123)
14:51:26.584 INFO Executor - Running task 0.0 in stage 82.0 (TID 122)
14:51:26.586 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.586 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.586 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.586 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.591 INFO Executor - Finished task 1.0 in stage 82.0 (TID 123). 1634 bytes result sent to driver
14:51:26.591 INFO Executor - Finished task 0.0 in stage 82.0 (TID 122). 1634 bytes result sent to driver
14:51:26.591 INFO TaskSetManager - Finished task 1.0 in stage 82.0 (TID 123) in 7 ms on localhost (executor driver) (1/2)
14:51:26.591 INFO TaskSetManager - Finished task 0.0 in stage 82.0 (TID 122) in 7 ms on localhost (executor driver) (2/2)
14:51:26.592 INFO TaskSchedulerImpl - Removed TaskSet 82.0, whose tasks have all completed, from pool
14:51:26.592 INFO DAGScheduler - ResultStage 82 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.011 s
14:51:26.592 INFO DAGScheduler - Job 60 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:26.592 INFO TaskSchedulerImpl - Killing all running tasks in stage 82: Stage finished
14:51:26.592 INFO DAGScheduler - Job 60 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.012023 s
14:51:26.607 INFO FileInputFormat - Total input files to process : 2
14:51:26.612 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:26.612 INFO DAGScheduler - Got job 61 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:26.612 INFO DAGScheduler - Final stage: ResultStage 83 (count at ReadsSparkSinkUnitTest.java:222)
14:51:26.612 INFO DAGScheduler - Parents of final stage: List()
14:51:26.612 INFO DAGScheduler - Missing parents: List()
14:51:26.613 INFO DAGScheduler - Submitting ResultStage 83 (MapPartitionsRDD[342] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:26.631 INFO MemoryStore - Block broadcast_149 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
14:51:26.632 INFO MemoryStore - Block broadcast_149_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
14:51:26.632 INFO BlockManagerInfo - Added broadcast_149_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:26.633 INFO SparkContext - Created broadcast 149 from broadcast at DAGScheduler.scala:1580
14:51:26.633 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 83 (MapPartitionsRDD[342] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
14:51:26.633 INFO TaskSchedulerImpl - Adding task set 83.0 with 2 tasks resource profile 0
14:51:26.634 INFO TaskSetManager - Starting task 0.0 in stage 83.0 (TID 124) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
14:51:26.634 INFO TaskSetManager - Starting task 1.0 in stage 83.0 (TID 125) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
14:51:26.634 INFO Executor - Running task 1.0 in stage 83.0 (TID 125)
14:51:26.634 INFO Executor - Running task 0.0 in stage 83.0 (TID 124)
14:51:26.668 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest14894733613397570801.bam/part-r-00001.bam:0+129330
14:51:26.678 INFO Executor - Finished task 1.0 in stage 83.0 (TID 125). 989 bytes result sent to driver
14:51:26.679 INFO TaskSetManager - Finished task 1.0 in stage 83.0 (TID 125) in 45 ms on localhost (executor driver) (1/2)
14:51:26.683 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest14894733613397570801.bam/part-r-00000.bam:0+132492
14:51:26.697 INFO Executor - Finished task 0.0 in stage 83.0 (TID 124). 989 bytes result sent to driver
14:51:26.697 INFO TaskSetManager - Finished task 0.0 in stage 83.0 (TID 124) in 63 ms on localhost (executor driver) (2/2)
14:51:26.697 INFO TaskSchedulerImpl - Removed TaskSet 83.0, whose tasks have all completed, from pool
14:51:26.698 INFO DAGScheduler - ResultStage 83 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.085 s
14:51:26.698 INFO DAGScheduler - Job 61 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:26.698 INFO TaskSchedulerImpl - Killing all running tasks in stage 83: Stage finished
14:51:26.698 INFO DAGScheduler - Job 61 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.086093 s
14:51:26.701 INFO MemoryStore - Block broadcast_150 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
14:51:26.707 INFO MemoryStore - Block broadcast_150_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
14:51:26.708 INFO BlockManagerInfo - Added broadcast_150_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:26.708 INFO SparkContext - Created broadcast 150 from newAPIHadoopFile at PathSplitSource.java:96
14:51:26.733 INFO MemoryStore - Block broadcast_151 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
14:51:26.740 INFO MemoryStore - Block broadcast_151_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
14:51:26.740 INFO BlockManagerInfo - Added broadcast_151_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:26.740 INFO SparkContext - Created broadcast 151 from newAPIHadoopFile at PathSplitSource.java:96
14:51:26.762 INFO MemoryStore - Block broadcast_152 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
14:51:26.763 INFO MemoryStore - Block broadcast_152_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
14:51:26.763 INFO BlockManagerInfo - Added broadcast_152_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:26.763 INFO SparkContext - Created broadcast 152 from broadcast at ReadsSparkSink.java:133
14:51:26.765 INFO MemoryStore - Block broadcast_153 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
14:51:26.765 INFO MemoryStore - Block broadcast_153_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
14:51:26.765 INFO BlockManagerInfo - Added broadcast_153_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:26.766 INFO SparkContext - Created broadcast 153 from broadcast at AnySamSinkMultiple.java:80
14:51:26.768 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.768 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.768 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.782 INFO FileInputFormat - Total input files to process : 1
14:51:26.789 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:26.789 INFO DAGScheduler - Registering RDD 350 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 18
14:51:26.790 INFO DAGScheduler - Got job 62 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
14:51:26.790 INFO DAGScheduler - Final stage: ResultStage 85 (runJob at SparkHadoopWriter.scala:83)
14:51:26.790 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 84)
14:51:26.790 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 84)
14:51:26.790 INFO DAGScheduler - Submitting ShuffleMapStage 84 (MapPartitionsRDD[350] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:26.814 INFO MemoryStore - Block broadcast_154 stored as values in memory (estimated size 427.7 KiB, free 1916.2 MiB)
14:51:26.816 INFO MemoryStore - Block broadcast_154_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1916.1 MiB)
14:51:26.816 INFO BlockManagerInfo - Added broadcast_154_piece0 in memory on localhost:44923 (size: 154.6 KiB, free: 1919.3 MiB)
14:51:26.816 INFO SparkContext - Created broadcast 154 from broadcast at DAGScheduler.scala:1580
14:51:26.816 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 84 (MapPartitionsRDD[350] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
14:51:26.816 INFO TaskSchedulerImpl - Adding task set 84.0 with 1 tasks resource profile 0
14:51:26.817 INFO TaskSetManager - Starting task 0.0 in stage 84.0 (TID 126) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:26.818 INFO Executor - Running task 0.0 in stage 84.0 (TID 126)
14:51:26.853 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:26.872 INFO Executor - Finished task 0.0 in stage 84.0 (TID 126). 1149 bytes result sent to driver
14:51:26.873 INFO TaskSetManager - Finished task 0.0 in stage 84.0 (TID 126) in 56 ms on localhost (executor driver) (1/1)
14:51:26.873 INFO TaskSchedulerImpl - Removed TaskSet 84.0, whose tasks have all completed, from pool
14:51:26.873 INFO DAGScheduler - ShuffleMapStage 84 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.083 s
14:51:26.873 INFO DAGScheduler - looking for newly runnable stages
14:51:26.873 INFO DAGScheduler - running: HashSet()
14:51:26.873 INFO DAGScheduler - waiting: HashSet(ResultStage 85)
14:51:26.873 INFO DAGScheduler - failed: HashSet()
14:51:26.873 INFO DAGScheduler - Submitting ResultStage 85 (MapPartitionsRDD[362] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
14:51:26.880 INFO MemoryStore - Block broadcast_155 stored as values in memory (estimated size 150.2 KiB, free 1915.9 MiB)
14:51:26.881 INFO MemoryStore - Block broadcast_155_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1915.9 MiB)
14:51:26.881 INFO BlockManagerInfo - Added broadcast_155_piece0 in memory on localhost:44923 (size: 56.2 KiB, free: 1919.2 MiB)
14:51:26.881 INFO SparkContext - Created broadcast 155 from broadcast at DAGScheduler.scala:1580
14:51:26.881 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 85 (MapPartitionsRDD[362] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
14:51:26.881 INFO TaskSchedulerImpl - Adding task set 85.0 with 2 tasks resource profile 0
14:51:26.882 INFO TaskSetManager - Starting task 0.0 in stage 85.0 (TID 127) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:26.882 INFO TaskSetManager - Starting task 1.0 in stage 85.0 (TID 128) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:26.883 INFO Executor - Running task 1.0 in stage 85.0 (TID 128)
14:51:26.883 INFO Executor - Running task 0.0 in stage 85.0 (TID 127)
14:51:26.890 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.890 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.890 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.890 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.890 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.890 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.890 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.890 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:26.890 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.890 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:26.890 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.890 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:26.902 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.902 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.907 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.907 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.911 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451262429791548052275789_0362_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest11793418366314784782.bam/_temporary/0/task_202603041451262429791548052275789_0362_r_000001
14:51:26.911 INFO SparkHadoopMapRedUtil - attempt_202603041451262429791548052275789_0362_r_000001_0: Committed. Elapsed time: 0 ms.
14:51:26.912 INFO Executor - Finished task 1.0 in stage 85.0 (TID 128). 1729 bytes result sent to driver
14:51:26.913 INFO TaskSetManager - Finished task 1.0 in stage 85.0 (TID 128) in 31 ms on localhost (executor driver) (1/2)
14:51:26.916 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451262429791548052275789_0362_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest11793418366314784782.bam/_temporary/0/task_202603041451262429791548052275789_0362_r_000000
14:51:26.916 INFO SparkHadoopMapRedUtil - attempt_202603041451262429791548052275789_0362_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:26.917 INFO Executor - Finished task 0.0 in stage 85.0 (TID 127). 1729 bytes result sent to driver
14:51:26.917 INFO TaskSetManager - Finished task 0.0 in stage 85.0 (TID 127) in 35 ms on localhost (executor driver) (2/2)
14:51:26.917 INFO TaskSchedulerImpl - Removed TaskSet 85.0, whose tasks have all completed, from pool
14:51:26.918 INFO DAGScheduler - ResultStage 85 (runJob at SparkHadoopWriter.scala:83) finished in 0.044 s
14:51:26.918 INFO DAGScheduler - Job 62 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:26.918 INFO TaskSchedulerImpl - Killing all running tasks in stage 85: Stage finished
14:51:26.918 INFO DAGScheduler - Job 62 finished: runJob at SparkHadoopWriter.scala:83, took 0.128914 s
14:51:26.918 INFO SparkHadoopWriter - Start to commit write Job job_202603041451262429791548052275789_0362.
14:51:26.925 INFO SparkHadoopWriter - Write Job job_202603041451262429791548052275789_0362 committed. Elapsed time: 6 ms.
14:51:26.929 INFO MemoryStore - Block broadcast_156 stored as values in memory (estimated size 297.9 KiB, free 1915.6 MiB)
14:51:26.943 INFO BlockManagerInfo - Removed broadcast_155_piece0 on localhost:44923 in memory (size: 56.2 KiB, free: 1919.3 MiB)
14:51:26.944 INFO BlockManagerInfo - Removed broadcast_147_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:26.945 INFO BlockManagerInfo - Removed broadcast_143_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.3 MiB)
14:51:26.946 INFO BlockManagerInfo - Removed broadcast_141_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:26.947 INFO BlockManagerInfo - Removed broadcast_152_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.4 MiB)
14:51:26.947 INFO BlockManagerInfo - Removed broadcast_153_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.4 MiB)
14:51:26.948 INFO BlockManagerInfo - Removed broadcast_148_piece0 on localhost:44923 in memory (size: 3.4 KiB, free: 1919.4 MiB)
14:51:26.948 INFO MemoryStore - Block broadcast_156_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
14:51:26.948 INFO BlockManagerInfo - Added broadcast_156_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:26.949 INFO SparkContext - Created broadcast 156 from newAPIHadoopFile at PathSplitSource.java:96
14:51:26.949 INFO BlockManagerInfo - Removed broadcast_145_piece0 on localhost:44923 in memory (size: 154.6 KiB, free: 1919.5 MiB)
14:51:26.949 INFO BlockManagerInfo - Removed broadcast_149_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.6 MiB)
14:51:26.950 INFO BlockManagerInfo - Removed broadcast_144_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.6 MiB)
14:51:26.951 INFO BlockManagerInfo - Removed broadcast_154_piece0 on localhost:44923 in memory (size: 154.6 KiB, free: 1919.8 MiB)
14:51:26.955 INFO BlockManagerInfo - Removed broadcast_151_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:26.956 INFO BlockManagerInfo - Removed broadcast_146_piece0 on localhost:44923 in memory (size: 56.2 KiB, free: 1919.9 MiB)
14:51:26.975 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:26.976 INFO DAGScheduler - Got job 63 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:26.976 INFO DAGScheduler - Final stage: ResultStage 87 (count at ReadsSparkSinkUnitTest.java:222)
14:51:26.976 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 86)
14:51:26.976 INFO DAGScheduler - Missing parents: List()
14:51:26.976 INFO DAGScheduler - Submitting ResultStage 87 (MapPartitionsRDD[353] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:26.977 INFO MemoryStore - Block broadcast_157 stored as values in memory (estimated size 6.3 KiB, free 1919.3 MiB)
14:51:26.977 INFO MemoryStore - Block broadcast_157_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1919.3 MiB)
14:51:26.977 INFO BlockManagerInfo - Added broadcast_157_piece0 in memory on localhost:44923 (size: 3.4 KiB, free: 1919.9 MiB)
14:51:26.978 INFO SparkContext - Created broadcast 157 from broadcast at DAGScheduler.scala:1580
14:51:26.978 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 87 (MapPartitionsRDD[353] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
14:51:26.978 INFO TaskSchedulerImpl - Adding task set 87.0 with 2 tasks resource profile 0
14:51:26.979 INFO TaskSetManager - Starting task 0.0 in stage 87.0 (TID 129) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:26.979 INFO TaskSetManager - Starting task 1.0 in stage 87.0 (TID 130) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:26.979 INFO Executor - Running task 1.0 in stage 87.0 (TID 130)
14:51:26.979 INFO Executor - Running task 0.0 in stage 87.0 (TID 129)
14:51:26.982 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.982 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:26.982 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.982 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:26.987 INFO Executor - Finished task 0.0 in stage 87.0 (TID 129). 1634 bytes result sent to driver
14:51:26.987 INFO Executor - Finished task 1.0 in stage 87.0 (TID 130). 1634 bytes result sent to driver
14:51:26.987 INFO TaskSetManager - Finished task 1.0 in stage 87.0 (TID 130) in 8 ms on localhost (executor driver) (1/2)
14:51:26.988 INFO TaskSetManager - Finished task 0.0 in stage 87.0 (TID 129) in 9 ms on localhost (executor driver) (2/2)
14:51:26.988 INFO TaskSchedulerImpl - Removed TaskSet 87.0, whose tasks have all completed, from pool
14:51:26.988 INFO DAGScheduler - ResultStage 87 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.012 s
14:51:26.988 INFO DAGScheduler - Job 63 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:26.989 INFO TaskSchedulerImpl - Killing all running tasks in stage 87: Stage finished
14:51:26.989 INFO DAGScheduler - Job 63 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.013684 s
14:51:27.008 INFO FileInputFormat - Total input files to process : 2
14:51:27.012 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:27.012 INFO DAGScheduler - Got job 64 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:27.012 INFO DAGScheduler - Final stage: ResultStage 88 (count at ReadsSparkSinkUnitTest.java:222)
14:51:27.012 INFO DAGScheduler - Parents of final stage: List()
14:51:27.012 INFO DAGScheduler - Missing parents: List()
14:51:27.013 INFO DAGScheduler - Submitting ResultStage 88 (MapPartitionsRDD[369] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:27.035 INFO MemoryStore - Block broadcast_158 stored as values in memory (estimated size 426.1 KiB, free 1918.9 MiB)
14:51:27.037 INFO MemoryStore - Block broadcast_158_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.7 MiB)
14:51:27.037 INFO BlockManagerInfo - Added broadcast_158_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.7 MiB)
14:51:27.037 INFO SparkContext - Created broadcast 158 from broadcast at DAGScheduler.scala:1580
14:51:27.038 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 88 (MapPartitionsRDD[369] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
14:51:27.038 INFO TaskSchedulerImpl - Adding task set 88.0 with 2 tasks resource profile 0
14:51:27.038 INFO TaskSetManager - Starting task 0.0 in stage 88.0 (TID 131) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
14:51:27.038 INFO TaskSetManager - Starting task 1.0 in stage 88.0 (TID 132) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
14:51:27.039 INFO Executor - Running task 1.0 in stage 88.0 (TID 132)
14:51:27.039 INFO Executor - Running task 0.0 in stage 88.0 (TID 131)
14:51:27.090 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest11793418366314784782.bam/part-r-00000.bam:0+132492
14:51:27.090 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest11793418366314784782.bam/part-r-00001.bam:0+129330
14:51:27.104 INFO Executor - Finished task 1.0 in stage 88.0 (TID 132). 989 bytes result sent to driver
14:51:27.104 INFO Executor - Finished task 0.0 in stage 88.0 (TID 131). 989 bytes result sent to driver
14:51:27.104 INFO TaskSetManager - Finished task 0.0 in stage 88.0 (TID 131) in 66 ms on localhost (executor driver) (1/2)
14:51:27.104 INFO TaskSetManager - Finished task 1.0 in stage 88.0 (TID 132) in 66 ms on localhost (executor driver) (2/2)
14:51:27.104 INFO TaskSchedulerImpl - Removed TaskSet 88.0, whose tasks have all completed, from pool
14:51:27.105 INFO DAGScheduler - ResultStage 88 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.092 s
14:51:27.105 INFO DAGScheduler - Job 64 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:27.105 INFO TaskSchedulerImpl - Killing all running tasks in stage 88: Stage finished
14:51:27.105 INFO DAGScheduler - Job 64 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.093211 s
14:51:27.108 INFO MemoryStore - Block broadcast_159 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
14:51:27.114 INFO MemoryStore - Block broadcast_159_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
14:51:27.115 INFO BlockManagerInfo - Added broadcast_159_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:27.115 INFO SparkContext - Created broadcast 159 from newAPIHadoopFile at PathSplitSource.java:96
14:51:27.140 INFO MemoryStore - Block broadcast_160 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:27.146 INFO MemoryStore - Block broadcast_160_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
14:51:27.147 INFO BlockManagerInfo - Added broadcast_160_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:27.147 INFO SparkContext - Created broadcast 160 from newAPIHadoopFile at PathSplitSource.java:96
14:51:27.168 INFO MemoryStore - Block broadcast_161 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
14:51:27.169 INFO MemoryStore - Block broadcast_161_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
14:51:27.170 INFO BlockManagerInfo - Added broadcast_161_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:27.170 INFO SparkContext - Created broadcast 161 from broadcast at ReadsSparkSink.java:133
14:51:27.172 INFO MemoryStore - Block broadcast_162 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
14:51:27.172 INFO MemoryStore - Block broadcast_162_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
14:51:27.173 INFO BlockManagerInfo - Added broadcast_162_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:27.173 INFO SparkContext - Created broadcast 162 from broadcast at AnySamSinkMultiple.java:80
14:51:27.175 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.175 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.175 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.190 INFO FileInputFormat - Total input files to process : 1
14:51:27.197 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:27.197 INFO DAGScheduler - Registering RDD 377 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 19
14:51:27.197 INFO DAGScheduler - Got job 65 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
14:51:27.197 INFO DAGScheduler - Final stage: ResultStage 90 (runJob at SparkHadoopWriter.scala:83)
14:51:27.197 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 89)
14:51:27.198 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 89)
14:51:27.198 INFO DAGScheduler - Submitting ShuffleMapStage 89 (MapPartitionsRDD[377] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:27.215 INFO MemoryStore - Block broadcast_163 stored as values in memory (estimated size 427.7 KiB, free 1917.3 MiB)
14:51:27.217 INFO MemoryStore - Block broadcast_163_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1917.2 MiB)
14:51:27.217 INFO BlockManagerInfo - Added broadcast_163_piece0 in memory on localhost:44923 (size: 154.6 KiB, free: 1919.5 MiB)
14:51:27.217 INFO SparkContext - Created broadcast 163 from broadcast at DAGScheduler.scala:1580
14:51:27.217 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 89 (MapPartitionsRDD[377] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
14:51:27.217 INFO TaskSchedulerImpl - Adding task set 89.0 with 1 tasks resource profile 0
14:51:27.218 INFO TaskSetManager - Starting task 0.0 in stage 89.0 (TID 133) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:27.219 INFO Executor - Running task 0.0 in stage 89.0 (TID 133)
14:51:27.253 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:27.272 INFO Executor - Finished task 0.0 in stage 89.0 (TID 133). 1149 bytes result sent to driver
14:51:27.272 INFO TaskSetManager - Finished task 0.0 in stage 89.0 (TID 133) in 54 ms on localhost (executor driver) (1/1)
14:51:27.272 INFO TaskSchedulerImpl - Removed TaskSet 89.0, whose tasks have all completed, from pool
14:51:27.273 INFO DAGScheduler - ShuffleMapStage 89 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.075 s
14:51:27.273 INFO DAGScheduler - looking for newly runnable stages
14:51:27.273 INFO DAGScheduler - running: HashSet()
14:51:27.273 INFO DAGScheduler - waiting: HashSet(ResultStage 90)
14:51:27.273 INFO DAGScheduler - failed: HashSet()
14:51:27.273 INFO DAGScheduler - Submitting ResultStage 90 (MapPartitionsRDD[389] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
14:51:27.284 INFO MemoryStore - Block broadcast_164 stored as values in memory (estimated size 150.2 KiB, free 1917.0 MiB)
14:51:27.285 INFO MemoryStore - Block broadcast_164_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1917.0 MiB)
14:51:27.285 INFO BlockManagerInfo - Added broadcast_164_piece0 in memory on localhost:44923 (size: 56.2 KiB, free: 1919.4 MiB)
14:51:27.286 INFO SparkContext - Created broadcast 164 from broadcast at DAGScheduler.scala:1580
14:51:27.286 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 90 (MapPartitionsRDD[389] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
14:51:27.286 INFO TaskSchedulerImpl - Adding task set 90.0 with 2 tasks resource profile 0
14:51:27.287 INFO TaskSetManager - Starting task 0.0 in stage 90.0 (TID 134) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:27.287 INFO TaskSetManager - Starting task 1.0 in stage 90.0 (TID 135) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:27.287 INFO Executor - Running task 1.0 in stage 90.0 (TID 135)
14:51:27.287 INFO Executor - Running task 0.0 in stage 90.0 (TID 134)
14:51:27.292 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.292 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.292 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.293 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.293 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.293 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.294 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.294 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.294 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.294 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.294 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.294 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.307 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:27.307 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:27.310 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:27.310 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:27.316 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451274334139007921503450_0389_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest15625970223731825708.bam/_temporary/0/task_202603041451274334139007921503450_0389_r_000001
14:51:27.316 INFO SparkHadoopMapRedUtil - attempt_202603041451274334139007921503450_0389_r_000001_0: Committed. Elapsed time: 0 ms.
14:51:27.317 INFO Executor - Finished task 1.0 in stage 90.0 (TID 135). 1729 bytes result sent to driver
14:51:27.318 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451274334139007921503450_0389_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest15625970223731825708.bam/_temporary/0/task_202603041451274334139007921503450_0389_r_000000
14:51:27.318 INFO SparkHadoopMapRedUtil - attempt_202603041451274334139007921503450_0389_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:27.318 INFO TaskSetManager - Finished task 1.0 in stage 90.0 (TID 135) in 31 ms on localhost (executor driver) (1/2)
14:51:27.318 INFO Executor - Finished task 0.0 in stage 90.0 (TID 134). 1729 bytes result sent to driver
14:51:27.319 INFO TaskSetManager - Finished task 0.0 in stage 90.0 (TID 134) in 32 ms on localhost (executor driver) (2/2)
14:51:27.319 INFO TaskSchedulerImpl - Removed TaskSet 90.0, whose tasks have all completed, from pool
14:51:27.319 INFO DAGScheduler - ResultStage 90 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
14:51:27.319 INFO DAGScheduler - Job 65 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:27.319 INFO TaskSchedulerImpl - Killing all running tasks in stage 90: Stage finished
14:51:27.319 INFO DAGScheduler - Job 65 finished: runJob at SparkHadoopWriter.scala:83, took 0.122384 s
14:51:27.320 INFO SparkHadoopWriter - Start to commit write Job job_202603041451274334139007921503450_0389.
14:51:27.326 INFO SparkHadoopWriter - Write Job job_202603041451274334139007921503450_0389 committed. Elapsed time: 6 ms.
14:51:27.328 INFO MemoryStore - Block broadcast_165 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
14:51:27.335 INFO MemoryStore - Block broadcast_165_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.6 MiB)
14:51:27.335 INFO BlockManagerInfo - Added broadcast_165_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:27.335 INFO SparkContext - Created broadcast 165 from newAPIHadoopFile at PathSplitSource.java:96
14:51:27.359 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:27.360 INFO DAGScheduler - Got job 66 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:27.360 INFO DAGScheduler - Final stage: ResultStage 92 (count at ReadsSparkSinkUnitTest.java:222)
14:51:27.360 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 91)
14:51:27.360 INFO DAGScheduler - Missing parents: List()
14:51:27.360 INFO DAGScheduler - Submitting ResultStage 92 (MapPartitionsRDD[380] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:27.361 INFO MemoryStore - Block broadcast_166 stored as values in memory (estimated size 6.3 KiB, free 1916.6 MiB)
14:51:27.362 INFO MemoryStore - Block broadcast_166_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.6 MiB)
14:51:27.362 INFO BlockManagerInfo - Added broadcast_166_piece0 in memory on localhost:44923 (size: 3.4 KiB, free: 1919.4 MiB)
14:51:27.362 INFO SparkContext - Created broadcast 166 from broadcast at DAGScheduler.scala:1580
14:51:27.362 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 92 (MapPartitionsRDD[380] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
14:51:27.362 INFO TaskSchedulerImpl - Adding task set 92.0 with 2 tasks resource profile 0
14:51:27.363 INFO TaskSetManager - Starting task 0.0 in stage 92.0 (TID 136) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:27.363 INFO TaskSetManager - Starting task 1.0 in stage 92.0 (TID 137) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:27.364 INFO Executor - Running task 1.0 in stage 92.0 (TID 137)
14:51:27.364 INFO Executor - Running task 0.0 in stage 92.0 (TID 136)
14:51:27.366 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:27.366 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:27.366 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:27.366 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:27.369 INFO Executor - Finished task 1.0 in stage 92.0 (TID 137). 1591 bytes result sent to driver
14:51:27.370 INFO TaskSetManager - Finished task 1.0 in stage 92.0 (TID 137) in 7 ms on localhost (executor driver) (1/2)
14:51:27.370 INFO Executor - Finished task 0.0 in stage 92.0 (TID 136). 1591 bytes result sent to driver
14:51:27.371 INFO TaskSetManager - Finished task 0.0 in stage 92.0 (TID 136) in 8 ms on localhost (executor driver) (2/2)
14:51:27.371 INFO TaskSchedulerImpl - Removed TaskSet 92.0, whose tasks have all completed, from pool
14:51:27.371 INFO DAGScheduler - ResultStage 92 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
14:51:27.371 INFO DAGScheduler - Job 66 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:27.371 INFO TaskSchedulerImpl - Killing all running tasks in stage 92: Stage finished
14:51:27.371 INFO DAGScheduler - Job 66 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.011947 s
14:51:27.387 INFO FileInputFormat - Total input files to process : 2
14:51:27.390 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:27.391 INFO DAGScheduler - Got job 67 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:27.391 INFO DAGScheduler - Final stage: ResultStage 93 (count at ReadsSparkSinkUnitTest.java:222)
14:51:27.391 INFO DAGScheduler - Parents of final stage: List()
14:51:27.391 INFO DAGScheduler - Missing parents: List()
14:51:27.391 INFO DAGScheduler - Submitting ResultStage 93 (MapPartitionsRDD[396] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:27.416 INFO MemoryStore - Block broadcast_167 stored as values in memory (estimated size 426.1 KiB, free 1916.2 MiB)
14:51:27.418 INFO MemoryStore - Block broadcast_167_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.0 MiB)
14:51:27.418 INFO BlockManagerInfo - Added broadcast_167_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.2 MiB)
14:51:27.418 INFO SparkContext - Created broadcast 167 from broadcast at DAGScheduler.scala:1580
14:51:27.419 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 93 (MapPartitionsRDD[396] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
14:51:27.419 INFO TaskSchedulerImpl - Adding task set 93.0 with 2 tasks resource profile 0
14:51:27.419 INFO TaskSetManager - Starting task 0.0 in stage 93.0 (TID 138) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
14:51:27.419 INFO TaskSetManager - Starting task 1.0 in stage 93.0 (TID 139) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
14:51:27.420 INFO Executor - Running task 1.0 in stage 93.0 (TID 139)
14:51:27.420 INFO Executor - Running task 0.0 in stage 93.0 (TID 138)
14:51:27.469 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest15625970223731825708.bam/part-r-00001.bam:0+129330
14:51:27.469 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest15625970223731825708.bam/part-r-00000.bam:0+132492
14:51:27.479 INFO Executor - Finished task 0.0 in stage 93.0 (TID 138). 989 bytes result sent to driver
14:51:27.480 INFO TaskSetManager - Finished task 0.0 in stage 93.0 (TID 138) in 61 ms on localhost (executor driver) (1/2)
14:51:27.486 INFO Executor - Finished task 1.0 in stage 93.0 (TID 139). 989 bytes result sent to driver
14:51:27.487 INFO TaskSetManager - Finished task 1.0 in stage 93.0 (TID 139) in 68 ms on localhost (executor driver) (2/2)
14:51:27.487 INFO TaskSchedulerImpl - Removed TaskSet 93.0, whose tasks have all completed, from pool
14:51:27.487 INFO DAGScheduler - ResultStage 93 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.095 s
14:51:27.487 INFO DAGScheduler - Job 67 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:27.487 INFO TaskSchedulerImpl - Killing all running tasks in stage 93: Stage finished
14:51:27.487 INFO DAGScheduler - Job 67 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.096563 s
14:51:27.491 INFO MemoryStore - Block broadcast_168 stored as values in memory (estimated size 297.9 KiB, free 1915.8 MiB)
14:51:27.502 INFO MemoryStore - Block broadcast_168_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.7 MiB)
14:51:27.502 INFO BlockManagerInfo - Added broadcast_168_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.2 MiB)
14:51:27.503 INFO SparkContext - Created broadcast 168 from newAPIHadoopFile at PathSplitSource.java:96
14:51:27.529 INFO MemoryStore - Block broadcast_169 stored as values in memory (estimated size 297.9 KiB, free 1915.4 MiB)
14:51:27.541 INFO BlockManagerInfo - Removed broadcast_156_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.2 MiB)
14:51:27.541 INFO BlockManagerInfo - Removed broadcast_163_piece0 on localhost:44923 in memory (size: 154.6 KiB, free: 1919.4 MiB)
14:51:27.542 INFO BlockManagerInfo - Removed broadcast_157_piece0 on localhost:44923 in memory (size: 3.4 KiB, free: 1919.4 MiB)
14:51:27.543 INFO BlockManagerInfo - Removed broadcast_166_piece0 on localhost:44923 in memory (size: 3.4 KiB, free: 1919.4 MiB)
14:51:27.543 INFO BlockManagerInfo - Removed broadcast_167_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.5 MiB)
14:51:27.545 INFO BlockManagerInfo - Removed broadcast_162_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:27.546 INFO BlockManagerInfo - Removed broadcast_159_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:27.546 INFO BlockManagerInfo - Removed broadcast_158_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:27.547 INFO BlockManagerInfo - Removed broadcast_161_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.7 MiB)
14:51:27.548 INFO MemoryStore - Block broadcast_169_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
14:51:27.548 INFO BlockManagerInfo - Added broadcast_169_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:27.548 INFO SparkContext - Created broadcast 169 from newAPIHadoopFile at PathSplitSource.java:96
14:51:27.548 INFO BlockManagerInfo - Removed broadcast_150_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:27.549 INFO BlockManagerInfo - Removed broadcast_165_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:27.549 INFO BlockManagerInfo - Removed broadcast_160_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:27.552 INFO BlockManagerInfo - Removed broadcast_164_piece0 on localhost:44923 in memory (size: 56.2 KiB, free: 1919.9 MiB)
14:51:27.572 INFO MemoryStore - Block broadcast_170 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
14:51:27.573 INFO MemoryStore - Block broadcast_170_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
14:51:27.573 INFO BlockManagerInfo - Added broadcast_170_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.9 MiB)
14:51:27.574 INFO SparkContext - Created broadcast 170 from broadcast at ReadsSparkSink.java:133
14:51:27.575 INFO MemoryStore - Block broadcast_171 stored as values in memory (estimated size 163.2 KiB, free 1919.0 MiB)
14:51:27.576 INFO MemoryStore - Block broadcast_171_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.0 MiB)
14:51:27.576 INFO BlockManagerInfo - Added broadcast_171_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.9 MiB)
14:51:27.577 INFO SparkContext - Created broadcast 171 from broadcast at AnySamSinkMultiple.java:80
14:51:27.579 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.579 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.579 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.595 INFO FileInputFormat - Total input files to process : 1
14:51:27.601 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:27.602 INFO DAGScheduler - Registering RDD 404 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 20
14:51:27.602 INFO DAGScheduler - Got job 68 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
14:51:27.602 INFO DAGScheduler - Final stage: ResultStage 95 (runJob at SparkHadoopWriter.scala:83)
14:51:27.602 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 94)
14:51:27.602 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 94)
14:51:27.602 INFO DAGScheduler - Submitting ShuffleMapStage 94 (MapPartitionsRDD[404] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:27.626 INFO MemoryStore - Block broadcast_172 stored as values in memory (estimated size 427.7 KiB, free 1918.6 MiB)
14:51:27.628 INFO MemoryStore - Block broadcast_172_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.4 MiB)
14:51:27.628 INFO BlockManagerInfo - Added broadcast_172_piece0 in memory on localhost:44923 (size: 154.6 KiB, free: 1919.7 MiB)
14:51:27.628 INFO SparkContext - Created broadcast 172 from broadcast at DAGScheduler.scala:1580
14:51:27.628 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 94 (MapPartitionsRDD[404] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
14:51:27.628 INFO TaskSchedulerImpl - Adding task set 94.0 with 1 tasks resource profile 0
14:51:27.629 INFO TaskSetManager - Starting task 0.0 in stage 94.0 (TID 140) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:27.630 INFO Executor - Running task 0.0 in stage 94.0 (TID 140)
14:51:27.663 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:27.686 INFO Executor - Finished task 0.0 in stage 94.0 (TID 140). 1149 bytes result sent to driver
14:51:27.687 INFO TaskSetManager - Finished task 0.0 in stage 94.0 (TID 140) in 58 ms on localhost (executor driver) (1/1)
14:51:27.687 INFO TaskSchedulerImpl - Removed TaskSet 94.0, whose tasks have all completed, from pool
14:51:27.687 INFO DAGScheduler - ShuffleMapStage 94 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.084 s
14:51:27.687 INFO DAGScheduler - looking for newly runnable stages
14:51:27.687 INFO DAGScheduler - running: HashSet()
14:51:27.687 INFO DAGScheduler - waiting: HashSet(ResultStage 95)
14:51:27.687 INFO DAGScheduler - failed: HashSet()
14:51:27.688 INFO DAGScheduler - Submitting ResultStage 95 (MapPartitionsRDD[416] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
14:51:27.699 INFO MemoryStore - Block broadcast_173 stored as values in memory (estimated size 150.2 KiB, free 1918.3 MiB)
14:51:27.700 INFO MemoryStore - Block broadcast_173_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1918.2 MiB)
14:51:27.700 INFO BlockManagerInfo - Added broadcast_173_piece0 in memory on localhost:44923 (size: 56.3 KiB, free: 1919.7 MiB)
14:51:27.700 INFO SparkContext - Created broadcast 173 from broadcast at DAGScheduler.scala:1580
14:51:27.700 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 95 (MapPartitionsRDD[416] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
14:51:27.700 INFO TaskSchedulerImpl - Adding task set 95.0 with 2 tasks resource profile 0
14:51:27.701 INFO TaskSetManager - Starting task 0.0 in stage 95.0 (TID 141) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:27.701 INFO TaskSetManager - Starting task 1.0 in stage 95.0 (TID 142) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:27.702 INFO Executor - Running task 0.0 in stage 95.0 (TID 141)
14:51:27.702 INFO Executor - Running task 1.0 in stage 95.0 (TID 142)
14:51:27.709 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.709 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.709 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.709 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.709 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.709 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.709 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.709 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.709 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.709 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.709 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.709 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.720 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:27.721 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 1 ms
14:51:27.726 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:27.726 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:27.729 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451276331393999381621668_0416_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest114169182498407178184.bam/_temporary/0/task_202603041451276331393999381621668_0416_r_000001
14:51:27.730 INFO SparkHadoopMapRedUtil - attempt_202603041451276331393999381621668_0416_r_000001_0: Committed. Elapsed time: 0 ms.
14:51:27.730 INFO Executor - Finished task 1.0 in stage 95.0 (TID 142). 1729 bytes result sent to driver
14:51:27.731 INFO TaskSetManager - Finished task 1.0 in stage 95.0 (TID 142) in 30 ms on localhost (executor driver) (1/2)
14:51:27.733 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451276331393999381621668_0416_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest114169182498407178184.bam/_temporary/0/task_202603041451276331393999381621668_0416_r_000000
14:51:27.733 INFO SparkHadoopMapRedUtil - attempt_202603041451276331393999381621668_0416_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:27.733 INFO Executor - Finished task 0.0 in stage 95.0 (TID 141). 1729 bytes result sent to driver
14:51:27.734 INFO TaskSetManager - Finished task 0.0 in stage 95.0 (TID 141) in 33 ms on localhost (executor driver) (2/2)
14:51:27.734 INFO TaskSchedulerImpl - Removed TaskSet 95.0, whose tasks have all completed, from pool
14:51:27.734 INFO DAGScheduler - ResultStage 95 (runJob at SparkHadoopWriter.scala:83) finished in 0.046 s
14:51:27.734 INFO DAGScheduler - Job 68 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:27.734 INFO TaskSchedulerImpl - Killing all running tasks in stage 95: Stage finished
14:51:27.734 INFO DAGScheduler - Job 68 finished: runJob at SparkHadoopWriter.scala:83, took 0.132933 s
14:51:27.735 INFO SparkHadoopWriter - Start to commit write Job job_202603041451276331393999381621668_0416.
14:51:27.741 INFO SparkHadoopWriter - Write Job job_202603041451276331393999381621668_0416 committed. Elapsed time: 6 ms.
14:51:27.744 INFO MemoryStore - Block broadcast_174 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
14:51:27.751 INFO MemoryStore - Block broadcast_174_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
14:51:27.751 INFO BlockManagerInfo - Added broadcast_174_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:27.752 INFO SparkContext - Created broadcast 174 from newAPIHadoopFile at PathSplitSource.java:96
14:51:27.776 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:27.777 INFO DAGScheduler - Got job 69 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:27.777 INFO DAGScheduler - Final stage: ResultStage 97 (count at ReadsSparkSinkUnitTest.java:222)
14:51:27.777 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 96)
14:51:27.777 INFO DAGScheduler - Missing parents: List()
14:51:27.777 INFO DAGScheduler - Submitting ResultStage 97 (MapPartitionsRDD[407] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:27.778 INFO MemoryStore - Block broadcast_175 stored as values in memory (estimated size 6.3 KiB, free 1917.9 MiB)
14:51:27.779 INFO MemoryStore - Block broadcast_175_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1917.9 MiB)
14:51:27.779 INFO BlockManagerInfo - Added broadcast_175_piece0 in memory on localhost:44923 (size: 3.4 KiB, free: 1919.6 MiB)
14:51:27.779 INFO SparkContext - Created broadcast 175 from broadcast at DAGScheduler.scala:1580
14:51:27.779 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 97 (MapPartitionsRDD[407] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
14:51:27.779 INFO TaskSchedulerImpl - Adding task set 97.0 with 2 tasks resource profile 0
14:51:27.780 INFO TaskSetManager - Starting task 0.0 in stage 97.0 (TID 143) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:27.780 INFO TaskSetManager - Starting task 1.0 in stage 97.0 (TID 144) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:27.781 INFO Executor - Running task 0.0 in stage 97.0 (TID 143)
14:51:27.781 INFO Executor - Running task 1.0 in stage 97.0 (TID 144)
14:51:27.783 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:27.783 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:27.783 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:27.783 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:27.788 INFO Executor - Finished task 1.0 in stage 97.0 (TID 144). 1634 bytes result sent to driver
14:51:27.788 INFO Executor - Finished task 0.0 in stage 97.0 (TID 143). 1634 bytes result sent to driver
14:51:27.788 INFO TaskSetManager - Finished task 0.0 in stage 97.0 (TID 143) in 8 ms on localhost (executor driver) (1/2)
14:51:27.789 INFO TaskSetManager - Finished task 1.0 in stage 97.0 (TID 144) in 9 ms on localhost (executor driver) (2/2)
14:51:27.789 INFO TaskSchedulerImpl - Removed TaskSet 97.0, whose tasks have all completed, from pool
14:51:27.789 INFO DAGScheduler - ResultStage 97 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.011 s
14:51:27.789 INFO DAGScheduler - Job 69 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:27.789 INFO TaskSchedulerImpl - Killing all running tasks in stage 97: Stage finished
14:51:27.789 INFO DAGScheduler - Job 69 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.012810 s
14:51:27.804 INFO FileInputFormat - Total input files to process : 2
14:51:27.808 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:27.808 INFO DAGScheduler - Got job 70 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:27.808 INFO DAGScheduler - Final stage: ResultStage 98 (count at ReadsSparkSinkUnitTest.java:222)
14:51:27.808 INFO DAGScheduler - Parents of final stage: List()
14:51:27.808 INFO DAGScheduler - Missing parents: List()
14:51:27.809 INFO DAGScheduler - Submitting ResultStage 98 (MapPartitionsRDD[423] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:27.826 INFO MemoryStore - Block broadcast_176 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
14:51:27.827 INFO MemoryStore - Block broadcast_176_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.3 MiB)
14:51:27.827 INFO BlockManagerInfo - Added broadcast_176_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:27.828 INFO SparkContext - Created broadcast 176 from broadcast at DAGScheduler.scala:1580
14:51:27.828 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 98 (MapPartitionsRDD[423] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
14:51:27.828 INFO TaskSchedulerImpl - Adding task set 98.0 with 2 tasks resource profile 0
14:51:27.829 INFO TaskSetManager - Starting task 0.0 in stage 98.0 (TID 145) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
14:51:27.829 INFO TaskSetManager - Starting task 1.0 in stage 98.0 (TID 146) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
14:51:27.829 INFO Executor - Running task 1.0 in stage 98.0 (TID 146)
14:51:27.829 INFO Executor - Running task 0.0 in stage 98.0 (TID 145)
14:51:27.879 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest114169182498407178184.bam/part-r-00000.bam:0+132492
14:51:27.879 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest114169182498407178184.bam/part-r-00001.bam:0+129330
14:51:27.894 INFO Executor - Finished task 1.0 in stage 98.0 (TID 146). 989 bytes result sent to driver
14:51:27.894 INFO Executor - Finished task 0.0 in stage 98.0 (TID 145). 989 bytes result sent to driver
14:51:27.895 INFO TaskSetManager - Finished task 1.0 in stage 98.0 (TID 146) in 66 ms on localhost (executor driver) (1/2)
14:51:27.895 INFO TaskSetManager - Finished task 0.0 in stage 98.0 (TID 145) in 66 ms on localhost (executor driver) (2/2)
14:51:27.895 INFO TaskSchedulerImpl - Removed TaskSet 98.0, whose tasks have all completed, from pool
14:51:27.895 INFO DAGScheduler - ResultStage 98 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.086 s
14:51:27.896 INFO DAGScheduler - Job 70 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:27.896 INFO TaskSchedulerImpl - Killing all running tasks in stage 98: Stage finished
14:51:27.896 INFO DAGScheduler - Job 70 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.087876 s
14:51:27.901 INFO MemoryStore - Block broadcast_177 stored as values in memory (estimated size 298.0 KiB, free 1917.0 MiB)
14:51:27.911 INFO MemoryStore - Block broadcast_177_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.0 MiB)
14:51:27.912 INFO BlockManagerInfo - Added broadcast_177_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.4 MiB)
14:51:27.912 INFO SparkContext - Created broadcast 177 from newAPIHadoopFile at PathSplitSource.java:96
14:51:27.942 INFO MemoryStore - Block broadcast_178 stored as values in memory (estimated size 298.0 KiB, free 1916.7 MiB)
14:51:27.949 INFO MemoryStore - Block broadcast_178_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.6 MiB)
14:51:27.949 INFO BlockManagerInfo - Added broadcast_178_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.4 MiB)
14:51:27.949 INFO SparkContext - Created broadcast 178 from newAPIHadoopFile at PathSplitSource.java:96
14:51:27.970 INFO MemoryStore - Block broadcast_179 stored as values in memory (estimated size 160.7 KiB, free 1916.5 MiB)
14:51:27.971 INFO MemoryStore - Block broadcast_179_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.5 MiB)
14:51:27.971 INFO BlockManagerInfo - Added broadcast_179_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:27.972 INFO SparkContext - Created broadcast 179 from broadcast at ReadsSparkSink.java:133
14:51:27.973 INFO MemoryStore - Block broadcast_180 stored as values in memory (estimated size 163.2 KiB, free 1916.3 MiB)
14:51:27.974 INFO MemoryStore - Block broadcast_180_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.3 MiB)
14:51:27.974 INFO BlockManagerInfo - Added broadcast_180_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:27.974 INFO SparkContext - Created broadcast 180 from broadcast at AnySamSinkMultiple.java:80
14:51:27.976 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:27.976 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:27.976 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:27.991 INFO FileInputFormat - Total input files to process : 1
14:51:28.002 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:28.002 INFO DAGScheduler - Registering RDD 431 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 21
14:51:28.002 INFO DAGScheduler - Got job 71 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
14:51:28.002 INFO DAGScheduler - Final stage: ResultStage 100 (runJob at SparkHadoopWriter.scala:83)
14:51:28.002 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 99)
14:51:28.002 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 99)
14:51:28.003 INFO DAGScheduler - Submitting ShuffleMapStage 99 (MapPartitionsRDD[431] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:28.021 INFO MemoryStore - Block broadcast_181 stored as values in memory (estimated size 427.7 KiB, free 1915.9 MiB)
14:51:28.023 INFO MemoryStore - Block broadcast_181_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1915.7 MiB)
14:51:28.023 INFO BlockManagerInfo - Added broadcast_181_piece0 in memory on localhost:44923 (size: 154.6 KiB, free: 1919.2 MiB)
14:51:28.023 INFO SparkContext - Created broadcast 181 from broadcast at DAGScheduler.scala:1580
14:51:28.023 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 99 (MapPartitionsRDD[431] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
14:51:28.023 INFO TaskSchedulerImpl - Adding task set 99.0 with 1 tasks resource profile 0
14:51:28.024 INFO TaskSetManager - Starting task 0.0 in stage 99.0 (TID 147) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
14:51:28.024 INFO Executor - Running task 0.0 in stage 99.0 (TID 147)
14:51:28.058 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
14:51:28.080 INFO Executor - Finished task 0.0 in stage 99.0 (TID 147). 1149 bytes result sent to driver
14:51:28.081 INFO TaskSetManager - Finished task 0.0 in stage 99.0 (TID 147) in 57 ms on localhost (executor driver) (1/1)
14:51:28.081 INFO TaskSchedulerImpl - Removed TaskSet 99.0, whose tasks have all completed, from pool
14:51:28.081 INFO DAGScheduler - ShuffleMapStage 99 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.078 s
14:51:28.081 INFO DAGScheduler - looking for newly runnable stages
14:51:28.081 INFO DAGScheduler - running: HashSet()
14:51:28.081 INFO DAGScheduler - waiting: HashSet(ResultStage 100)
14:51:28.081 INFO DAGScheduler - failed: HashSet()
14:51:28.082 INFO DAGScheduler - Submitting ResultStage 100 (MapPartitionsRDD[443] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
14:51:28.088 INFO MemoryStore - Block broadcast_182 stored as values in memory (estimated size 150.2 KiB, free 1915.6 MiB)
14:51:28.089 INFO MemoryStore - Block broadcast_182_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1915.5 MiB)
14:51:28.090 INFO BlockManagerInfo - Added broadcast_182_piece0 in memory on localhost:44923 (size: 56.3 KiB, free: 1919.2 MiB)
14:51:28.090 INFO SparkContext - Created broadcast 182 from broadcast at DAGScheduler.scala:1580
14:51:28.090 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 100 (MapPartitionsRDD[443] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
14:51:28.090 INFO TaskSchedulerImpl - Adding task set 100.0 with 2 tasks resource profile 0
14:51:28.091 INFO TaskSetManager - Starting task 0.0 in stage 100.0 (TID 148) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:28.091 INFO TaskSetManager - Starting task 1.0 in stage 100.0 (TID 149) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:28.091 INFO Executor - Running task 0.0 in stage 100.0 (TID 148)
14:51:28.091 INFO Executor - Running task 1.0 in stage 100.0 (TID 149)
14:51:28.096 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.096 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.096 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.096 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.096 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.096 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.096 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.096 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.096 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.096 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.096 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.096 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.106 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.107 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.112 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.112 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.114 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451277047435128079897629_0443_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest210765799486424882545.bam/_temporary/0/task_202603041451277047435128079897629_0443_r_000000
14:51:28.114 INFO SparkHadoopMapRedUtil - attempt_202603041451277047435128079897629_0443_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:28.123 INFO Executor - Finished task 0.0 in stage 100.0 (TID 148). 1815 bytes result sent to driver
14:51:28.124 INFO TaskSetManager - Finished task 0.0 in stage 100.0 (TID 148) in 33 ms on localhost (executor driver) (1/2)
14:51:28.124 INFO BlockManagerInfo - Removed broadcast_171_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:28.125 INFO BlockManagerInfo - Removed broadcast_168_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.2 MiB)
14:51:28.126 INFO BlockManagerInfo - Removed broadcast_175_piece0 on localhost:44923 in memory (size: 3.4 KiB, free: 1919.2 MiB)
14:51:28.126 INFO BlockManagerInfo - Removed broadcast_174_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:28.127 INFO BlockManagerInfo - Removed broadcast_173_piece0 on localhost:44923 in memory (size: 56.3 KiB, free: 1919.3 MiB)
14:51:28.128 INFO BlockManagerInfo - Removed broadcast_172_piece0 on localhost:44923 in memory (size: 154.6 KiB, free: 1919.5 MiB)
14:51:28.128 INFO BlockManagerInfo - Removed broadcast_178_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.5 MiB)
14:51:28.129 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451277047435128079897629_0443_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest210765799486424882545.bam/_temporary/0/task_202603041451277047435128079897629_0443_r_000001
14:51:28.129 INFO SparkHadoopMapRedUtil - attempt_202603041451277047435128079897629_0443_r_000001_0: Committed. Elapsed time: 0 ms.
14:51:28.129 INFO Executor - Finished task 1.0 in stage 100.0 (TID 149). 1772 bytes result sent to driver
14:51:28.129 INFO BlockManagerInfo - Removed broadcast_169_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:28.130 INFO TaskSetManager - Finished task 1.0 in stage 100.0 (TID 149) in 39 ms on localhost (executor driver) (2/2)
14:51:28.130 INFO TaskSchedulerImpl - Removed TaskSet 100.0, whose tasks have all completed, from pool
14:51:28.130 INFO BlockManagerInfo - Removed broadcast_170_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.6 MiB)
14:51:28.131 INFO DAGScheduler - ResultStage 100 (runJob at SparkHadoopWriter.scala:83) finished in 0.049 s
14:51:28.131 INFO DAGScheduler - Job 71 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:28.131 INFO TaskSchedulerImpl - Killing all running tasks in stage 100: Stage finished
14:51:28.131 INFO DAGScheduler - Job 71 finished: runJob at SparkHadoopWriter.scala:83, took 0.129275 s
14:51:28.131 INFO BlockManagerInfo - Removed broadcast_176_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:28.131 INFO SparkHadoopWriter - Start to commit write Job job_202603041451277047435128079897629_0443.
14:51:28.133 INFO BlockManagerInfo - Removed broadcast_181_piece0 on localhost:44923 in memory (size: 154.6 KiB, free: 1919.9 MiB)
14:51:28.139 INFO SparkHadoopWriter - Write Job job_202603041451277047435128079897629_0443 committed. Elapsed time: 7 ms.
14:51:28.142 INFO MemoryStore - Block broadcast_183 stored as values in memory (estimated size 297.9 KiB, free 1918.8 MiB)
14:51:28.149 INFO MemoryStore - Block broadcast_183_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.8 MiB)
14:51:28.149 INFO BlockManagerInfo - Added broadcast_183_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.8 MiB)
14:51:28.149 INFO SparkContext - Created broadcast 183 from newAPIHadoopFile at PathSplitSource.java:96
14:51:28.174 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:28.174 INFO DAGScheduler - Got job 72 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:28.174 INFO DAGScheduler - Final stage: ResultStage 102 (count at ReadsSparkSinkUnitTest.java:222)
14:51:28.174 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 101)
14:51:28.174 INFO DAGScheduler - Missing parents: List()
14:51:28.175 INFO DAGScheduler - Submitting ResultStage 102 (MapPartitionsRDD[434] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:28.176 INFO MemoryStore - Block broadcast_184 stored as values in memory (estimated size 6.3 KiB, free 1918.8 MiB)
14:51:28.176 INFO MemoryStore - Block broadcast_184_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.8 MiB)
14:51:28.176 INFO BlockManagerInfo - Added broadcast_184_piece0 in memory on localhost:44923 (size: 3.4 KiB, free: 1919.8 MiB)
14:51:28.177 INFO SparkContext - Created broadcast 184 from broadcast at DAGScheduler.scala:1580
14:51:28.177 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 102 (MapPartitionsRDD[434] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
14:51:28.177 INFO TaskSchedulerImpl - Adding task set 102.0 with 2 tasks resource profile 0
14:51:28.178 INFO TaskSetManager - Starting task 0.0 in stage 102.0 (TID 150) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:28.178 INFO TaskSetManager - Starting task 1.0 in stage 102.0 (TID 151) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:28.178 INFO Executor - Running task 0.0 in stage 102.0 (TID 150)
14:51:28.178 INFO Executor - Running task 1.0 in stage 102.0 (TID 151)
14:51:28.180 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.180 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.180 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.180 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.184 INFO Executor - Finished task 1.0 in stage 102.0 (TID 151). 1634 bytes result sent to driver
14:51:28.184 INFO TaskSetManager - Finished task 1.0 in stage 102.0 (TID 151) in 6 ms on localhost (executor driver) (1/2)
14:51:28.185 INFO Executor - Finished task 0.0 in stage 102.0 (TID 150). 1634 bytes result sent to driver
14:51:28.185 INFO TaskSetManager - Finished task 0.0 in stage 102.0 (TID 150) in 7 ms on localhost (executor driver) (2/2)
14:51:28.185 INFO TaskSchedulerImpl - Removed TaskSet 102.0, whose tasks have all completed, from pool
14:51:28.185 INFO DAGScheduler - ResultStage 102 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
14:51:28.185 INFO DAGScheduler - Job 72 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:28.185 INFO TaskSchedulerImpl - Killing all running tasks in stage 102: Stage finished
14:51:28.186 INFO DAGScheduler - Job 72 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.012078 s
14:51:28.200 INFO FileInputFormat - Total input files to process : 2
14:51:28.204 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:28.204 INFO DAGScheduler - Got job 73 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:28.204 INFO DAGScheduler - Final stage: ResultStage 103 (count at ReadsSparkSinkUnitTest.java:222)
14:51:28.205 INFO DAGScheduler - Parents of final stage: List()
14:51:28.205 INFO DAGScheduler - Missing parents: List()
14:51:28.205 INFO DAGScheduler - Submitting ResultStage 103 (MapPartitionsRDD[450] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:28.226 INFO MemoryStore - Block broadcast_185 stored as values in memory (estimated size 426.1 KiB, free 1918.4 MiB)
14:51:28.227 INFO MemoryStore - Block broadcast_185_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
14:51:28.228 INFO BlockManagerInfo - Added broadcast_185_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.7 MiB)
14:51:28.228 INFO SparkContext - Created broadcast 185 from broadcast at DAGScheduler.scala:1580
14:51:28.228 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 103 (MapPartitionsRDD[450] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
14:51:28.228 INFO TaskSchedulerImpl - Adding task set 103.0 with 2 tasks resource profile 0
14:51:28.229 INFO TaskSetManager - Starting task 0.0 in stage 103.0 (TID 152) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
14:51:28.229 INFO TaskSetManager - Starting task 1.0 in stage 103.0 (TID 153) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
14:51:28.229 INFO Executor - Running task 1.0 in stage 103.0 (TID 153)
14:51:28.229 INFO Executor - Running task 0.0 in stage 103.0 (TID 152)
14:51:28.267 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest210765799486424882545.bam/part-r-00000.bam:0+129755
14:51:28.267 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest210765799486424882545.bam/part-r-00001.bam:0+129440
14:51:28.277 INFO Executor - Finished task 1.0 in stage 103.0 (TID 153). 989 bytes result sent to driver
14:51:28.278 INFO Executor - Finished task 0.0 in stage 103.0 (TID 152). 989 bytes result sent to driver
14:51:28.278 INFO TaskSetManager - Finished task 0.0 in stage 103.0 (TID 152) in 49 ms on localhost (executor driver) (1/2)
14:51:28.278 INFO TaskSetManager - Finished task 1.0 in stage 103.0 (TID 153) in 49 ms on localhost (executor driver) (2/2)
14:51:28.278 INFO TaskSchedulerImpl - Removed TaskSet 103.0, whose tasks have all completed, from pool
14:51:28.278 INFO DAGScheduler - ResultStage 103 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.073 s
14:51:28.279 INFO DAGScheduler - Job 73 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:28.279 INFO TaskSchedulerImpl - Killing all running tasks in stage 103: Stage finished
14:51:28.279 INFO DAGScheduler - Job 73 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.074747 s
14:51:28.283 INFO MemoryStore - Block broadcast_186 stored as values in memory (estimated size 298.0 KiB, free 1917.9 MiB)
14:51:28.290 INFO MemoryStore - Block broadcast_186_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
14:51:28.290 INFO BlockManagerInfo - Added broadcast_186_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:28.290 INFO SparkContext - Created broadcast 186 from newAPIHadoopFile at PathSplitSource.java:96
14:51:28.315 INFO MemoryStore - Block broadcast_187 stored as values in memory (estimated size 298.0 KiB, free 1917.6 MiB)
14:51:28.322 INFO MemoryStore - Block broadcast_187_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.5 MiB)
14:51:28.322 INFO BlockManagerInfo - Added broadcast_187_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:28.322 INFO SparkContext - Created broadcast 187 from newAPIHadoopFile at PathSplitSource.java:96
14:51:28.343 INFO MemoryStore - Block broadcast_188 stored as values in memory (estimated size 19.6 KiB, free 1917.5 MiB)
14:51:28.344 INFO MemoryStore - Block broadcast_188_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.5 MiB)
14:51:28.344 INFO BlockManagerInfo - Added broadcast_188_piece0 in memory on localhost:44923 (size: 1890.0 B, free: 1919.6 MiB)
14:51:28.344 INFO SparkContext - Created broadcast 188 from broadcast at ReadsSparkSink.java:133
14:51:28.345 INFO MemoryStore - Block broadcast_189 stored as values in memory (estimated size 20.0 KiB, free 1917.5 MiB)
14:51:28.346 INFO MemoryStore - Block broadcast_189_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.5 MiB)
14:51:28.346 INFO BlockManagerInfo - Added broadcast_189_piece0 in memory on localhost:44923 (size: 1890.0 B, free: 1919.6 MiB)
14:51:28.346 INFO SparkContext - Created broadcast 189 from broadcast at AnySamSinkMultiple.java:80
14:51:28.348 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.348 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.348 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.363 INFO FileInputFormat - Total input files to process : 1
14:51:28.370 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:28.371 INFO DAGScheduler - Registering RDD 458 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 22
14:51:28.371 INFO DAGScheduler - Got job 74 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
14:51:28.371 INFO DAGScheduler - Final stage: ResultStage 105 (runJob at SparkHadoopWriter.scala:83)
14:51:28.371 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 104)
14:51:28.371 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 104)
14:51:28.371 INFO DAGScheduler - Submitting ShuffleMapStage 104 (MapPartitionsRDD[458] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:28.396 INFO MemoryStore - Block broadcast_190 stored as values in memory (estimated size 427.7 KiB, free 1917.1 MiB)
14:51:28.398 INFO MemoryStore - Block broadcast_190_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1916.9 MiB)
14:51:28.398 INFO BlockManagerInfo - Added broadcast_190_piece0 in memory on localhost:44923 (size: 154.6 KiB, free: 1919.4 MiB)
14:51:28.398 INFO SparkContext - Created broadcast 190 from broadcast at DAGScheduler.scala:1580
14:51:28.399 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 104 (MapPartitionsRDD[458] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
14:51:28.399 INFO TaskSchedulerImpl - Adding task set 104.0 with 1 tasks resource profile 0
14:51:28.399 INFO TaskSetManager - Starting task 0.0 in stage 104.0 (TID 154) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
14:51:28.400 INFO Executor - Running task 0.0 in stage 104.0 (TID 154)
14:51:28.433 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
14:51:28.448 INFO Executor - Finished task 0.0 in stage 104.0 (TID 154). 1149 bytes result sent to driver
14:51:28.449 INFO TaskSetManager - Finished task 0.0 in stage 104.0 (TID 154) in 50 ms on localhost (executor driver) (1/1)
14:51:28.449 INFO TaskSchedulerImpl - Removed TaskSet 104.0, whose tasks have all completed, from pool
14:51:28.449 INFO DAGScheduler - ShuffleMapStage 104 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.077 s
14:51:28.449 INFO DAGScheduler - looking for newly runnable stages
14:51:28.449 INFO DAGScheduler - running: HashSet()
14:51:28.449 INFO DAGScheduler - waiting: HashSet(ResultStage 105)
14:51:28.449 INFO DAGScheduler - failed: HashSet()
14:51:28.450 INFO DAGScheduler - Submitting ResultStage 105 (MapPartitionsRDD[470] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
14:51:28.456 INFO MemoryStore - Block broadcast_191 stored as values in memory (estimated size 150.2 KiB, free 1916.8 MiB)
14:51:28.457 INFO MemoryStore - Block broadcast_191_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1916.7 MiB)
14:51:28.457 INFO BlockManagerInfo - Added broadcast_191_piece0 in memory on localhost:44923 (size: 56.2 KiB, free: 1919.4 MiB)
14:51:28.457 INFO SparkContext - Created broadcast 191 from broadcast at DAGScheduler.scala:1580
14:51:28.458 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 105 (MapPartitionsRDD[470] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
14:51:28.458 INFO TaskSchedulerImpl - Adding task set 105.0 with 2 tasks resource profile 0
14:51:28.458 INFO TaskSetManager - Starting task 0.0 in stage 105.0 (TID 155) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:28.459 INFO TaskSetManager - Starting task 1.0 in stage 105.0 (TID 156) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:28.459 INFO Executor - Running task 0.0 in stage 105.0 (TID 155)
14:51:28.459 INFO Executor - Running task 1.0 in stage 105.0 (TID 156)
14:51:28.465 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.465 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.465 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.465 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.465 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.465 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.465 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.465 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.465 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.465 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.465 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.465 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.477 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.477 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.482 INFO ShuffleBlockFetcherIterator - Getting 1 (160.4 KiB) non-empty blocks including 1 (160.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.482 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.485 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451289051407169512589161_0470_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest37327884812202069286.bam/_temporary/0/task_202603041451289051407169512589161_0470_r_000001
14:51:28.485 INFO SparkHadoopMapRedUtil - attempt_202603041451289051407169512589161_0470_r_000001_0: Committed. Elapsed time: 0 ms.
14:51:28.486 INFO Executor - Finished task 1.0 in stage 105.0 (TID 156). 1729 bytes result sent to driver
14:51:28.486 INFO TaskSetManager - Finished task 1.0 in stage 105.0 (TID 156) in 28 ms on localhost (executor driver) (1/2)
14:51:28.488 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451289051407169512589161_0470_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest37327884812202069286.bam/_temporary/0/task_202603041451289051407169512589161_0470_r_000000
14:51:28.488 INFO SparkHadoopMapRedUtil - attempt_202603041451289051407169512589161_0470_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:28.489 INFO Executor - Finished task 0.0 in stage 105.0 (TID 155). 1729 bytes result sent to driver
14:51:28.489 INFO TaskSetManager - Finished task 0.0 in stage 105.0 (TID 155) in 31 ms on localhost (executor driver) (2/2)
14:51:28.489 INFO TaskSchedulerImpl - Removed TaskSet 105.0, whose tasks have all completed, from pool
14:51:28.489 INFO DAGScheduler - ResultStage 105 (runJob at SparkHadoopWriter.scala:83) finished in 0.039 s
14:51:28.490 INFO DAGScheduler - Job 74 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:28.490 INFO TaskSchedulerImpl - Killing all running tasks in stage 105: Stage finished
14:51:28.490 INFO DAGScheduler - Job 74 finished: runJob at SparkHadoopWriter.scala:83, took 0.119798 s
14:51:28.490 INFO SparkHadoopWriter - Start to commit write Job job_202603041451289051407169512589161_0470.
14:51:28.496 INFO SparkHadoopWriter - Write Job job_202603041451289051407169512589161_0470 committed. Elapsed time: 5 ms.
14:51:28.498 INFO MemoryStore - Block broadcast_192 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
14:51:28.505 INFO MemoryStore - Block broadcast_192_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:28.505 INFO BlockManagerInfo - Added broadcast_192_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:28.505 INFO SparkContext - Created broadcast 192 from newAPIHadoopFile at PathSplitSource.java:96
14:51:28.529 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:28.530 INFO DAGScheduler - Got job 75 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:28.530 INFO DAGScheduler - Final stage: ResultStage 107 (count at ReadsSparkSinkUnitTest.java:222)
14:51:28.530 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 106)
14:51:28.530 INFO DAGScheduler - Missing parents: List()
14:51:28.530 INFO DAGScheduler - Submitting ResultStage 107 (MapPartitionsRDD[461] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:28.531 INFO MemoryStore - Block broadcast_193 stored as values in memory (estimated size 6.3 KiB, free 1916.4 MiB)
14:51:28.532 INFO MemoryStore - Block broadcast_193_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1916.4 MiB)
14:51:28.532 INFO BlockManagerInfo - Added broadcast_193_piece0 in memory on localhost:44923 (size: 3.4 KiB, free: 1919.3 MiB)
14:51:28.532 INFO SparkContext - Created broadcast 193 from broadcast at DAGScheduler.scala:1580
14:51:28.532 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 107 (MapPartitionsRDD[461] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
14:51:28.532 INFO TaskSchedulerImpl - Adding task set 107.0 with 2 tasks resource profile 0
14:51:28.533 INFO TaskSetManager - Starting task 0.0 in stage 107.0 (TID 157) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:28.533 INFO TaskSetManager - Starting task 1.0 in stage 107.0 (TID 158) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:28.533 INFO Executor - Running task 1.0 in stage 107.0 (TID 158)
14:51:28.533 INFO Executor - Running task 0.0 in stage 107.0 (TID 157)
14:51:28.535 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.535 INFO ShuffleBlockFetcherIterator - Getting 1 (160.4 KiB) non-empty blocks including 1 (160.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.535 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.535 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.540 INFO Executor - Finished task 1.0 in stage 107.0 (TID 158). 1634 bytes result sent to driver
14:51:28.540 INFO Executor - Finished task 0.0 in stage 107.0 (TID 157). 1634 bytes result sent to driver
14:51:28.540 INFO TaskSetManager - Finished task 0.0 in stage 107.0 (TID 157) in 7 ms on localhost (executor driver) (1/2)
14:51:28.540 INFO TaskSetManager - Finished task 1.0 in stage 107.0 (TID 158) in 7 ms on localhost (executor driver) (2/2)
14:51:28.540 INFO TaskSchedulerImpl - Removed TaskSet 107.0, whose tasks have all completed, from pool
14:51:28.541 INFO DAGScheduler - ResultStage 107 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
14:51:28.541 INFO DAGScheduler - Job 75 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:28.541 INFO TaskSchedulerImpl - Killing all running tasks in stage 107: Stage finished
14:51:28.541 INFO DAGScheduler - Job 75 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.011679 s
14:51:28.558 INFO FileInputFormat - Total input files to process : 2
14:51:28.562 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:28.562 INFO DAGScheduler - Got job 76 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:28.562 INFO DAGScheduler - Final stage: ResultStage 108 (count at ReadsSparkSinkUnitTest.java:222)
14:51:28.562 INFO DAGScheduler - Parents of final stage: List()
14:51:28.562 INFO DAGScheduler - Missing parents: List()
14:51:28.562 INFO DAGScheduler - Submitting ResultStage 108 (MapPartitionsRDD[477] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:28.592 INFO MemoryStore - Block broadcast_194 stored as values in memory (estimated size 426.1 KiB, free 1915.9 MiB)
14:51:28.593 INFO MemoryStore - Block broadcast_194_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.8 MiB)
14:51:28.594 INFO BlockManagerInfo - Added broadcast_194_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.2 MiB)
14:51:28.594 INFO SparkContext - Created broadcast 194 from broadcast at DAGScheduler.scala:1580
14:51:28.594 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 108 (MapPartitionsRDD[477] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
14:51:28.594 INFO TaskSchedulerImpl - Adding task set 108.0 with 2 tasks resource profile 0
14:51:28.595 INFO TaskSetManager - Starting task 0.0 in stage 108.0 (TID 159) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7826 bytes)
14:51:28.595 INFO TaskSetManager - Starting task 1.0 in stage 108.0 (TID 160) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7826 bytes)
14:51:28.595 INFO Executor - Running task 1.0 in stage 108.0 (TID 160)
14:51:28.595 INFO Executor - Running task 0.0 in stage 108.0 (TID 159)
14:51:28.627 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest37327884812202069286.bam/part-r-00001.bam:0+123314
14:51:28.631 INFO Executor - Finished task 1.0 in stage 108.0 (TID 160). 989 bytes result sent to driver
14:51:28.631 INFO TaskSetManager - Finished task 1.0 in stage 108.0 (TID 160) in 36 ms on localhost (executor driver) (1/2)
14:51:28.642 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest37327884812202069286.bam/part-r-00000.bam:0+122169
14:51:28.646 INFO Executor - Finished task 0.0 in stage 108.0 (TID 159). 989 bytes result sent to driver
14:51:28.647 INFO TaskSetManager - Finished task 0.0 in stage 108.0 (TID 159) in 52 ms on localhost (executor driver) (2/2)
14:51:28.647 INFO TaskSchedulerImpl - Removed TaskSet 108.0, whose tasks have all completed, from pool
14:51:28.647 INFO DAGScheduler - ResultStage 108 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.084 s
14:51:28.647 INFO DAGScheduler - Job 76 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:28.647 INFO TaskSchedulerImpl - Killing all running tasks in stage 108: Stage finished
14:51:28.648 INFO DAGScheduler - Job 76 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.085904 s
14:51:28.650 INFO MemoryStore - Block broadcast_195 stored as values in memory (estimated size 576.0 B, free 1915.8 MiB)
14:51:28.661 INFO MemoryStore - Block broadcast_195_piece0 stored as bytes in memory (estimated size 228.0 B, free 1915.8 MiB)
14:51:28.661 INFO BlockManagerInfo - Added broadcast_195_piece0 in memory on localhost:44923 (size: 228.0 B, free: 1919.2 MiB)
14:51:28.661 INFO BlockManagerInfo - Removed broadcast_190_piece0 on localhost:44923 in memory (size: 154.6 KiB, free: 1919.3 MiB)
14:51:28.662 INFO SparkContext - Created broadcast 195 from broadcast at CramSource.java:114
14:51:28.662 INFO BlockManagerInfo - Removed broadcast_194_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.5 MiB)
14:51:28.663 INFO BlockManagerInfo - Removed broadcast_177_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.5 MiB)
14:51:28.663 INFO MemoryStore - Block broadcast_196 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
14:51:28.663 INFO BlockManagerInfo - Removed broadcast_182_piece0 on localhost:44923 in memory (size: 56.3 KiB, free: 1919.6 MiB)
14:51:28.666 INFO BlockManagerInfo - Removed broadcast_188_piece0 on localhost:44923 in memory (size: 1890.0 B, free: 1919.6 MiB)
14:51:28.667 INFO BlockManagerInfo - Removed broadcast_186_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:28.667 INFO BlockManagerInfo - Removed broadcast_191_piece0 on localhost:44923 in memory (size: 56.2 KiB, free: 1919.7 MiB)
14:51:28.668 INFO BlockManagerInfo - Removed broadcast_184_piece0 on localhost:44923 in memory (size: 3.4 KiB, free: 1919.7 MiB)
14:51:28.668 INFO BlockManagerInfo - Removed broadcast_189_piece0 on localhost:44923 in memory (size: 1890.0 B, free: 1919.7 MiB)
14:51:28.669 INFO BlockManagerInfo - Removed broadcast_183_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:28.670 INFO BlockManagerInfo - Removed broadcast_179_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.7 MiB)
14:51:28.671 INFO BlockManagerInfo - Removed broadcast_180_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.7 MiB)
14:51:28.672 INFO BlockManagerInfo - Removed broadcast_187_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:28.673 INFO BlockManagerInfo - Removed broadcast_192_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:28.674 INFO BlockManagerInfo - Removed broadcast_185_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1920.0 MiB)
14:51:28.674 INFO BlockManagerInfo - Removed broadcast_193_piece0 on localhost:44923 in memory (size: 3.4 KiB, free: 1920.0 MiB)
14:51:28.675 INFO MemoryStore - Block broadcast_196_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.7 MiB)
14:51:28.675 INFO BlockManagerInfo - Added broadcast_196_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1920.0 MiB)
14:51:28.675 INFO SparkContext - Created broadcast 196 from newAPIHadoopFile at PathSplitSource.java:96
14:51:28.693 INFO MemoryStore - Block broadcast_197 stored as values in memory (estimated size 576.0 B, free 1919.7 MiB)
14:51:28.694 INFO MemoryStore - Block broadcast_197_piece0 stored as bytes in memory (estimated size 228.0 B, free 1919.7 MiB)
14:51:28.694 INFO BlockManagerInfo - Added broadcast_197_piece0 in memory on localhost:44923 (size: 228.0 B, free: 1920.0 MiB)
14:51:28.694 INFO SparkContext - Created broadcast 197 from broadcast at CramSource.java:114
14:51:28.695 INFO MemoryStore - Block broadcast_198 stored as values in memory (estimated size 297.9 KiB, free 1919.4 MiB)
14:51:28.701 INFO MemoryStore - Block broadcast_198_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
14:51:28.701 INFO BlockManagerInfo - Added broadcast_198_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:28.702 INFO SparkContext - Created broadcast 198 from newAPIHadoopFile at PathSplitSource.java:96
14:51:28.716 INFO MemoryStore - Block broadcast_199 stored as values in memory (estimated size 6.0 KiB, free 1919.3 MiB)
14:51:28.716 INFO MemoryStore - Block broadcast_199_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
14:51:28.716 INFO BlockManagerInfo - Added broadcast_199_piece0 in memory on localhost:44923 (size: 1473.0 B, free: 1919.9 MiB)
14:51:28.717 INFO SparkContext - Created broadcast 199 from broadcast at ReadsSparkSink.java:133
14:51:28.717 INFO MemoryStore - Block broadcast_200 stored as values in memory (estimated size 6.2 KiB, free 1919.3 MiB)
14:51:28.718 INFO MemoryStore - Block broadcast_200_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
14:51:28.718 INFO BlockManagerInfo - Added broadcast_200_piece0 in memory on localhost:44923 (size: 1473.0 B, free: 1919.9 MiB)
14:51:28.718 INFO SparkContext - Created broadcast 200 from broadcast at AnySamSinkMultiple.java:80
14:51:28.720 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.721 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.721 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.735 INFO FileInputFormat - Total input files to process : 1
14:51:28.741 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:28.742 INFO DAGScheduler - Registering RDD 484 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 23
14:51:28.742 INFO DAGScheduler - Got job 77 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
14:51:28.742 INFO DAGScheduler - Final stage: ResultStage 110 (runJob at SparkHadoopWriter.scala:83)
14:51:28.742 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 109)
14:51:28.742 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 109)
14:51:28.743 INFO DAGScheduler - Submitting ShuffleMapStage 109 (MapPartitionsRDD[484] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:28.755 INFO MemoryStore - Block broadcast_201 stored as values in memory (estimated size 288.4 KiB, free 1919.0 MiB)
14:51:28.757 INFO MemoryStore - Block broadcast_201_piece0 stored as bytes in memory (estimated size 104.7 KiB, free 1918.9 MiB)
14:51:28.757 INFO BlockManagerInfo - Added broadcast_201_piece0 in memory on localhost:44923 (size: 104.7 KiB, free: 1919.8 MiB)
14:51:28.757 INFO SparkContext - Created broadcast 201 from broadcast at DAGScheduler.scala:1580
14:51:28.757 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 109 (MapPartitionsRDD[484] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
14:51:28.757 INFO TaskSchedulerImpl - Adding task set 109.0 with 1 tasks resource profile 0
14:51:28.758 INFO TaskSetManager - Starting task 0.0 in stage 109.0 (TID 161) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
14:51:28.759 INFO Executor - Running task 0.0 in stage 109.0 (TID 161)
14:51:28.783 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
14:51:28.798 INFO Executor - Finished task 0.0 in stage 109.0 (TID 161). 1149 bytes result sent to driver
14:51:28.798 INFO TaskSetManager - Finished task 0.0 in stage 109.0 (TID 161) in 40 ms on localhost (executor driver) (1/1)
14:51:28.798 INFO TaskSchedulerImpl - Removed TaskSet 109.0, whose tasks have all completed, from pool
14:51:28.799 INFO DAGScheduler - ShuffleMapStage 109 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.056 s
14:51:28.799 INFO DAGScheduler - looking for newly runnable stages
14:51:28.799 INFO DAGScheduler - running: HashSet()
14:51:28.799 INFO DAGScheduler - waiting: HashSet(ResultStage 110)
14:51:28.799 INFO DAGScheduler - failed: HashSet()
14:51:28.799 INFO DAGScheduler - Submitting ResultStage 110 (MapPartitionsRDD[495] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
14:51:28.806 INFO MemoryStore - Block broadcast_202 stored as values in memory (estimated size 150.3 KiB, free 1918.8 MiB)
14:51:28.806 INFO MemoryStore - Block broadcast_202_piece0 stored as bytes in memory (estimated size 56.3 KiB, free 1918.7 MiB)
14:51:28.807 INFO BlockManagerInfo - Added broadcast_202_piece0 in memory on localhost:44923 (size: 56.3 KiB, free: 1919.7 MiB)
14:51:28.807 INFO SparkContext - Created broadcast 202 from broadcast at DAGScheduler.scala:1580
14:51:28.807 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 110 (MapPartitionsRDD[495] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
14:51:28.807 INFO TaskSchedulerImpl - Adding task set 110.0 with 2 tasks resource profile 0
14:51:28.808 INFO TaskSetManager - Starting task 0.0 in stage 110.0 (TID 162) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:28.808 INFO TaskSetManager - Starting task 1.0 in stage 110.0 (TID 163) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:28.808 INFO Executor - Running task 1.0 in stage 110.0 (TID 163)
14:51:28.808 INFO Executor - Running task 0.0 in stage 110.0 (TID 162)
14:51:28.813 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.813 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.813 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.813 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.813 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.813 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.815 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.815 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.815 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.815 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:28.815 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:28.815 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:28.828 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.828 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.829 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.829 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.833 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451282522935069820244842_0495_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest514769924029096269302.cram/_temporary/0/task_202603041451282522935069820244842_0495_r_000001
14:51:28.833 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451282522935069820244842_0495_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest514769924029096269302.cram/_temporary/0/task_202603041451282522935069820244842_0495_r_000000
14:51:28.833 INFO SparkHadoopMapRedUtil - attempt_202603041451282522935069820244842_0495_r_000001_0: Committed. Elapsed time: 0 ms.
14:51:28.833 INFO SparkHadoopMapRedUtil - attempt_202603041451282522935069820244842_0495_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:28.834 INFO Executor - Finished task 0.0 in stage 110.0 (TID 162). 1729 bytes result sent to driver
14:51:28.834 INFO Executor - Finished task 1.0 in stage 110.0 (TID 163). 1729 bytes result sent to driver
14:51:28.835 INFO TaskSetManager - Finished task 1.0 in stage 110.0 (TID 163) in 27 ms on localhost (executor driver) (1/2)
14:51:28.835 INFO TaskSetManager - Finished task 0.0 in stage 110.0 (TID 162) in 27 ms on localhost (executor driver) (2/2)
14:51:28.835 INFO TaskSchedulerImpl - Removed TaskSet 110.0, whose tasks have all completed, from pool
14:51:28.835 INFO DAGScheduler - ResultStage 110 (runJob at SparkHadoopWriter.scala:83) finished in 0.035 s
14:51:28.835 INFO DAGScheduler - Job 77 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:28.835 INFO TaskSchedulerImpl - Killing all running tasks in stage 110: Stage finished
14:51:28.835 INFO DAGScheduler - Job 77 finished: runJob at SparkHadoopWriter.scala:83, took 0.093722 s
14:51:28.836 INFO SparkHadoopWriter - Start to commit write Job job_202603041451282522935069820244842_0495.
14:51:28.842 INFO SparkHadoopWriter - Write Job job_202603041451282522935069820244842_0495 committed. Elapsed time: 5 ms.
14:51:28.844 INFO MemoryStore - Block broadcast_203 stored as values in memory (estimated size 297.9 KiB, free 1918.4 MiB)
14:51:28.850 INFO MemoryStore - Block broadcast_203_piece0 stored as bytes in memory (estimated size 50.1 KiB, free 1918.4 MiB)
14:51:28.850 INFO BlockManagerInfo - Added broadcast_203_piece0 in memory on localhost:44923 (size: 50.1 KiB, free: 1919.7 MiB)
14:51:28.851 INFO SparkContext - Created broadcast 203 from newAPIHadoopFile at PathSplitSource.java:96
14:51:28.875 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:28.875 INFO DAGScheduler - Got job 78 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:28.875 INFO DAGScheduler - Final stage: ResultStage 112 (count at ReadsSparkSinkUnitTest.java:222)
14:51:28.875 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 111)
14:51:28.875 INFO DAGScheduler - Missing parents: List()
14:51:28.875 INFO DAGScheduler - Submitting ResultStage 112 (MapPartitionsRDD[487] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:28.876 INFO MemoryStore - Block broadcast_204 stored as values in memory (estimated size 6.3 KiB, free 1918.4 MiB)
14:51:28.877 INFO MemoryStore - Block broadcast_204_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.4 MiB)
14:51:28.877 INFO BlockManagerInfo - Added broadcast_204_piece0 in memory on localhost:44923 (size: 3.4 KiB, free: 1919.7 MiB)
14:51:28.877 INFO SparkContext - Created broadcast 204 from broadcast at DAGScheduler.scala:1580
14:51:28.877 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 112 (MapPartitionsRDD[487] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
14:51:28.877 INFO TaskSchedulerImpl - Adding task set 112.0 with 2 tasks resource profile 0
14:51:28.878 INFO TaskSetManager - Starting task 0.0 in stage 112.0 (TID 164) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:28.878 INFO TaskSetManager - Starting task 1.0 in stage 112.0 (TID 165) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:28.879 INFO Executor - Running task 0.0 in stage 112.0 (TID 164)
14:51:28.879 INFO Executor - Running task 1.0 in stage 112.0 (TID 165)
14:51:28.880 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.880 INFO ShuffleBlockFetcherIterator - Getting 1 (42.2 KiB) non-empty blocks including 1 (42.2 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:28.880 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.880 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:28.884 INFO Executor - Finished task 0.0 in stage 112.0 (TID 164). 1591 bytes result sent to driver
14:51:28.884 INFO Executor - Finished task 1.0 in stage 112.0 (TID 165). 1591 bytes result sent to driver
14:51:28.884 INFO TaskSetManager - Finished task 0.0 in stage 112.0 (TID 164) in 6 ms on localhost (executor driver) (1/2)
14:51:28.884 INFO TaskSetManager - Finished task 1.0 in stage 112.0 (TID 165) in 6 ms on localhost (executor driver) (2/2)
14:51:28.884 INFO TaskSchedulerImpl - Removed TaskSet 112.0, whose tasks have all completed, from pool
14:51:28.886 INFO DAGScheduler - ResultStage 112 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.010 s
14:51:28.886 INFO DAGScheduler - Job 78 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:28.886 INFO TaskSchedulerImpl - Killing all running tasks in stage 112: Stage finished
14:51:28.886 INFO DAGScheduler - Job 78 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.011427 s
14:51:28.902 INFO FileInputFormat - Total input files to process : 2
14:51:28.906 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:28.906 INFO DAGScheduler - Got job 79 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:28.906 INFO DAGScheduler - Final stage: ResultStage 113 (count at ReadsSparkSinkUnitTest.java:222)
14:51:28.906 INFO DAGScheduler - Parents of final stage: List()
14:51:28.906 INFO DAGScheduler - Missing parents: List()
14:51:28.906 INFO DAGScheduler - Submitting ResultStage 113 (MapPartitionsRDD[502] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:28.923 INFO MemoryStore - Block broadcast_205 stored as values in memory (estimated size 426.1 KiB, free 1918.0 MiB)
14:51:28.925 INFO MemoryStore - Block broadcast_205_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.8 MiB)
14:51:28.925 INFO BlockManagerInfo - Added broadcast_205_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:28.925 INFO SparkContext - Created broadcast 205 from broadcast at DAGScheduler.scala:1580
14:51:28.925 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 113 (MapPartitionsRDD[502] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
14:51:28.925 INFO TaskSchedulerImpl - Adding task set 113.0 with 2 tasks resource profile 0
14:51:28.926 INFO TaskSetManager - Starting task 0.0 in stage 113.0 (TID 166) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7828 bytes)
14:51:28.926 INFO TaskSetManager - Starting task 1.0 in stage 113.0 (TID 167) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7828 bytes)
14:51:28.927 INFO Executor - Running task 1.0 in stage 113.0 (TID 167)
14:51:28.927 INFO Executor - Running task 0.0 in stage 113.0 (TID 166)
14:51:28.967 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest514769924029096269302.cram/part-r-00001.bam:0+30825
14:51:28.967 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest514769924029096269302.cram/part-r-00000.bam:0+31473
14:51:28.970 INFO Executor - Finished task 1.0 in stage 113.0 (TID 167). 989 bytes result sent to driver
14:51:28.970 INFO Executor - Finished task 0.0 in stage 113.0 (TID 166). 989 bytes result sent to driver
14:51:28.971 INFO TaskSetManager - Finished task 0.0 in stage 113.0 (TID 166) in 44 ms on localhost (executor driver) (1/2)
14:51:28.971 INFO TaskSetManager - Finished task 1.0 in stage 113.0 (TID 167) in 45 ms on localhost (executor driver) (2/2)
14:51:28.971 INFO TaskSchedulerImpl - Removed TaskSet 113.0, whose tasks have all completed, from pool
14:51:28.971 INFO DAGScheduler - ResultStage 113 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.064 s
14:51:28.971 INFO DAGScheduler - Job 79 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:28.971 INFO TaskSchedulerImpl - Killing all running tasks in stage 113: Stage finished
14:51:28.971 INFO DAGScheduler - Job 79 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.065581 s
14:51:28.975 INFO MemoryStore - Block broadcast_206 stored as values in memory (estimated size 297.9 KiB, free 1917.5 MiB)
14:51:28.981 INFO MemoryStore - Block broadcast_206_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.5 MiB)
14:51:28.982 INFO BlockManagerInfo - Added broadcast_206_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:28.982 INFO SparkContext - Created broadcast 206 from newAPIHadoopFile at PathSplitSource.java:96
14:51:29.010 INFO MemoryStore - Block broadcast_207 stored as values in memory (estimated size 297.9 KiB, free 1917.2 MiB)
14:51:29.018 INFO MemoryStore - Block broadcast_207_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.1 MiB)
14:51:29.018 INFO BlockManagerInfo - Added broadcast_207_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:29.018 INFO SparkContext - Created broadcast 207 from newAPIHadoopFile at PathSplitSource.java:96
14:51:29.039 INFO MemoryStore - Block broadcast_208 stored as values in memory (estimated size 160.7 KiB, free 1917.0 MiB)
14:51:29.040 INFO MemoryStore - Block broadcast_208_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.0 MiB)
14:51:29.040 INFO BlockManagerInfo - Added broadcast_208_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:29.040 INFO SparkContext - Created broadcast 208 from broadcast at ReadsSparkSink.java:133
14:51:29.042 INFO MemoryStore - Block broadcast_209 stored as values in memory (estimated size 163.2 KiB, free 1916.8 MiB)
14:51:29.051 INFO BlockManagerInfo - Removed broadcast_199_piece0 on localhost:44923 in memory (size: 1473.0 B, free: 1919.4 MiB)
14:51:29.051 INFO MemoryStore - Block broadcast_209_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
14:51:29.051 INFO BlockManagerInfo - Added broadcast_209_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:29.051 INFO SparkContext - Created broadcast 209 from broadcast at AnySamSinkMultiple.java:80
14:51:29.051 INFO BlockManagerInfo - Removed broadcast_207_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:29.052 INFO BlockManagerInfo - Removed broadcast_201_piece0 on localhost:44923 in memory (size: 104.7 KiB, free: 1919.6 MiB)
14:51:29.053 INFO BlockManagerInfo - Removed broadcast_196_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:29.054 INFO BlockManagerInfo - Removed broadcast_205_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.8 MiB)
14:51:29.054 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:29.054 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:29.054 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:29.054 INFO BlockManagerInfo - Removed broadcast_198_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:29.055 INFO BlockManagerInfo - Removed broadcast_204_piece0 on localhost:44923 in memory (size: 3.4 KiB, free: 1919.8 MiB)
14:51:29.056 INFO BlockManagerInfo - Removed broadcast_202_piece0 on localhost:44923 in memory (size: 56.3 KiB, free: 1919.9 MiB)
14:51:29.056 INFO BlockManagerInfo - Removed broadcast_195_piece0 on localhost:44923 in memory (size: 228.0 B, free: 1919.9 MiB)
14:51:29.057 INFO BlockManagerInfo - Removed broadcast_203_piece0 on localhost:44923 in memory (size: 50.1 KiB, free: 1919.9 MiB)
14:51:29.058 INFO BlockManagerInfo - Removed broadcast_197_piece0 on localhost:44923 in memory (size: 228.0 B, free: 1919.9 MiB)
14:51:29.058 INFO BlockManagerInfo - Removed broadcast_200_piece0 on localhost:44923 in memory (size: 1473.0 B, free: 1919.9 MiB)
14:51:29.070 INFO FileInputFormat - Total input files to process : 1
14:51:29.077 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:29.077 INFO DAGScheduler - Registering RDD 510 (repartition at ReadsSparkSinkUnitTest.java:210) as input to shuffle 24
14:51:29.077 INFO DAGScheduler - Got job 80 (runJob at SparkHadoopWriter.scala:83) with 2 output partitions
14:51:29.077 INFO DAGScheduler - Final stage: ResultStage 115 (runJob at SparkHadoopWriter.scala:83)
14:51:29.077 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 114)
14:51:29.077 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 114)
14:51:29.078 INFO DAGScheduler - Submitting ShuffleMapStage 114 (MapPartitionsRDD[510] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:29.096 INFO MemoryStore - Block broadcast_210 stored as values in memory (estimated size 427.7 KiB, free 1918.9 MiB)
14:51:29.098 INFO MemoryStore - Block broadcast_210_piece0 stored as bytes in memory (estimated size 154.6 KiB, free 1918.8 MiB)
14:51:29.098 INFO BlockManagerInfo - Added broadcast_210_piece0 in memory on localhost:44923 (size: 154.6 KiB, free: 1919.8 MiB)
14:51:29.098 INFO SparkContext - Created broadcast 210 from broadcast at DAGScheduler.scala:1580
14:51:29.098 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 114 (MapPartitionsRDD[510] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0))
14:51:29.098 INFO TaskSchedulerImpl - Adding task set 114.0 with 1 tasks resource profile 0
14:51:29.099 INFO TaskSetManager - Starting task 0.0 in stage 114.0 (TID 168) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:29.100 INFO Executor - Running task 0.0 in stage 114.0 (TID 168)
14:51:29.133 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:29.154 INFO Executor - Finished task 0.0 in stage 114.0 (TID 168). 1149 bytes result sent to driver
14:51:29.155 INFO TaskSetManager - Finished task 0.0 in stage 114.0 (TID 168) in 56 ms on localhost (executor driver) (1/1)
14:51:29.155 INFO TaskSchedulerImpl - Removed TaskSet 114.0, whose tasks have all completed, from pool
14:51:29.155 INFO DAGScheduler - ShuffleMapStage 114 (repartition at ReadsSparkSinkUnitTest.java:210) finished in 0.077 s
14:51:29.155 INFO DAGScheduler - looking for newly runnable stages
14:51:29.155 INFO DAGScheduler - running: HashSet()
14:51:29.155 INFO DAGScheduler - waiting: HashSet(ResultStage 115)
14:51:29.155 INFO DAGScheduler - failed: HashSet()
14:51:29.156 INFO DAGScheduler - Submitting ResultStage 115 (MapPartitionsRDD[522] at mapToPair at AnySamSinkMultiple.java:89), which has no missing parents
14:51:29.162 INFO MemoryStore - Block broadcast_211 stored as values in memory (estimated size 150.2 KiB, free 1918.6 MiB)
14:51:29.163 INFO MemoryStore - Block broadcast_211_piece0 stored as bytes in memory (estimated size 56.2 KiB, free 1918.6 MiB)
14:51:29.163 INFO BlockManagerInfo - Added broadcast_211_piece0 in memory on localhost:44923 (size: 56.2 KiB, free: 1919.7 MiB)
14:51:29.164 INFO SparkContext - Created broadcast 211 from broadcast at DAGScheduler.scala:1580
14:51:29.164 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 115 (MapPartitionsRDD[522] at mapToPair at AnySamSinkMultiple.java:89) (first 15 tasks are for partitions Vector(0, 1))
14:51:29.164 INFO TaskSchedulerImpl - Adding task set 115.0 with 2 tasks resource profile 0
14:51:29.165 INFO TaskSetManager - Starting task 0.0 in stage 115.0 (TID 169) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:29.165 INFO TaskSetManager - Starting task 1.0 in stage 115.0 (TID 170) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:29.165 INFO Executor - Running task 1.0 in stage 115.0 (TID 170)
14:51:29.165 INFO Executor - Running task 0.0 in stage 115.0 (TID 169)
14:51:29.170 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:29.170 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:29.170 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:29.170 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:29.170 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:29.170 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:29.172 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:29.172 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:29.172 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:29.172 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:29.172 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:29.172 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:29.184 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:29.184 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:29.188 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:29.188 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:29.192 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451294909113006290460850_0522_r_000001_0' to file:/tmp/ReadsSparkSinkUnitTest615008829082042863934.sam/_temporary/0/task_202603041451294909113006290460850_0522_r_000001
14:51:29.193 INFO SparkHadoopMapRedUtil - attempt_202603041451294909113006290460850_0522_r_000001_0: Committed. Elapsed time: 0 ms.
14:51:29.193 INFO Executor - Finished task 1.0 in stage 115.0 (TID 170). 1729 bytes result sent to driver
14:51:29.194 INFO TaskSetManager - Finished task 1.0 in stage 115.0 (TID 170) in 29 ms on localhost (executor driver) (1/2)
14:51:29.196 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451294909113006290460850_0522_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest615008829082042863934.sam/_temporary/0/task_202603041451294909113006290460850_0522_r_000000
14:51:29.196 INFO SparkHadoopMapRedUtil - attempt_202603041451294909113006290460850_0522_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:29.196 INFO Executor - Finished task 0.0 in stage 115.0 (TID 169). 1729 bytes result sent to driver
14:51:29.197 INFO TaskSetManager - Finished task 0.0 in stage 115.0 (TID 169) in 32 ms on localhost (executor driver) (2/2)
14:51:29.197 INFO TaskSchedulerImpl - Removed TaskSet 115.0, whose tasks have all completed, from pool
14:51:29.197 INFO DAGScheduler - ResultStage 115 (runJob at SparkHadoopWriter.scala:83) finished in 0.041 s
14:51:29.197 INFO DAGScheduler - Job 80 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:29.197 INFO TaskSchedulerImpl - Killing all running tasks in stage 115: Stage finished
14:51:29.197 INFO DAGScheduler - Job 80 finished: runJob at SparkHadoopWriter.scala:83, took 0.120833 s
14:51:29.198 INFO SparkHadoopWriter - Start to commit write Job job_202603041451294909113006290460850_0522.
14:51:29.205 INFO SparkHadoopWriter - Write Job job_202603041451294909113006290460850_0522 committed. Elapsed time: 7 ms.
14:51:29.208 INFO MemoryStore - Block broadcast_212 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
14:51:29.214 INFO MemoryStore - Block broadcast_212_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
14:51:29.214 INFO BlockManagerInfo - Added broadcast_212_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:29.214 INFO SparkContext - Created broadcast 212 from newAPIHadoopFile at PathSplitSource.java:96
14:51:29.238 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:29.239 INFO DAGScheduler - Got job 81 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:29.239 INFO DAGScheduler - Final stage: ResultStage 117 (count at ReadsSparkSinkUnitTest.java:222)
14:51:29.239 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 116)
14:51:29.239 INFO DAGScheduler - Missing parents: List()
14:51:29.239 INFO DAGScheduler - Submitting ResultStage 117 (MapPartitionsRDD[513] at repartition at ReadsSparkSinkUnitTest.java:210), which has no missing parents
14:51:29.240 INFO MemoryStore - Block broadcast_213 stored as values in memory (estimated size 6.3 KiB, free 1918.2 MiB)
14:51:29.241 INFO MemoryStore - Block broadcast_213_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1918.2 MiB)
14:51:29.241 INFO BlockManagerInfo - Added broadcast_213_piece0 in memory on localhost:44923 (size: 3.4 KiB, free: 1919.7 MiB)
14:51:29.241 INFO SparkContext - Created broadcast 213 from broadcast at DAGScheduler.scala:1580
14:51:29.241 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 117 (MapPartitionsRDD[513] at repartition at ReadsSparkSinkUnitTest.java:210) (first 15 tasks are for partitions Vector(0, 1))
14:51:29.241 INFO TaskSchedulerImpl - Adding task set 117.0 with 2 tasks resource profile 0
14:51:29.242 INFO TaskSetManager - Starting task 0.0 in stage 117.0 (TID 171) (localhost, executor driver, partition 0, NODE_LOCAL, 7860 bytes)
14:51:29.242 INFO TaskSetManager - Starting task 1.0 in stage 117.0 (TID 172) (localhost, executor driver, partition 1, NODE_LOCAL, 7860 bytes)
14:51:29.242 INFO Executor - Running task 0.0 in stage 117.0 (TID 171)
14:51:29.242 INFO Executor - Running task 1.0 in stage 117.0 (TID 172)
14:51:29.244 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:29.244 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:29.244 INFO ShuffleBlockFetcherIterator - Getting 1 (176.4 KiB) non-empty blocks including 1 (176.4 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:29.244 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:29.248 INFO Executor - Finished task 1.0 in stage 117.0 (TID 172). 1591 bytes result sent to driver
14:51:29.248 INFO TaskSetManager - Finished task 1.0 in stage 117.0 (TID 172) in 6 ms on localhost (executor driver) (1/2)
14:51:29.249 INFO Executor - Finished task 0.0 in stage 117.0 (TID 171). 1591 bytes result sent to driver
14:51:29.249 INFO TaskSetManager - Finished task 0.0 in stage 117.0 (TID 171) in 7 ms on localhost (executor driver) (2/2)
14:51:29.249 INFO TaskSchedulerImpl - Removed TaskSet 117.0, whose tasks have all completed, from pool
14:51:29.249 INFO DAGScheduler - ResultStage 117 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.009 s
14:51:29.249 INFO DAGScheduler - Job 81 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:29.250 INFO TaskSchedulerImpl - Killing all running tasks in stage 117: Stage finished
14:51:29.250 INFO DAGScheduler - Job 81 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.011289 s
14:51:29.264 INFO FileInputFormat - Total input files to process : 2
14:51:29.268 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:222
14:51:29.268 INFO DAGScheduler - Got job 82 (count at ReadsSparkSinkUnitTest.java:222) with 2 output partitions
14:51:29.268 INFO DAGScheduler - Final stage: ResultStage 118 (count at ReadsSparkSinkUnitTest.java:222)
14:51:29.268 INFO DAGScheduler - Parents of final stage: List()
14:51:29.269 INFO DAGScheduler - Missing parents: List()
14:51:29.269 INFO DAGScheduler - Submitting ResultStage 118 (MapPartitionsRDD[529] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:29.286 INFO MemoryStore - Block broadcast_214 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
14:51:29.287 INFO MemoryStore - Block broadcast_214_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
14:51:29.288 INFO BlockManagerInfo - Added broadcast_214_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:29.288 INFO SparkContext - Created broadcast 214 from broadcast at DAGScheduler.scala:1580
14:51:29.288 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 118 (MapPartitionsRDD[529] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0, 1))
14:51:29.288 INFO TaskSchedulerImpl - Adding task set 118.0 with 2 tasks resource profile 0
14:51:29.289 INFO TaskSetManager - Starting task 0.0 in stage 118.0 (TID 173) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7827 bytes)
14:51:29.289 INFO TaskSetManager - Starting task 1.0 in stage 118.0 (TID 174) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7827 bytes)
14:51:29.289 INFO Executor - Running task 1.0 in stage 118.0 (TID 174)
14:51:29.289 INFO Executor - Running task 0.0 in stage 118.0 (TID 173)
14:51:29.327 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest615008829082042863934.sam/part-r-00001.bam:0+129330
14:51:29.327 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest615008829082042863934.sam/part-r-00000.bam:0+132492
14:51:29.339 INFO Executor - Finished task 1.0 in stage 118.0 (TID 174). 989 bytes result sent to driver
14:51:29.339 INFO TaskSetManager - Finished task 1.0 in stage 118.0 (TID 174) in 50 ms on localhost (executor driver) (1/2)
14:51:29.341 INFO Executor - Finished task 0.0 in stage 118.0 (TID 173). 989 bytes result sent to driver
14:51:29.341 INFO TaskSetManager - Finished task 0.0 in stage 118.0 (TID 173) in 53 ms on localhost (executor driver) (2/2)
14:51:29.341 INFO TaskSchedulerImpl - Removed TaskSet 118.0, whose tasks have all completed, from pool
14:51:29.341 INFO DAGScheduler - ResultStage 118 (count at ReadsSparkSinkUnitTest.java:222) finished in 0.072 s
14:51:29.341 INFO DAGScheduler - Job 82 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:29.341 INFO TaskSchedulerImpl - Killing all running tasks in stage 118: Stage finished
14:51:29.341 INFO DAGScheduler - Job 82 finished: count at ReadsSparkSinkUnitTest.java:222, took 0.073525 s
14:51:29.346 INFO MemoryStore - Block broadcast_215 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
14:51:29.357 INFO MemoryStore - Block broadcast_215_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
14:51:29.357 INFO BlockManagerInfo - Added broadcast_215_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:29.357 INFO SparkContext - Created broadcast 215 from newAPIHadoopFile at PathSplitSource.java:96
14:51:29.386 INFO MemoryStore - Block broadcast_216 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
14:51:29.392 INFO MemoryStore - Block broadcast_216_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
14:51:29.392 INFO BlockManagerInfo - Added broadcast_216_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:29.393 INFO SparkContext - Created broadcast 216 from newAPIHadoopFile at PathSplitSource.java:96
14:51:29.413 INFO FileInputFormat - Total input files to process : 1
14:51:29.415 INFO MemoryStore - Block broadcast_217 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
14:51:29.416 INFO MemoryStore - Block broadcast_217_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
14:51:29.416 INFO BlockManagerInfo - Added broadcast_217_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:29.417 INFO SparkContext - Created broadcast 217 from broadcast at ReadsSparkSink.java:133
14:51:29.418 INFO MemoryStore - Block broadcast_218 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
14:51:29.419 INFO MemoryStore - Block broadcast_218_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
14:51:29.419 INFO BlockManagerInfo - Added broadcast_218_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:29.419 INFO SparkContext - Created broadcast 218 from broadcast at BamSink.java:76
14:51:29.422 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:29.422 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:29.422 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:29.445 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:29.445 INFO DAGScheduler - Registering RDD 543 (mapToPair at SparkUtils.java:161) as input to shuffle 25
14:51:29.446 INFO DAGScheduler - Got job 83 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:29.446 INFO DAGScheduler - Final stage: ResultStage 120 (runJob at SparkHadoopWriter.scala:83)
14:51:29.446 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 119)
14:51:29.446 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 119)
14:51:29.446 INFO DAGScheduler - Submitting ShuffleMapStage 119 (MapPartitionsRDD[543] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:29.463 INFO MemoryStore - Block broadcast_219 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
14:51:29.465 INFO MemoryStore - Block broadcast_219_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.0 MiB)
14:51:29.465 INFO BlockManagerInfo - Added broadcast_219_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.2 MiB)
14:51:29.465 INFO SparkContext - Created broadcast 219 from broadcast at DAGScheduler.scala:1580
14:51:29.466 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 119 (MapPartitionsRDD[543] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:29.466 INFO TaskSchedulerImpl - Adding task set 119.0 with 1 tasks resource profile 0
14:51:29.466 INFO TaskSetManager - Starting task 0.0 in stage 119.0 (TID 175) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:29.467 INFO Executor - Running task 0.0 in stage 119.0 (TID 175)
14:51:29.501 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:29.518 INFO Executor - Finished task 0.0 in stage 119.0 (TID 175). 1148 bytes result sent to driver
14:51:29.518 INFO TaskSetManager - Finished task 0.0 in stage 119.0 (TID 175) in 52 ms on localhost (executor driver) (1/1)
14:51:29.518 INFO TaskSchedulerImpl - Removed TaskSet 119.0, whose tasks have all completed, from pool
14:51:29.519 INFO DAGScheduler - ShuffleMapStage 119 (mapToPair at SparkUtils.java:161) finished in 0.072 s
14:51:29.519 INFO DAGScheduler - looking for newly runnable stages
14:51:29.519 INFO DAGScheduler - running: HashSet()
14:51:29.519 INFO DAGScheduler - waiting: HashSet(ResultStage 120)
14:51:29.519 INFO DAGScheduler - failed: HashSet()
14:51:29.519 INFO DAGScheduler - Submitting ResultStage 120 (MapPartitionsRDD[548] at mapToPair at BamSink.java:91), which has no missing parents
14:51:29.527 INFO MemoryStore - Block broadcast_220 stored as values in memory (estimated size 241.4 KiB, free 1915.7 MiB)
14:51:29.528 INFO MemoryStore - Block broadcast_220_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.7 MiB)
14:51:29.528 INFO BlockManagerInfo - Added broadcast_220_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.2 MiB)
14:51:29.529 INFO SparkContext - Created broadcast 220 from broadcast at DAGScheduler.scala:1580
14:51:29.529 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 120 (MapPartitionsRDD[548] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:29.529 INFO TaskSchedulerImpl - Adding task set 120.0 with 1 tasks resource profile 0
14:51:29.530 INFO TaskSetManager - Starting task 0.0 in stage 120.0 (TID 176) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:29.530 INFO Executor - Running task 0.0 in stage 120.0 (TID 176)
14:51:29.535 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:29.535 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:29.550 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:29.550 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:29.550 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:29.550 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:29.550 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:29.550 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:29.582 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451295947745270557596919_0548_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest188333678568964105.bam.parts/_temporary/0/task_202603041451295947745270557596919_0548_r_000000
14:51:29.582 INFO SparkHadoopMapRedUtil - attempt_202603041451295947745270557596919_0548_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:29.582 INFO Executor - Finished task 0.0 in stage 120.0 (TID 176). 1858 bytes result sent to driver
14:51:29.583 INFO TaskSetManager - Finished task 0.0 in stage 120.0 (TID 176) in 54 ms on localhost (executor driver) (1/1)
14:51:29.583 INFO TaskSchedulerImpl - Removed TaskSet 120.0, whose tasks have all completed, from pool
14:51:29.583 INFO DAGScheduler - ResultStage 120 (runJob at SparkHadoopWriter.scala:83) finished in 0.064 s
14:51:29.583 INFO DAGScheduler - Job 83 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:29.583 INFO TaskSchedulerImpl - Killing all running tasks in stage 120: Stage finished
14:51:29.584 INFO DAGScheduler - Job 83 finished: runJob at SparkHadoopWriter.scala:83, took 0.138769 s
14:51:29.584 INFO SparkHadoopWriter - Start to commit write Job job_202603041451295947745270557596919_0548.
14:51:29.591 INFO SparkHadoopWriter - Write Job job_202603041451295947745270557596919_0548 committed. Elapsed time: 6 ms.
14:51:29.606 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest188333678568964105.bam
14:51:29.611 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest188333678568964105.bam done
14:51:29.611 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest188333678568964105.bam.parts/ to /tmp/ReadsSparkSinkUnitTest188333678568964105.bam.sbi
14:51:29.618 INFO IndexFileMerger - Done merging .sbi files
14:51:29.618 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest188333678568964105.bam.parts/ to /tmp/ReadsSparkSinkUnitTest188333678568964105.bam.bai
14:51:29.625 INFO IndexFileMerger - Done merging .bai files
14:51:29.627 INFO MemoryStore - Block broadcast_221 stored as values in memory (estimated size 320.0 B, free 1915.7 MiB)
14:51:29.628 INFO MemoryStore - Block broadcast_221_piece0 stored as bytes in memory (estimated size 233.0 B, free 1915.7 MiB)
14:51:29.628 INFO BlockManagerInfo - Added broadcast_221_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.2 MiB)
14:51:29.628 INFO SparkContext - Created broadcast 221 from broadcast at BamSource.java:104
14:51:29.629 INFO MemoryStore - Block broadcast_222 stored as values in memory (estimated size 297.9 KiB, free 1915.4 MiB)
14:51:29.639 INFO BlockManagerInfo - Removed broadcast_218_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:29.640 INFO BlockManagerInfo - Removed broadcast_212_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.2 MiB)
14:51:29.641 INFO BlockManagerInfo - Removed broadcast_216_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:29.642 INFO BlockManagerInfo - Removed broadcast_219_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.4 MiB)
14:51:29.642 INFO BlockManagerInfo - Removed broadcast_209_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:29.643 INFO BlockManagerInfo - Removed broadcast_220_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.5 MiB)
14:51:29.643 INFO BlockManagerInfo - Removed broadcast_210_piece0 on localhost:44923 in memory (size: 154.6 KiB, free: 1919.7 MiB)
14:51:29.644 INFO BlockManagerInfo - Removed broadcast_217_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.7 MiB)
14:51:29.644 INFO BlockManagerInfo - Removed broadcast_214_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.8 MiB)
14:51:29.645 INFO BlockManagerInfo - Removed broadcast_208_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:29.645 INFO BlockManagerInfo - Removed broadcast_213_piece0 on localhost:44923 in memory (size: 3.4 KiB, free: 1919.8 MiB)
14:51:29.646 INFO BlockManagerInfo - Removed broadcast_211_piece0 on localhost:44923 in memory (size: 56.2 KiB, free: 1919.9 MiB)
14:51:29.646 INFO BlockManagerInfo - Removed broadcast_206_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1920.0 MiB)
14:51:29.647 INFO MemoryStore - Block broadcast_222_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
14:51:29.647 INFO BlockManagerInfo - Added broadcast_222_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:29.648 INFO SparkContext - Created broadcast 222 from newAPIHadoopFile at PathSplitSource.java:96
14:51:29.663 INFO FileInputFormat - Total input files to process : 1
14:51:29.677 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:29.678 INFO DAGScheduler - Got job 84 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:29.678 INFO DAGScheduler - Final stage: ResultStage 121 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:29.678 INFO DAGScheduler - Parents of final stage: List()
14:51:29.678 INFO DAGScheduler - Missing parents: List()
14:51:29.678 INFO DAGScheduler - Submitting ResultStage 121 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:29.685 INFO MemoryStore - Block broadcast_223 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
14:51:29.685 INFO MemoryStore - Block broadcast_223_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
14:51:29.686 INFO BlockManagerInfo - Added broadcast_223_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.8 MiB)
14:51:29.686 INFO SparkContext - Created broadcast 223 from broadcast at DAGScheduler.scala:1580
14:51:29.686 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 121 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:29.686 INFO TaskSchedulerImpl - Adding task set 121.0 with 1 tasks resource profile 0
14:51:29.687 INFO TaskSetManager - Starting task 0.0 in stage 121.0 (TID 177) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7807 bytes)
14:51:29.687 INFO Executor - Running task 0.0 in stage 121.0 (TID 177)
14:51:29.700 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest188333678568964105.bam:0+237038
14:51:29.705 INFO Executor - Finished task 0.0 in stage 121.0 (TID 177). 651483 bytes result sent to driver
14:51:29.706 INFO TaskSetManager - Finished task 0.0 in stage 121.0 (TID 177) in 19 ms on localhost (executor driver) (1/1)
14:51:29.707 INFO TaskSchedulerImpl - Removed TaskSet 121.0, whose tasks have all completed, from pool
14:51:29.707 INFO DAGScheduler - ResultStage 121 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.028 s
14:51:29.707 INFO DAGScheduler - Job 84 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:29.707 INFO TaskSchedulerImpl - Killing all running tasks in stage 121: Stage finished
14:51:29.707 INFO DAGScheduler - Job 84 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.029403 s
14:51:29.722 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:29.723 INFO DAGScheduler - Got job 85 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:29.723 INFO DAGScheduler - Final stage: ResultStage 122 (count at ReadsSparkSinkUnitTest.java:185)
14:51:29.723 INFO DAGScheduler - Parents of final stage: List()
14:51:29.723 INFO DAGScheduler - Missing parents: List()
14:51:29.723 INFO DAGScheduler - Submitting ResultStage 122 (MapPartitionsRDD[536] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:29.740 INFO MemoryStore - Block broadcast_224 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
14:51:29.742 INFO MemoryStore - Block broadcast_224_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
14:51:29.742 INFO BlockManagerInfo - Added broadcast_224_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.7 MiB)
14:51:29.742 INFO SparkContext - Created broadcast 224 from broadcast at DAGScheduler.scala:1580
14:51:29.742 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 122 (MapPartitionsRDD[536] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:29.742 INFO TaskSchedulerImpl - Adding task set 122.0 with 1 tasks resource profile 0
14:51:29.743 INFO TaskSetManager - Starting task 0.0 in stage 122.0 (TID 178) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:29.743 INFO Executor - Running task 0.0 in stage 122.0 (TID 178)
14:51:29.778 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:29.789 INFO Executor - Finished task 0.0 in stage 122.0 (TID 178). 989 bytes result sent to driver
14:51:29.789 INFO TaskSetManager - Finished task 0.0 in stage 122.0 (TID 178) in 46 ms on localhost (executor driver) (1/1)
14:51:29.789 INFO TaskSchedulerImpl - Removed TaskSet 122.0, whose tasks have all completed, from pool
14:51:29.790 INFO DAGScheduler - ResultStage 122 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.067 s
14:51:29.790 INFO DAGScheduler - Job 85 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:29.790 INFO TaskSchedulerImpl - Killing all running tasks in stage 122: Stage finished
14:51:29.790 INFO DAGScheduler - Job 85 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.067374 s
14:51:29.794 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:29.794 INFO DAGScheduler - Got job 86 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:29.794 INFO DAGScheduler - Final stage: ResultStage 123 (count at ReadsSparkSinkUnitTest.java:185)
14:51:29.794 INFO DAGScheduler - Parents of final stage: List()
14:51:29.794 INFO DAGScheduler - Missing parents: List()
14:51:29.794 INFO DAGScheduler - Submitting ResultStage 123 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:29.801 INFO MemoryStore - Block broadcast_225 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
14:51:29.802 INFO MemoryStore - Block broadcast_225_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.4 MiB)
14:51:29.802 INFO BlockManagerInfo - Added broadcast_225_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.6 MiB)
14:51:29.802 INFO SparkContext - Created broadcast 225 from broadcast at DAGScheduler.scala:1580
14:51:29.803 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 123 (MapPartitionsRDD[554] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:29.803 INFO TaskSchedulerImpl - Adding task set 123.0 with 1 tasks resource profile 0
14:51:29.803 INFO TaskSetManager - Starting task 0.0 in stage 123.0 (TID 179) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7807 bytes)
14:51:29.804 INFO Executor - Running task 0.0 in stage 123.0 (TID 179)
14:51:29.816 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest188333678568964105.bam:0+237038
14:51:29.820 INFO Executor - Finished task 0.0 in stage 123.0 (TID 179). 989 bytes result sent to driver
14:51:29.820 INFO TaskSetManager - Finished task 0.0 in stage 123.0 (TID 179) in 17 ms on localhost (executor driver) (1/1)
14:51:29.821 INFO TaskSchedulerImpl - Removed TaskSet 123.0, whose tasks have all completed, from pool
14:51:29.821 INFO DAGScheduler - ResultStage 123 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
14:51:29.821 INFO DAGScheduler - Job 86 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:29.821 INFO TaskSchedulerImpl - Killing all running tasks in stage 123: Stage finished
14:51:29.821 INFO DAGScheduler - Job 86 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.027432 s
14:51:29.824 INFO MemoryStore - Block broadcast_226 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:29.830 INFO MemoryStore - Block broadcast_226_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:29.831 INFO BlockManagerInfo - Added broadcast_226_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:29.831 INFO SparkContext - Created broadcast 226 from newAPIHadoopFile at PathSplitSource.java:96
14:51:29.853 INFO MemoryStore - Block broadcast_227 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
14:51:29.860 INFO MemoryStore - Block broadcast_227_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
14:51:29.860 INFO BlockManagerInfo - Added broadcast_227_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:29.860 INFO SparkContext - Created broadcast 227 from newAPIHadoopFile at PathSplitSource.java:96
14:51:29.888 INFO FileInputFormat - Total input files to process : 1
14:51:29.890 INFO MemoryStore - Block broadcast_228 stored as values in memory (estimated size 160.7 KiB, free 1917.5 MiB)
14:51:29.891 INFO MemoryStore - Block broadcast_228_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.5 MiB)
14:51:29.891 INFO BlockManagerInfo - Added broadcast_228_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.5 MiB)
14:51:29.891 INFO SparkContext - Created broadcast 228 from broadcast at ReadsSparkSink.java:133
14:51:29.893 INFO MemoryStore - Block broadcast_229 stored as values in memory (estimated size 163.2 KiB, free 1917.4 MiB)
14:51:29.893 INFO MemoryStore - Block broadcast_229_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.3 MiB)
14:51:29.894 INFO BlockManagerInfo - Added broadcast_229_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.5 MiB)
14:51:29.894 INFO SparkContext - Created broadcast 229 from broadcast at BamSink.java:76
14:51:29.896 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:29.896 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:29.896 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:29.916 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:29.917 INFO DAGScheduler - Registering RDD 568 (mapToPair at SparkUtils.java:161) as input to shuffle 26
14:51:29.917 INFO DAGScheduler - Got job 87 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:29.917 INFO DAGScheduler - Final stage: ResultStage 125 (runJob at SparkHadoopWriter.scala:83)
14:51:29.917 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 124)
14:51:29.917 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 124)
14:51:29.917 INFO DAGScheduler - Submitting ShuffleMapStage 124 (MapPartitionsRDD[568] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:29.935 INFO MemoryStore - Block broadcast_230 stored as values in memory (estimated size 520.4 KiB, free 1916.8 MiB)
14:51:29.936 INFO MemoryStore - Block broadcast_230_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.7 MiB)
14:51:29.936 INFO BlockManagerInfo - Added broadcast_230_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.4 MiB)
14:51:29.936 INFO SparkContext - Created broadcast 230 from broadcast at DAGScheduler.scala:1580
14:51:29.937 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 124 (MapPartitionsRDD[568] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:29.937 INFO TaskSchedulerImpl - Adding task set 124.0 with 1 tasks resource profile 0
14:51:29.937 INFO TaskSetManager - Starting task 0.0 in stage 124.0 (TID 180) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:29.938 INFO Executor - Running task 0.0 in stage 124.0 (TID 180)
14:51:29.971 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:29.989 INFO Executor - Finished task 0.0 in stage 124.0 (TID 180). 1148 bytes result sent to driver
14:51:29.989 INFO TaskSetManager - Finished task 0.0 in stage 124.0 (TID 180) in 52 ms on localhost (executor driver) (1/1)
14:51:29.989 INFO TaskSchedulerImpl - Removed TaskSet 124.0, whose tasks have all completed, from pool
14:51:29.990 INFO DAGScheduler - ShuffleMapStage 124 (mapToPair at SparkUtils.java:161) finished in 0.073 s
14:51:29.990 INFO DAGScheduler - looking for newly runnable stages
14:51:29.990 INFO DAGScheduler - running: HashSet()
14:51:29.990 INFO DAGScheduler - waiting: HashSet(ResultStage 125)
14:51:29.990 INFO DAGScheduler - failed: HashSet()
14:51:29.990 INFO DAGScheduler - Submitting ResultStage 125 (MapPartitionsRDD[573] at mapToPair at BamSink.java:91), which has no missing parents
14:51:29.997 INFO MemoryStore - Block broadcast_231 stored as values in memory (estimated size 241.4 KiB, free 1916.4 MiB)
14:51:29.998 INFO MemoryStore - Block broadcast_231_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.4 MiB)
14:51:29.998 INFO BlockManagerInfo - Added broadcast_231_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.3 MiB)
14:51:29.998 INFO SparkContext - Created broadcast 231 from broadcast at DAGScheduler.scala:1580
14:51:29.998 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 125 (MapPartitionsRDD[573] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:29.998 INFO TaskSchedulerImpl - Adding task set 125.0 with 1 tasks resource profile 0
14:51:29.999 INFO TaskSetManager - Starting task 0.0 in stage 125.0 (TID 181) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:29.999 INFO Executor - Running task 0.0 in stage 125.0 (TID 181)
14:51:30.007 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:30.007 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:30.022 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:30.022 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:30.022 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:30.022 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:30.022 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:30.022 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:30.051 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451293684511270050505486_0573_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest17101028114322109348.bam.parts/_temporary/0/task_202603041451293684511270050505486_0573_r_000000
14:51:30.051 INFO SparkHadoopMapRedUtil - attempt_202603041451293684511270050505486_0573_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:30.052 INFO Executor - Finished task 0.0 in stage 125.0 (TID 181). 1858 bytes result sent to driver
14:51:30.052 INFO TaskSetManager - Finished task 0.0 in stage 125.0 (TID 181) in 53 ms on localhost (executor driver) (1/1)
14:51:30.052 INFO TaskSchedulerImpl - Removed TaskSet 125.0, whose tasks have all completed, from pool
14:51:30.053 INFO DAGScheduler - ResultStage 125 (runJob at SparkHadoopWriter.scala:83) finished in 0.063 s
14:51:30.053 INFO DAGScheduler - Job 87 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:30.053 INFO TaskSchedulerImpl - Killing all running tasks in stage 125: Stage finished
14:51:30.053 INFO DAGScheduler - Job 87 finished: runJob at SparkHadoopWriter.scala:83, took 0.136861 s
14:51:30.053 INFO SparkHadoopWriter - Start to commit write Job job_202603041451293684511270050505486_0573.
14:51:30.059 INFO SparkHadoopWriter - Write Job job_202603041451293684511270050505486_0573 committed. Elapsed time: 6 ms.
14:51:30.073 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest17101028114322109348.bam
14:51:30.079 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest17101028114322109348.bam done
14:51:30.079 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest17101028114322109348.bam.parts/ to /tmp/ReadsSparkSinkUnitTest17101028114322109348.bam.sbi
14:51:30.085 INFO IndexFileMerger - Done merging .sbi files
14:51:30.085 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest17101028114322109348.bam.parts/ to /tmp/ReadsSparkSinkUnitTest17101028114322109348.bam.bai
14:51:30.092 INFO IndexFileMerger - Done merging .bai files
14:51:30.095 INFO MemoryStore - Block broadcast_232 stored as values in memory (estimated size 13.3 KiB, free 1916.4 MiB)
14:51:30.095 INFO MemoryStore - Block broadcast_232_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1916.3 MiB)
14:51:30.096 INFO BlockManagerInfo - Added broadcast_232_piece0 in memory on localhost:44923 (size: 8.3 KiB, free: 1919.3 MiB)
14:51:30.096 INFO SparkContext - Created broadcast 232 from broadcast at BamSource.java:104
14:51:30.097 INFO MemoryStore - Block broadcast_233 stored as values in memory (estimated size 297.9 KiB, free 1916.1 MiB)
14:51:30.108 INFO MemoryStore - Block broadcast_233_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
14:51:30.108 INFO BlockManagerInfo - Added broadcast_233_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.2 MiB)
14:51:30.108 INFO SparkContext - Created broadcast 233 from newAPIHadoopFile at PathSplitSource.java:96
14:51:30.120 INFO FileInputFormat - Total input files to process : 1
14:51:30.134 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:30.135 INFO DAGScheduler - Got job 88 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:30.135 INFO DAGScheduler - Final stage: ResultStage 126 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:30.135 INFO DAGScheduler - Parents of final stage: List()
14:51:30.135 INFO DAGScheduler - Missing parents: List()
14:51:30.135 INFO DAGScheduler - Submitting ResultStage 126 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:30.141 INFO MemoryStore - Block broadcast_234 stored as values in memory (estimated size 148.2 KiB, free 1915.9 MiB)
14:51:30.142 INFO MemoryStore - Block broadcast_234_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.8 MiB)
14:51:30.142 INFO BlockManagerInfo - Added broadcast_234_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.2 MiB)
14:51:30.143 INFO SparkContext - Created broadcast 234 from broadcast at DAGScheduler.scala:1580
14:51:30.143 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 126 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:30.143 INFO TaskSchedulerImpl - Adding task set 126.0 with 1 tasks resource profile 0
14:51:30.144 INFO TaskSetManager - Starting task 0.0 in stage 126.0 (TID 182) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:30.144 INFO Executor - Running task 0.0 in stage 126.0 (TID 182)
14:51:30.156 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17101028114322109348.bam:0+237038
14:51:30.161 INFO Executor - Finished task 0.0 in stage 126.0 (TID 182). 651483 bytes result sent to driver
14:51:30.163 INFO TaskSetManager - Finished task 0.0 in stage 126.0 (TID 182) in 20 ms on localhost (executor driver) (1/1)
14:51:30.163 INFO TaskSchedulerImpl - Removed TaskSet 126.0, whose tasks have all completed, from pool
14:51:30.163 INFO DAGScheduler - ResultStage 126 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.027 s
14:51:30.163 INFO DAGScheduler - Job 88 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:30.163 INFO TaskSchedulerImpl - Killing all running tasks in stage 126: Stage finished
14:51:30.163 INFO DAGScheduler - Job 88 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.028911 s
14:51:30.173 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:30.173 INFO DAGScheduler - Got job 89 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:30.173 INFO DAGScheduler - Final stage: ResultStage 127 (count at ReadsSparkSinkUnitTest.java:185)
14:51:30.173 INFO DAGScheduler - Parents of final stage: List()
14:51:30.173 INFO DAGScheduler - Missing parents: List()
14:51:30.174 INFO DAGScheduler - Submitting ResultStage 127 (MapPartitionsRDD[561] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:30.190 INFO MemoryStore - Block broadcast_235 stored as values in memory (estimated size 426.1 KiB, free 1915.4 MiB)
14:51:30.198 INFO BlockManagerInfo - Removed broadcast_222_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.2 MiB)
14:51:30.199 INFO MemoryStore - Block broadcast_235_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.6 MiB)
14:51:30.199 INFO BlockManagerInfo - Added broadcast_235_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.1 MiB)
14:51:30.199 INFO BlockManagerInfo - Removed broadcast_234_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.1 MiB)
14:51:30.199 INFO SparkContext - Created broadcast 235 from broadcast at DAGScheduler.scala:1580
14:51:30.200 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 127 (MapPartitionsRDD[561] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:30.200 INFO TaskSchedulerImpl - Adding task set 127.0 with 1 tasks resource profile 0
14:51:30.200 INFO BlockManagerInfo - Removed broadcast_231_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.2 MiB)
14:51:30.200 INFO TaskSetManager - Starting task 0.0 in stage 127.0 (TID 183) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:30.200 INFO BlockManagerInfo - Removed broadcast_223_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.3 MiB)
14:51:30.201 INFO Executor - Running task 0.0 in stage 127.0 (TID 183)
14:51:30.202 INFO BlockManagerInfo - Removed broadcast_215_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:30.203 INFO BlockManagerInfo - Removed broadcast_230_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.5 MiB)
14:51:30.203 INFO BlockManagerInfo - Removed broadcast_227_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:30.204 INFO BlockManagerInfo - Removed broadcast_229_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:30.205 INFO BlockManagerInfo - Removed broadcast_225_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.6 MiB)
14:51:30.205 INFO BlockManagerInfo - Removed broadcast_221_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1919.6 MiB)
14:51:30.206 INFO BlockManagerInfo - Removed broadcast_228_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.6 MiB)
14:51:30.206 INFO BlockManagerInfo - Removed broadcast_224_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:30.242 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:30.253 INFO Executor - Finished task 0.0 in stage 127.0 (TID 183). 989 bytes result sent to driver
14:51:30.253 INFO TaskSetManager - Finished task 0.0 in stage 127.0 (TID 183) in 53 ms on localhost (executor driver) (1/1)
14:51:30.253 INFO TaskSchedulerImpl - Removed TaskSet 127.0, whose tasks have all completed, from pool
14:51:30.254 INFO DAGScheduler - ResultStage 127 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.080 s
14:51:30.254 INFO DAGScheduler - Job 89 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:30.254 INFO TaskSchedulerImpl - Killing all running tasks in stage 127: Stage finished
14:51:30.254 INFO DAGScheduler - Job 89 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.080939 s
14:51:30.257 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:30.258 INFO DAGScheduler - Got job 90 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:30.258 INFO DAGScheduler - Final stage: ResultStage 128 (count at ReadsSparkSinkUnitTest.java:185)
14:51:30.258 INFO DAGScheduler - Parents of final stage: List()
14:51:30.258 INFO DAGScheduler - Missing parents: List()
14:51:30.258 INFO DAGScheduler - Submitting ResultStage 128 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:30.265 INFO MemoryStore - Block broadcast_236 stored as values in memory (estimated size 148.1 KiB, free 1918.6 MiB)
14:51:30.266 INFO MemoryStore - Block broadcast_236_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.5 MiB)
14:51:30.266 INFO BlockManagerInfo - Added broadcast_236_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.7 MiB)
14:51:30.266 INFO SparkContext - Created broadcast 236 from broadcast at DAGScheduler.scala:1580
14:51:30.266 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 128 (MapPartitionsRDD[579] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:30.267 INFO TaskSchedulerImpl - Adding task set 128.0 with 1 tasks resource profile 0
14:51:30.267 INFO TaskSetManager - Starting task 0.0 in stage 128.0 (TID 184) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:30.268 INFO Executor - Running task 0.0 in stage 128.0 (TID 184)
14:51:30.281 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest17101028114322109348.bam:0+237038
14:51:30.285 INFO Executor - Finished task 0.0 in stage 128.0 (TID 184). 989 bytes result sent to driver
14:51:30.285 INFO TaskSetManager - Finished task 0.0 in stage 128.0 (TID 184) in 18 ms on localhost (executor driver) (1/1)
14:51:30.286 INFO TaskSchedulerImpl - Removed TaskSet 128.0, whose tasks have all completed, from pool
14:51:30.286 INFO DAGScheduler - ResultStage 128 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.028 s
14:51:30.286 INFO DAGScheduler - Job 90 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:30.286 INFO TaskSchedulerImpl - Killing all running tasks in stage 128: Stage finished
14:51:30.286 INFO DAGScheduler - Job 90 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.028480 s
14:51:30.289 INFO MemoryStore - Block broadcast_237 stored as values in memory (estimated size 297.9 KiB, free 1918.2 MiB)
14:51:30.296 INFO MemoryStore - Block broadcast_237_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
14:51:30.296 INFO BlockManagerInfo - Added broadcast_237_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:30.296 INFO SparkContext - Created broadcast 237 from newAPIHadoopFile at PathSplitSource.java:96
14:51:30.320 INFO MemoryStore - Block broadcast_238 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
14:51:30.327 INFO MemoryStore - Block broadcast_238_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
14:51:30.327 INFO BlockManagerInfo - Added broadcast_238_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:30.327 INFO SparkContext - Created broadcast 238 from newAPIHadoopFile at PathSplitSource.java:96
14:51:30.349 INFO FileInputFormat - Total input files to process : 1
14:51:30.351 INFO MemoryStore - Block broadcast_239 stored as values in memory (estimated size 160.7 KiB, free 1917.7 MiB)
14:51:30.352 INFO MemoryStore - Block broadcast_239_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
14:51:30.352 INFO BlockManagerInfo - Added broadcast_239_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:30.352 INFO SparkContext - Created broadcast 239 from broadcast at ReadsSparkSink.java:133
14:51:30.353 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:30.354 INFO MemoryStore - Block broadcast_240 stored as values in memory (estimated size 163.2 KiB, free 1917.5 MiB)
14:51:30.354 INFO MemoryStore - Block broadcast_240_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.5 MiB)
14:51:30.355 INFO BlockManagerInfo - Added broadcast_240_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:30.355 INFO SparkContext - Created broadcast 240 from broadcast at BamSink.java:76
14:51:30.357 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:30.357 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:30.357 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:30.377 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:30.377 INFO DAGScheduler - Registering RDD 593 (mapToPair at SparkUtils.java:161) as input to shuffle 27
14:51:30.377 INFO DAGScheduler - Got job 91 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:30.377 INFO DAGScheduler - Final stage: ResultStage 130 (runJob at SparkHadoopWriter.scala:83)
14:51:30.378 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 129)
14:51:30.378 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 129)
14:51:30.378 INFO DAGScheduler - Submitting ShuffleMapStage 129 (MapPartitionsRDD[593] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:30.395 INFO MemoryStore - Block broadcast_241 stored as values in memory (estimated size 520.4 KiB, free 1917.0 MiB)
14:51:30.397 INFO MemoryStore - Block broadcast_241_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.8 MiB)
14:51:30.397 INFO BlockManagerInfo - Added broadcast_241_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.4 MiB)
14:51:30.397 INFO SparkContext - Created broadcast 241 from broadcast at DAGScheduler.scala:1580
14:51:30.397 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 129 (MapPartitionsRDD[593] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:30.397 INFO TaskSchedulerImpl - Adding task set 129.0 with 1 tasks resource profile 0
14:51:30.398 INFO TaskSetManager - Starting task 0.0 in stage 129.0 (TID 185) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:30.398 INFO Executor - Running task 0.0 in stage 129.0 (TID 185)
14:51:30.432 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:30.448 INFO Executor - Finished task 0.0 in stage 129.0 (TID 185). 1148 bytes result sent to driver
14:51:30.449 INFO TaskSetManager - Finished task 0.0 in stage 129.0 (TID 185) in 51 ms on localhost (executor driver) (1/1)
14:51:30.449 INFO TaskSchedulerImpl - Removed TaskSet 129.0, whose tasks have all completed, from pool
14:51:30.449 INFO DAGScheduler - ShuffleMapStage 129 (mapToPair at SparkUtils.java:161) finished in 0.071 s
14:51:30.449 INFO DAGScheduler - looking for newly runnable stages
14:51:30.449 INFO DAGScheduler - running: HashSet()
14:51:30.449 INFO DAGScheduler - waiting: HashSet(ResultStage 130)
14:51:30.449 INFO DAGScheduler - failed: HashSet()
14:51:30.450 INFO DAGScheduler - Submitting ResultStage 130 (MapPartitionsRDD[598] at mapToPair at BamSink.java:91), which has no missing parents
14:51:30.457 INFO MemoryStore - Block broadcast_242 stored as values in memory (estimated size 241.4 KiB, free 1916.6 MiB)
14:51:30.458 INFO MemoryStore - Block broadcast_242_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.5 MiB)
14:51:30.458 INFO BlockManagerInfo - Added broadcast_242_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.3 MiB)
14:51:30.458 INFO SparkContext - Created broadcast 242 from broadcast at DAGScheduler.scala:1580
14:51:30.458 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 130 (MapPartitionsRDD[598] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:30.458 INFO TaskSchedulerImpl - Adding task set 130.0 with 1 tasks resource profile 0
14:51:30.459 INFO TaskSetManager - Starting task 0.0 in stage 130.0 (TID 186) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:30.459 INFO Executor - Running task 0.0 in stage 130.0 (TID 186)
14:51:30.464 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:30.464 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:30.476 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:30.476 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:30.476 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:30.476 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:30.476 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:30.476 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:30.499 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451303737399252585316198_0598_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest16425437879294739236.bam.parts/_temporary/0/task_202603041451303737399252585316198_0598_r_000000
14:51:30.500 INFO SparkHadoopMapRedUtil - attempt_202603041451303737399252585316198_0598_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:30.500 INFO Executor - Finished task 0.0 in stage 130.0 (TID 186). 1858 bytes result sent to driver
14:51:30.501 INFO TaskSetManager - Finished task 0.0 in stage 130.0 (TID 186) in 42 ms on localhost (executor driver) (1/1)
14:51:30.501 INFO TaskSchedulerImpl - Removed TaskSet 130.0, whose tasks have all completed, from pool
14:51:30.501 INFO DAGScheduler - ResultStage 130 (runJob at SparkHadoopWriter.scala:83) finished in 0.051 s
14:51:30.501 INFO DAGScheduler - Job 91 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:30.501 INFO TaskSchedulerImpl - Killing all running tasks in stage 130: Stage finished
14:51:30.501 INFO DAGScheduler - Job 91 finished: runJob at SparkHadoopWriter.scala:83, took 0.124229 s
14:51:30.501 INFO SparkHadoopWriter - Start to commit write Job job_202603041451303737399252585316198_0598.
14:51:30.507 INFO SparkHadoopWriter - Write Job job_202603041451303737399252585316198_0598 committed. Elapsed time: 5 ms.
14:51:30.521 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest16425437879294739236.bam
14:51:30.526 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest16425437879294739236.bam done
14:51:30.526 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest16425437879294739236.bam.parts/ to /tmp/ReadsSparkSinkUnitTest16425437879294739236.bam.bai
14:51:30.533 INFO IndexFileMerger - Done merging .bai files
14:51:30.536 INFO MemoryStore - Block broadcast_243 stored as values in memory (estimated size 297.9 KiB, free 1916.3 MiB)
14:51:30.542 INFO MemoryStore - Block broadcast_243_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.2 MiB)
14:51:30.542 INFO BlockManagerInfo - Added broadcast_243_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:30.543 INFO SparkContext - Created broadcast 243 from newAPIHadoopFile at PathSplitSource.java:96
14:51:30.564 INFO FileInputFormat - Total input files to process : 1
14:51:30.601 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:30.601 INFO DAGScheduler - Got job 92 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:30.601 INFO DAGScheduler - Final stage: ResultStage 131 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:30.601 INFO DAGScheduler - Parents of final stage: List()
14:51:30.601 INFO DAGScheduler - Missing parents: List()
14:51:30.602 INFO DAGScheduler - Submitting ResultStage 131 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:30.618 INFO MemoryStore - Block broadcast_244 stored as values in memory (estimated size 426.2 KiB, free 1915.8 MiB)
14:51:30.620 INFO MemoryStore - Block broadcast_244_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1915.6 MiB)
14:51:30.620 INFO BlockManagerInfo - Added broadcast_244_piece0 in memory on localhost:44923 (size: 153.7 KiB, free: 1919.1 MiB)
14:51:30.620 INFO SparkContext - Created broadcast 244 from broadcast at DAGScheduler.scala:1580
14:51:30.620 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 131 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:30.620 INFO TaskSchedulerImpl - Adding task set 131.0 with 1 tasks resource profile 0
14:51:30.621 INFO TaskSetManager - Starting task 0.0 in stage 131.0 (TID 187) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:30.621 INFO Executor - Running task 0.0 in stage 131.0 (TID 187)
14:51:30.661 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest16425437879294739236.bam:0+237038
14:51:30.674 INFO Executor - Finished task 0.0 in stage 131.0 (TID 187). 651526 bytes result sent to driver
14:51:30.677 INFO TaskSetManager - Finished task 0.0 in stage 131.0 (TID 187) in 56 ms on localhost (executor driver) (1/1)
14:51:30.678 INFO TaskSchedulerImpl - Removed TaskSet 131.0, whose tasks have all completed, from pool
14:51:30.678 INFO DAGScheduler - ResultStage 131 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.076 s
14:51:30.678 INFO DAGScheduler - Job 92 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:30.678 INFO TaskSchedulerImpl - Killing all running tasks in stage 131: Stage finished
14:51:30.678 INFO DAGScheduler - Job 92 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.077060 s
14:51:30.688 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:30.688 INFO DAGScheduler - Got job 93 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:30.688 INFO DAGScheduler - Final stage: ResultStage 132 (count at ReadsSparkSinkUnitTest.java:185)
14:51:30.688 INFO DAGScheduler - Parents of final stage: List()
14:51:30.688 INFO DAGScheduler - Missing parents: List()
14:51:30.688 INFO DAGScheduler - Submitting ResultStage 132 (MapPartitionsRDD[586] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:30.706 INFO MemoryStore - Block broadcast_245 stored as values in memory (estimated size 426.1 KiB, free 1915.2 MiB)
14:51:30.707 INFO MemoryStore - Block broadcast_245_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.1 MiB)
14:51:30.707 INFO BlockManagerInfo - Added broadcast_245_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.0 MiB)
14:51:30.707 INFO SparkContext - Created broadcast 245 from broadcast at DAGScheduler.scala:1580
14:51:30.708 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 132 (MapPartitionsRDD[586] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:30.708 INFO TaskSchedulerImpl - Adding task set 132.0 with 1 tasks resource profile 0
14:51:30.708 INFO TaskSetManager - Starting task 0.0 in stage 132.0 (TID 188) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:30.709 INFO Executor - Running task 0.0 in stage 132.0 (TID 188)
14:51:30.743 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:30.753 INFO Executor - Finished task 0.0 in stage 132.0 (TID 188). 989 bytes result sent to driver
14:51:30.753 INFO TaskSetManager - Finished task 0.0 in stage 132.0 (TID 188) in 45 ms on localhost (executor driver) (1/1)
14:51:30.754 INFO TaskSchedulerImpl - Removed TaskSet 132.0, whose tasks have all completed, from pool
14:51:30.754 INFO DAGScheduler - ResultStage 132 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
14:51:30.754 INFO DAGScheduler - Job 93 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:30.754 INFO TaskSchedulerImpl - Killing all running tasks in stage 132: Stage finished
14:51:30.754 INFO DAGScheduler - Job 93 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.066202 s
14:51:30.758 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:30.759 INFO DAGScheduler - Got job 94 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:30.759 INFO DAGScheduler - Final stage: ResultStage 133 (count at ReadsSparkSinkUnitTest.java:185)
14:51:30.759 INFO DAGScheduler - Parents of final stage: List()
14:51:30.759 INFO DAGScheduler - Missing parents: List()
14:51:30.759 INFO DAGScheduler - Submitting ResultStage 133 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:30.785 INFO MemoryStore - Block broadcast_246 stored as values in memory (estimated size 426.1 KiB, free 1914.7 MiB)
14:51:30.787 INFO MemoryStore - Block broadcast_246_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1914.5 MiB)
14:51:30.787 INFO BlockManagerInfo - Added broadcast_246_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1918.8 MiB)
14:51:30.787 INFO SparkContext - Created broadcast 246 from broadcast at DAGScheduler.scala:1580
14:51:30.788 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 133 (MapPartitionsRDD[605] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:30.788 INFO TaskSchedulerImpl - Adding task set 133.0 with 1 tasks resource profile 0
14:51:30.788 INFO TaskSetManager - Starting task 0.0 in stage 133.0 (TID 189) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:30.789 INFO Executor - Running task 0.0 in stage 133.0 (TID 189)
14:51:30.818 INFO BlockManagerInfo - Removed broadcast_238_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1918.9 MiB)
14:51:30.819 INFO BlockManagerInfo - Removed broadcast_233_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1918.9 MiB)
14:51:30.820 INFO BlockManagerInfo - Removed broadcast_236_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.0 MiB)
14:51:30.821 INFO BlockManagerInfo - Removed broadcast_244_piece0 on localhost:44923 in memory (size: 153.7 KiB, free: 1919.1 MiB)
14:51:30.822 INFO BlockManagerInfo - Removed broadcast_241_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.3 MiB)
14:51:30.822 INFO BlockManagerInfo - Removed broadcast_226_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:30.823 INFO BlockManagerInfo - Removed broadcast_239_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.4 MiB)
14:51:30.824 INFO BlockManagerInfo - Removed broadcast_235_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.5 MiB)
14:51:30.824 INFO BlockManagerInfo - Removed broadcast_242_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.6 MiB)
14:51:30.825 INFO BlockManagerInfo - Removed broadcast_240_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.6 MiB)
14:51:30.825 INFO BlockManagerInfo - Removed broadcast_232_piece0 on localhost:44923 in memory (size: 8.3 KiB, free: 1919.6 MiB)
14:51:30.826 INFO BlockManagerInfo - Removed broadcast_245_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.8 MiB)
14:51:30.833 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest16425437879294739236.bam:0+237038
14:51:30.845 INFO Executor - Finished task 0.0 in stage 133.0 (TID 189). 1032 bytes result sent to driver
14:51:30.846 INFO TaskSetManager - Finished task 0.0 in stage 133.0 (TID 189) in 58 ms on localhost (executor driver) (1/1)
14:51:30.846 INFO TaskSchedulerImpl - Removed TaskSet 133.0, whose tasks have all completed, from pool
14:51:30.846 INFO DAGScheduler - ResultStage 133 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.087 s
14:51:30.846 INFO DAGScheduler - Job 94 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:30.846 INFO TaskSchedulerImpl - Killing all running tasks in stage 133: Stage finished
14:51:30.846 INFO DAGScheduler - Job 94 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.087725 s
14:51:30.850 INFO MemoryStore - Block broadcast_247 stored as values in memory (estimated size 297.9 KiB, free 1918.5 MiB)
14:51:30.860 INFO MemoryStore - Block broadcast_247_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
14:51:30.860 INFO BlockManagerInfo - Added broadcast_247_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:30.861 INFO SparkContext - Created broadcast 247 from newAPIHadoopFile at PathSplitSource.java:96
14:51:30.890 INFO MemoryStore - Block broadcast_248 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:30.897 INFO MemoryStore - Block broadcast_248_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.1 MiB)
14:51:30.897 INFO BlockManagerInfo - Added broadcast_248_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:30.897 INFO SparkContext - Created broadcast 248 from newAPIHadoopFile at PathSplitSource.java:96
14:51:30.919 INFO FileInputFormat - Total input files to process : 1
14:51:30.921 INFO MemoryStore - Block broadcast_249 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
14:51:30.922 INFO MemoryStore - Block broadcast_249_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
14:51:30.922 INFO BlockManagerInfo - Added broadcast_249_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:30.922 INFO SparkContext - Created broadcast 249 from broadcast at ReadsSparkSink.java:133
14:51:30.923 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:30.924 INFO MemoryStore - Block broadcast_250 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
14:51:30.924 INFO MemoryStore - Block broadcast_250_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
14:51:30.925 INFO BlockManagerInfo - Added broadcast_250_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:30.925 INFO SparkContext - Created broadcast 250 from broadcast at BamSink.java:76
14:51:30.927 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:30.927 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:30.927 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:30.948 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:30.949 INFO DAGScheduler - Registering RDD 619 (mapToPair at SparkUtils.java:161) as input to shuffle 28
14:51:30.949 INFO DAGScheduler - Got job 95 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:30.949 INFO DAGScheduler - Final stage: ResultStage 135 (runJob at SparkHadoopWriter.scala:83)
14:51:30.949 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 134)
14:51:30.950 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 134)
14:51:30.950 INFO DAGScheduler - Submitting ShuffleMapStage 134 (MapPartitionsRDD[619] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:30.970 INFO MemoryStore - Block broadcast_251 stored as values in memory (estimated size 520.4 KiB, free 1917.2 MiB)
14:51:30.972 INFO MemoryStore - Block broadcast_251_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.1 MiB)
14:51:30.972 INFO BlockManagerInfo - Added broadcast_251_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.5 MiB)
14:51:30.972 INFO SparkContext - Created broadcast 251 from broadcast at DAGScheduler.scala:1580
14:51:30.973 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 134 (MapPartitionsRDD[619] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:30.973 INFO TaskSchedulerImpl - Adding task set 134.0 with 1 tasks resource profile 0
14:51:30.973 INFO TaskSetManager - Starting task 0.0 in stage 134.0 (TID 190) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:30.974 INFO Executor - Running task 0.0 in stage 134.0 (TID 190)
14:51:31.006 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:31.025 INFO Executor - Finished task 0.0 in stage 134.0 (TID 190). 1148 bytes result sent to driver
14:51:31.026 INFO TaskSetManager - Finished task 0.0 in stage 134.0 (TID 190) in 53 ms on localhost (executor driver) (1/1)
14:51:31.026 INFO TaskSchedulerImpl - Removed TaskSet 134.0, whose tasks have all completed, from pool
14:51:31.026 INFO DAGScheduler - ShuffleMapStage 134 (mapToPair at SparkUtils.java:161) finished in 0.076 s
14:51:31.026 INFO DAGScheduler - looking for newly runnable stages
14:51:31.026 INFO DAGScheduler - running: HashSet()
14:51:31.026 INFO DAGScheduler - waiting: HashSet(ResultStage 135)
14:51:31.026 INFO DAGScheduler - failed: HashSet()
14:51:31.026 INFO DAGScheduler - Submitting ResultStage 135 (MapPartitionsRDD[624] at mapToPair at BamSink.java:91), which has no missing parents
14:51:31.035 INFO MemoryStore - Block broadcast_252 stored as values in memory (estimated size 241.4 KiB, free 1916.8 MiB)
14:51:31.036 INFO MemoryStore - Block broadcast_252_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.8 MiB)
14:51:31.036 INFO BlockManagerInfo - Added broadcast_252_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.4 MiB)
14:51:31.036 INFO SparkContext - Created broadcast 252 from broadcast at DAGScheduler.scala:1580
14:51:31.036 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 135 (MapPartitionsRDD[624] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:31.036 INFO TaskSchedulerImpl - Adding task set 135.0 with 1 tasks resource profile 0
14:51:31.037 INFO TaskSetManager - Starting task 0.0 in stage 135.0 (TID 191) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:31.037 INFO Executor - Running task 0.0 in stage 135.0 (TID 191)
14:51:31.042 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:31.042 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:31.054 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:31.054 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:31.054 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:31.054 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:31.054 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:31.054 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:31.078 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451307854179409748239521_0624_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest13155466689265805089.bam.parts/_temporary/0/task_202603041451307854179409748239521_0624_r_000000
14:51:31.078 INFO SparkHadoopMapRedUtil - attempt_202603041451307854179409748239521_0624_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:31.079 INFO Executor - Finished task 0.0 in stage 135.0 (TID 191). 1858 bytes result sent to driver
14:51:31.080 INFO TaskSetManager - Finished task 0.0 in stage 135.0 (TID 191) in 43 ms on localhost (executor driver) (1/1)
14:51:31.080 INFO TaskSchedulerImpl - Removed TaskSet 135.0, whose tasks have all completed, from pool
14:51:31.080 INFO DAGScheduler - ResultStage 135 (runJob at SparkHadoopWriter.scala:83) finished in 0.053 s
14:51:31.080 INFO DAGScheduler - Job 95 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:31.080 INFO TaskSchedulerImpl - Killing all running tasks in stage 135: Stage finished
14:51:31.080 INFO DAGScheduler - Job 95 finished: runJob at SparkHadoopWriter.scala:83, took 0.131729 s
14:51:31.081 INFO SparkHadoopWriter - Start to commit write Job job_202603041451307854179409748239521_0624.
14:51:31.086 INFO SparkHadoopWriter - Write Job job_202603041451307854179409748239521_0624 committed. Elapsed time: 5 ms.
14:51:31.103 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest13155466689265805089.bam
14:51:31.108 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest13155466689265805089.bam done
14:51:31.108 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest13155466689265805089.bam.parts/ to /tmp/ReadsSparkSinkUnitTest13155466689265805089.bam.sbi
14:51:31.114 INFO IndexFileMerger - Done merging .sbi files
14:51:31.116 INFO MemoryStore - Block broadcast_253 stored as values in memory (estimated size 320.0 B, free 1916.8 MiB)
14:51:31.116 INFO MemoryStore - Block broadcast_253_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.8 MiB)
14:51:31.116 INFO BlockManagerInfo - Added broadcast_253_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.4 MiB)
14:51:31.117 INFO SparkContext - Created broadcast 253 from broadcast at BamSource.java:104
14:51:31.118 INFO MemoryStore - Block broadcast_254 stored as values in memory (estimated size 297.9 KiB, free 1916.5 MiB)
14:51:31.125 INFO MemoryStore - Block broadcast_254_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:31.125 INFO BlockManagerInfo - Added broadcast_254_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:31.126 INFO SparkContext - Created broadcast 254 from newAPIHadoopFile at PathSplitSource.java:96
14:51:31.135 INFO FileInputFormat - Total input files to process : 1
14:51:31.150 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:31.150 INFO DAGScheduler - Got job 96 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:31.150 INFO DAGScheduler - Final stage: ResultStage 136 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:31.150 INFO DAGScheduler - Parents of final stage: List()
14:51:31.150 INFO DAGScheduler - Missing parents: List()
14:51:31.150 INFO DAGScheduler - Submitting ResultStage 136 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:31.156 INFO MemoryStore - Block broadcast_255 stored as values in memory (estimated size 148.2 KiB, free 1916.3 MiB)
14:51:31.157 INFO MemoryStore - Block broadcast_255_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1916.2 MiB)
14:51:31.157 INFO BlockManagerInfo - Added broadcast_255_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.3 MiB)
14:51:31.158 INFO SparkContext - Created broadcast 255 from broadcast at DAGScheduler.scala:1580
14:51:31.158 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 136 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:31.158 INFO TaskSchedulerImpl - Adding task set 136.0 with 1 tasks resource profile 0
14:51:31.159 INFO TaskSetManager - Starting task 0.0 in stage 136.0 (TID 192) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:31.159 INFO Executor - Running task 0.0 in stage 136.0 (TID 192)
14:51:31.172 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest13155466689265805089.bam:0+237038
14:51:31.177 INFO Executor - Finished task 0.0 in stage 136.0 (TID 192). 651483 bytes result sent to driver
14:51:31.178 INFO TaskSetManager - Finished task 0.0 in stage 136.0 (TID 192) in 19 ms on localhost (executor driver) (1/1)
14:51:31.179 INFO TaskSchedulerImpl - Removed TaskSet 136.0, whose tasks have all completed, from pool
14:51:31.179 INFO DAGScheduler - ResultStage 136 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.029 s
14:51:31.179 INFO DAGScheduler - Job 96 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:31.179 INFO TaskSchedulerImpl - Killing all running tasks in stage 136: Stage finished
14:51:31.179 INFO DAGScheduler - Job 96 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.029372 s
14:51:31.189 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:31.189 INFO DAGScheduler - Got job 97 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:31.189 INFO DAGScheduler - Final stage: ResultStage 137 (count at ReadsSparkSinkUnitTest.java:185)
14:51:31.189 INFO DAGScheduler - Parents of final stage: List()
14:51:31.189 INFO DAGScheduler - Missing parents: List()
14:51:31.189 INFO DAGScheduler - Submitting ResultStage 137 (MapPartitionsRDD[612] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:31.206 INFO MemoryStore - Block broadcast_256 stored as values in memory (estimated size 426.1 KiB, free 1915.8 MiB)
14:51:31.208 INFO MemoryStore - Block broadcast_256_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
14:51:31.208 INFO BlockManagerInfo - Added broadcast_256_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.2 MiB)
14:51:31.208 INFO SparkContext - Created broadcast 256 from broadcast at DAGScheduler.scala:1580
14:51:31.208 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 137 (MapPartitionsRDD[612] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:31.208 INFO TaskSchedulerImpl - Adding task set 137.0 with 1 tasks resource profile 0
14:51:31.209 INFO TaskSetManager - Starting task 0.0 in stage 137.0 (TID 193) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:31.209 INFO Executor - Running task 0.0 in stage 137.0 (TID 193)
14:51:31.248 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:31.258 INFO Executor - Finished task 0.0 in stage 137.0 (TID 193). 989 bytes result sent to driver
14:51:31.259 INFO TaskSetManager - Finished task 0.0 in stage 137.0 (TID 193) in 50 ms on localhost (executor driver) (1/1)
14:51:31.259 INFO TaskSchedulerImpl - Removed TaskSet 137.0, whose tasks have all completed, from pool
14:51:31.259 INFO DAGScheduler - ResultStage 137 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.069 s
14:51:31.259 INFO DAGScheduler - Job 97 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:31.259 INFO TaskSchedulerImpl - Killing all running tasks in stage 137: Stage finished
14:51:31.259 INFO DAGScheduler - Job 97 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.070740 s
14:51:31.263 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:31.263 INFO DAGScheduler - Got job 98 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:31.263 INFO DAGScheduler - Final stage: ResultStage 138 (count at ReadsSparkSinkUnitTest.java:185)
14:51:31.263 INFO DAGScheduler - Parents of final stage: List()
14:51:31.263 INFO DAGScheduler - Missing parents: List()
14:51:31.263 INFO DAGScheduler - Submitting ResultStage 138 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:31.274 INFO MemoryStore - Block broadcast_257 stored as values in memory (estimated size 148.1 KiB, free 1915.5 MiB)
14:51:31.275 INFO MemoryStore - Block broadcast_257_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1915.5 MiB)
14:51:31.275 INFO BlockManagerInfo - Added broadcast_257_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.1 MiB)
14:51:31.275 INFO SparkContext - Created broadcast 257 from broadcast at DAGScheduler.scala:1580
14:51:31.275 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 138 (MapPartitionsRDD[630] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:31.275 INFO TaskSchedulerImpl - Adding task set 138.0 with 1 tasks resource profile 0
14:51:31.276 INFO TaskSetManager - Starting task 0.0 in stage 138.0 (TID 194) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:31.277 INFO Executor - Running task 0.0 in stage 138.0 (TID 194)
14:51:31.290 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest13155466689265805089.bam:0+237038
14:51:31.294 INFO Executor - Finished task 0.0 in stage 138.0 (TID 194). 989 bytes result sent to driver
14:51:31.294 INFO TaskSetManager - Finished task 0.0 in stage 138.0 (TID 194) in 18 ms on localhost (executor driver) (1/1)
14:51:31.295 INFO TaskSchedulerImpl - Removed TaskSet 138.0, whose tasks have all completed, from pool
14:51:31.295 INFO DAGScheduler - ResultStage 138 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.031 s
14:51:31.295 INFO DAGScheduler - Job 98 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:31.295 INFO TaskSchedulerImpl - Killing all running tasks in stage 138: Stage finished
14:51:31.295 INFO DAGScheduler - Job 98 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.031939 s
14:51:31.298 INFO MemoryStore - Block broadcast_258 stored as values in memory (estimated size 297.9 KiB, free 1915.2 MiB)
14:51:31.304 INFO MemoryStore - Block broadcast_258_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.1 MiB)
14:51:31.305 INFO BlockManagerInfo - Added broadcast_258_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.1 MiB)
14:51:31.305 INFO SparkContext - Created broadcast 258 from newAPIHadoopFile at PathSplitSource.java:96
14:51:31.328 INFO MemoryStore - Block broadcast_259 stored as values in memory (estimated size 297.9 KiB, free 1914.8 MiB)
14:51:31.334 INFO MemoryStore - Block broadcast_259_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1914.8 MiB)
14:51:31.334 INFO BlockManagerInfo - Added broadcast_259_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.0 MiB)
14:51:31.335 INFO SparkContext - Created broadcast 259 from newAPIHadoopFile at PathSplitSource.java:96
14:51:31.355 INFO FileInputFormat - Total input files to process : 1
14:51:31.357 INFO MemoryStore - Block broadcast_260 stored as values in memory (estimated size 160.7 KiB, free 1914.6 MiB)
14:51:31.363 INFO MemoryStore - Block broadcast_260_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1914.9 MiB)
14:51:31.363 INFO BlockManagerInfo - Removed broadcast_252_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.1 MiB)
14:51:31.363 INFO BlockManagerInfo - Added broadcast_260_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.1 MiB)
14:51:31.363 INFO SparkContext - Created broadcast 260 from broadcast at ReadsSparkSink.java:133
14:51:31.363 INFO BlockManagerInfo - Removed broadcast_250_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.1 MiB)
14:51:31.364 INFO BlockManagerInfo - Removed broadcast_246_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.2 MiB)
14:51:31.364 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:31.364 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:31.364 INFO BlockManagerInfo - Removed broadcast_237_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:31.365 INFO BlockManagerInfo - Removed broadcast_256_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.4 MiB)
14:51:31.365 INFO MemoryStore - Block broadcast_261 stored as values in memory (estimated size 163.2 KiB, free 1916.4 MiB)
14:51:31.366 INFO BlockManagerInfo - Removed broadcast_254_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:31.366 INFO BlockManagerInfo - Removed broadcast_255_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.5 MiB)
14:51:31.366 INFO MemoryStore - Block broadcast_261_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.9 MiB)
14:51:31.366 INFO BlockManagerInfo - Added broadcast_261_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.5 MiB)
14:51:31.367 INFO SparkContext - Created broadcast 261 from broadcast at BamSink.java:76
14:51:31.367 INFO BlockManagerInfo - Removed broadcast_259_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:31.367 INFO BlockManagerInfo - Removed broadcast_249_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.6 MiB)
14:51:31.368 INFO BlockManagerInfo - Removed broadcast_243_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:31.368 INFO BlockManagerInfo - Removed broadcast_248_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:31.369 INFO BlockManagerInfo - Removed broadcast_251_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.8 MiB)
14:51:31.369 INFO BlockManagerInfo - Removed broadcast_257_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.9 MiB)
14:51:31.369 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:31.370 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:31.370 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:31.370 INFO BlockManagerInfo - Removed broadcast_253_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1919.9 MiB)
14:51:31.372 INFO BlockManagerInfo - Removed broadcast_247_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.9 MiB)
14:51:31.391 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:31.392 INFO DAGScheduler - Registering RDD 644 (mapToPair at SparkUtils.java:161) as input to shuffle 29
14:51:31.392 INFO DAGScheduler - Got job 99 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:31.392 INFO DAGScheduler - Final stage: ResultStage 140 (runJob at SparkHadoopWriter.scala:83)
14:51:31.392 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 139)
14:51:31.392 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 139)
14:51:31.393 INFO DAGScheduler - Submitting ShuffleMapStage 139 (MapPartitionsRDD[644] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:31.410 INFO MemoryStore - Block broadcast_262 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
14:51:31.412 INFO MemoryStore - Block broadcast_262_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
14:51:31.412 INFO BlockManagerInfo - Added broadcast_262_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.8 MiB)
14:51:31.412 INFO SparkContext - Created broadcast 262 from broadcast at DAGScheduler.scala:1580
14:51:31.412 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 139 (MapPartitionsRDD[644] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:31.412 INFO TaskSchedulerImpl - Adding task set 139.0 with 1 tasks resource profile 0
14:51:31.413 INFO TaskSetManager - Starting task 0.0 in stage 139.0 (TID 195) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:31.413 INFO Executor - Running task 0.0 in stage 139.0 (TID 195)
14:51:31.449 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:31.466 INFO Executor - Finished task 0.0 in stage 139.0 (TID 195). 1148 bytes result sent to driver
14:51:31.467 INFO TaskSetManager - Finished task 0.0 in stage 139.0 (TID 195) in 54 ms on localhost (executor driver) (1/1)
14:51:31.467 INFO TaskSchedulerImpl - Removed TaskSet 139.0, whose tasks have all completed, from pool
14:51:31.467 INFO DAGScheduler - ShuffleMapStage 139 (mapToPair at SparkUtils.java:161) finished in 0.074 s
14:51:31.467 INFO DAGScheduler - looking for newly runnable stages
14:51:31.467 INFO DAGScheduler - running: HashSet()
14:51:31.467 INFO DAGScheduler - waiting: HashSet(ResultStage 140)
14:51:31.467 INFO DAGScheduler - failed: HashSet()
14:51:31.467 INFO DAGScheduler - Submitting ResultStage 140 (MapPartitionsRDD[649] at mapToPair at BamSink.java:91), which has no missing parents
14:51:31.474 INFO MemoryStore - Block broadcast_263 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
14:51:31.475 INFO MemoryStore - Block broadcast_263_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
14:51:31.476 INFO BlockManagerInfo - Added broadcast_263_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.7 MiB)
14:51:31.476 INFO SparkContext - Created broadcast 263 from broadcast at DAGScheduler.scala:1580
14:51:31.476 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 140 (MapPartitionsRDD[649] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:31.476 INFO TaskSchedulerImpl - Adding task set 140.0 with 1 tasks resource profile 0
14:51:31.477 INFO TaskSetManager - Starting task 0.0 in stage 140.0 (TID 196) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:31.477 INFO Executor - Running task 0.0 in stage 140.0 (TID 196)
14:51:31.482 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:31.482 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:31.494 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:31.494 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:31.494 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:31.494 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:31.494 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:31.494 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:31.511 INFO FileOutputCommitter - Saved output of task 'attempt_20260304145131333695463525078374_0649_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest115549008421436922882.bam.parts/_temporary/0/task_20260304145131333695463525078374_0649_r_000000
14:51:31.511 INFO SparkHadoopMapRedUtil - attempt_20260304145131333695463525078374_0649_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:31.512 INFO Executor - Finished task 0.0 in stage 140.0 (TID 196). 1858 bytes result sent to driver
14:51:31.512 INFO TaskSetManager - Finished task 0.0 in stage 140.0 (TID 196) in 35 ms on localhost (executor driver) (1/1)
14:51:31.512 INFO TaskSchedulerImpl - Removed TaskSet 140.0, whose tasks have all completed, from pool
14:51:31.513 INFO DAGScheduler - ResultStage 140 (runJob at SparkHadoopWriter.scala:83) finished in 0.044 s
14:51:31.513 INFO DAGScheduler - Job 99 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:31.513 INFO TaskSchedulerImpl - Killing all running tasks in stage 140: Stage finished
14:51:31.513 INFO DAGScheduler - Job 99 finished: runJob at SparkHadoopWriter.scala:83, took 0.121345 s
14:51:31.513 INFO SparkHadoopWriter - Start to commit write Job job_20260304145131333695463525078374_0649.
14:51:31.519 INFO SparkHadoopWriter - Write Job job_20260304145131333695463525078374_0649 committed. Elapsed time: 5 ms.
14:51:31.533 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest115549008421436922882.bam
14:51:31.538 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest115549008421436922882.bam done
14:51:31.541 INFO MemoryStore - Block broadcast_264 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:31.547 INFO MemoryStore - Block broadcast_264_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:31.547 INFO BlockManagerInfo - Added broadcast_264_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:31.548 INFO SparkContext - Created broadcast 264 from newAPIHadoopFile at PathSplitSource.java:96
14:51:31.569 INFO FileInputFormat - Total input files to process : 1
14:51:31.605 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:31.605 INFO DAGScheduler - Got job 100 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:31.605 INFO DAGScheduler - Final stage: ResultStage 141 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:31.605 INFO DAGScheduler - Parents of final stage: List()
14:51:31.605 INFO DAGScheduler - Missing parents: List()
14:51:31.605 INFO DAGScheduler - Submitting ResultStage 141 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:31.623 INFO MemoryStore - Block broadcast_265 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
14:51:31.624 INFO MemoryStore - Block broadcast_265_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.4 MiB)
14:51:31.624 INFO BlockManagerInfo - Added broadcast_265_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:31.624 INFO SparkContext - Created broadcast 265 from broadcast at DAGScheduler.scala:1580
14:51:31.625 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 141 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:31.625 INFO TaskSchedulerImpl - Adding task set 141.0 with 1 tasks resource profile 0
14:51:31.625 INFO TaskSetManager - Starting task 0.0 in stage 141.0 (TID 197) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:31.626 INFO Executor - Running task 0.0 in stage 141.0 (TID 197)
14:51:31.660 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115549008421436922882.bam:0+237038
14:51:31.673 INFO Executor - Finished task 0.0 in stage 141.0 (TID 197). 651483 bytes result sent to driver
14:51:31.674 INFO TaskSetManager - Finished task 0.0 in stage 141.0 (TID 197) in 49 ms on localhost (executor driver) (1/1)
14:51:31.674 INFO TaskSchedulerImpl - Removed TaskSet 141.0, whose tasks have all completed, from pool
14:51:31.675 INFO DAGScheduler - ResultStage 141 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.069 s
14:51:31.675 INFO DAGScheduler - Job 100 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:31.675 INFO TaskSchedulerImpl - Killing all running tasks in stage 141: Stage finished
14:51:31.675 INFO DAGScheduler - Job 100 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.070174 s
14:51:31.684 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:31.685 INFO DAGScheduler - Got job 101 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:31.685 INFO DAGScheduler - Final stage: ResultStage 142 (count at ReadsSparkSinkUnitTest.java:185)
14:51:31.685 INFO DAGScheduler - Parents of final stage: List()
14:51:31.685 INFO DAGScheduler - Missing parents: List()
14:51:31.685 INFO DAGScheduler - Submitting ResultStage 142 (MapPartitionsRDD[637] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:31.702 INFO MemoryStore - Block broadcast_266 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
14:51:31.704 INFO MemoryStore - Block broadcast_266_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
14:51:31.704 INFO BlockManagerInfo - Added broadcast_266_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.4 MiB)
14:51:31.704 INFO SparkContext - Created broadcast 266 from broadcast at DAGScheduler.scala:1580
14:51:31.704 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 142 (MapPartitionsRDD[637] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:31.704 INFO TaskSchedulerImpl - Adding task set 142.0 with 1 tasks resource profile 0
14:51:31.705 INFO TaskSetManager - Starting task 0.0 in stage 142.0 (TID 198) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:31.705 INFO Executor - Running task 0.0 in stage 142.0 (TID 198)
14:51:31.740 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:31.750 INFO Executor - Finished task 0.0 in stage 142.0 (TID 198). 989 bytes result sent to driver
14:51:31.751 INFO TaskSetManager - Finished task 0.0 in stage 142.0 (TID 198) in 45 ms on localhost (executor driver) (1/1)
14:51:31.751 INFO TaskSchedulerImpl - Removed TaskSet 142.0, whose tasks have all completed, from pool
14:51:31.751 INFO DAGScheduler - ResultStage 142 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.066 s
14:51:31.751 INFO DAGScheduler - Job 101 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:31.751 INFO TaskSchedulerImpl - Killing all running tasks in stage 142: Stage finished
14:51:31.751 INFO DAGScheduler - Job 101 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.066542 s
14:51:31.755 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:31.755 INFO DAGScheduler - Got job 102 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:31.755 INFO DAGScheduler - Final stage: ResultStage 143 (count at ReadsSparkSinkUnitTest.java:185)
14:51:31.755 INFO DAGScheduler - Parents of final stage: List()
14:51:31.755 INFO DAGScheduler - Missing parents: List()
14:51:31.755 INFO DAGScheduler - Submitting ResultStage 143 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:31.773 INFO MemoryStore - Block broadcast_267 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
14:51:31.774 INFO MemoryStore - Block broadcast_267_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
14:51:31.774 INFO BlockManagerInfo - Added broadcast_267_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.2 MiB)
14:51:31.774 INFO SparkContext - Created broadcast 267 from broadcast at DAGScheduler.scala:1580
14:51:31.775 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 143 (MapPartitionsRDD[656] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:31.775 INFO TaskSchedulerImpl - Adding task set 143.0 with 1 tasks resource profile 0
14:51:31.775 INFO TaskSetManager - Starting task 0.0 in stage 143.0 (TID 199) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:31.776 INFO Executor - Running task 0.0 in stage 143.0 (TID 199)
14:51:31.809 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115549008421436922882.bam:0+237038
14:51:31.820 INFO Executor - Finished task 0.0 in stage 143.0 (TID 199). 989 bytes result sent to driver
14:51:31.821 INFO TaskSetManager - Finished task 0.0 in stage 143.0 (TID 199) in 46 ms on localhost (executor driver) (1/1)
14:51:31.821 INFO TaskSchedulerImpl - Removed TaskSet 143.0, whose tasks have all completed, from pool
14:51:31.821 INFO DAGScheduler - ResultStage 143 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
14:51:31.821 INFO DAGScheduler - Job 102 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:31.821 INFO TaskSchedulerImpl - Killing all running tasks in stage 143: Stage finished
14:51:31.821 INFO DAGScheduler - Job 102 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.066710 s
14:51:31.824 INFO MemoryStore - Block broadcast_268 stored as values in memory (estimated size 298.0 KiB, free 1916.0 MiB)
14:51:31.830 INFO MemoryStore - Block broadcast_268_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.0 MiB)
14:51:31.831 INFO BlockManagerInfo - Added broadcast_268_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.2 MiB)
14:51:31.831 INFO SparkContext - Created broadcast 268 from newAPIHadoopFile at PathSplitSource.java:96
14:51:31.859 INFO MemoryStore - Block broadcast_269 stored as values in memory (estimated size 298.0 KiB, free 1915.7 MiB)
14:51:31.865 INFO MemoryStore - Block broadcast_269_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1915.6 MiB)
14:51:31.865 INFO BlockManagerInfo - Added broadcast_269_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.1 MiB)
14:51:31.865 INFO SparkContext - Created broadcast 269 from newAPIHadoopFile at PathSplitSource.java:96
14:51:31.886 INFO FileInputFormat - Total input files to process : 1
14:51:31.888 INFO MemoryStore - Block broadcast_270 stored as values in memory (estimated size 160.7 KiB, free 1915.5 MiB)
14:51:31.888 INFO MemoryStore - Block broadcast_270_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
14:51:31.889 INFO BlockManagerInfo - Added broadcast_270_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.1 MiB)
14:51:31.889 INFO SparkContext - Created broadcast 270 from broadcast at ReadsSparkSink.java:133
14:51:31.890 INFO MemoryStore - Block broadcast_271 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
14:51:31.891 INFO MemoryStore - Block broadcast_271_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.3 MiB)
14:51:31.891 INFO BlockManagerInfo - Added broadcast_271_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.1 MiB)
14:51:31.891 INFO SparkContext - Created broadcast 271 from broadcast at BamSink.java:76
14:51:31.893 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:31.893 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:31.893 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:31.913 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:31.913 INFO DAGScheduler - Registering RDD 670 (mapToPair at SparkUtils.java:161) as input to shuffle 30
14:51:31.913 INFO DAGScheduler - Got job 103 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:31.913 INFO DAGScheduler - Final stage: ResultStage 145 (runJob at SparkHadoopWriter.scala:83)
14:51:31.913 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 144)
14:51:31.913 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 144)
14:51:31.914 INFO DAGScheduler - Submitting ShuffleMapStage 144 (MapPartitionsRDD[670] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:31.931 INFO MemoryStore - Block broadcast_272 stored as values in memory (estimated size 520.4 KiB, free 1914.8 MiB)
14:51:31.933 INFO MemoryStore - Block broadcast_272_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1914.6 MiB)
14:51:31.933 INFO BlockManagerInfo - Added broadcast_272_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1918.9 MiB)
14:51:31.933 INFO SparkContext - Created broadcast 272 from broadcast at DAGScheduler.scala:1580
14:51:31.934 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 144 (MapPartitionsRDD[670] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:31.934 INFO TaskSchedulerImpl - Adding task set 144.0 with 1 tasks resource profile 0
14:51:31.934 INFO TaskSetManager - Starting task 0.0 in stage 144.0 (TID 200) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
14:51:31.935 INFO Executor - Running task 0.0 in stage 144.0 (TID 200)
14:51:31.971 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
14:51:31.994 INFO Executor - Finished task 0.0 in stage 144.0 (TID 200). 1234 bytes result sent to driver
14:51:31.995 INFO BlockManagerInfo - Removed broadcast_262_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.1 MiB)
14:51:31.995 INFO TaskSetManager - Finished task 0.0 in stage 144.0 (TID 200) in 61 ms on localhost (executor driver) (1/1)
14:51:31.995 INFO TaskSchedulerImpl - Removed TaskSet 144.0, whose tasks have all completed, from pool
14:51:31.995 INFO DAGScheduler - ShuffleMapStage 144 (mapToPair at SparkUtils.java:161) finished in 0.081 s
14:51:31.995 INFO DAGScheduler - looking for newly runnable stages
14:51:31.995 INFO DAGScheduler - running: HashSet()
14:51:31.995 INFO DAGScheduler - waiting: HashSet(ResultStage 145)
14:51:31.995 INFO DAGScheduler - failed: HashSet()
14:51:31.995 INFO BlockManagerInfo - Removed broadcast_263_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.2 MiB)
14:51:31.996 INFO DAGScheduler - Submitting ResultStage 145 (MapPartitionsRDD[675] at mapToPair at BamSink.java:91), which has no missing parents
14:51:31.997 INFO BlockManagerInfo - Removed broadcast_260_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:31.997 INFO BlockManagerInfo - Removed broadcast_258_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.2 MiB)
14:51:31.998 INFO BlockManagerInfo - Removed broadcast_267_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.4 MiB)
14:51:31.998 INFO BlockManagerInfo - Removed broadcast_265_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.5 MiB)
14:51:31.998 INFO BlockManagerInfo - Removed broadcast_266_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:31.999 INFO BlockManagerInfo - Removed broadcast_269_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.7 MiB)
14:51:32.000 INFO BlockManagerInfo - Removed broadcast_261_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.7 MiB)
14:51:32.000 INFO BlockManagerInfo - Removed broadcast_264_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:32.005 INFO MemoryStore - Block broadcast_273 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
14:51:32.006 INFO MemoryStore - Block broadcast_273_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
14:51:32.006 INFO BlockManagerInfo - Added broadcast_273_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.7 MiB)
14:51:32.006 INFO SparkContext - Created broadcast 273 from broadcast at DAGScheduler.scala:1580
14:51:32.007 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 145 (MapPartitionsRDD[675] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:32.007 INFO TaskSchedulerImpl - Adding task set 145.0 with 1 tasks resource profile 0
14:51:32.007 INFO TaskSetManager - Starting task 0.0 in stage 145.0 (TID 201) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:32.008 INFO Executor - Running task 0.0 in stage 145.0 (TID 201)
14:51:32.012 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:32.012 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:32.024 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:32.024 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:32.024 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:32.024 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:32.025 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:32.025 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:32.054 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451315431950473203570548_0675_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest211196066682765412277.bam.parts/_temporary/0/task_202603041451315431950473203570548_0675_r_000000
14:51:32.054 INFO SparkHadoopMapRedUtil - attempt_202603041451315431950473203570548_0675_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:32.054 INFO Executor - Finished task 0.0 in stage 145.0 (TID 201). 1858 bytes result sent to driver
14:51:32.055 INFO TaskSetManager - Finished task 0.0 in stage 145.0 (TID 201) in 48 ms on localhost (executor driver) (1/1)
14:51:32.055 INFO TaskSchedulerImpl - Removed TaskSet 145.0, whose tasks have all completed, from pool
14:51:32.055 INFO DAGScheduler - ResultStage 145 (runJob at SparkHadoopWriter.scala:83) finished in 0.059 s
14:51:32.055 INFO DAGScheduler - Job 103 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:32.055 INFO TaskSchedulerImpl - Killing all running tasks in stage 145: Stage finished
14:51:32.055 INFO DAGScheduler - Job 103 finished: runJob at SparkHadoopWriter.scala:83, took 0.142569 s
14:51:32.055 INFO SparkHadoopWriter - Start to commit write Job job_202603041451315431950473203570548_0675.
14:51:32.061 INFO SparkHadoopWriter - Write Job job_202603041451315431950473203570548_0675 committed. Elapsed time: 5 ms.
14:51:32.077 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest211196066682765412277.bam
14:51:32.083 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest211196066682765412277.bam done
14:51:32.083 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest211196066682765412277.bam.parts/ to /tmp/ReadsSparkSinkUnitTest211196066682765412277.bam.sbi
14:51:32.088 INFO IndexFileMerger - Done merging .sbi files
14:51:32.088 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest211196066682765412277.bam.parts/ to /tmp/ReadsSparkSinkUnitTest211196066682765412277.bam.bai
14:51:32.096 INFO IndexFileMerger - Done merging .bai files
14:51:32.098 INFO MemoryStore - Block broadcast_274 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
14:51:32.098 INFO MemoryStore - Block broadcast_274_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
14:51:32.098 INFO BlockManagerInfo - Added broadcast_274_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.7 MiB)
14:51:32.099 INFO SparkContext - Created broadcast 274 from broadcast at BamSource.java:104
14:51:32.100 INFO MemoryStore - Block broadcast_275 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:32.106 INFO MemoryStore - Block broadcast_275_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:32.106 INFO BlockManagerInfo - Added broadcast_275_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:32.107 INFO SparkContext - Created broadcast 275 from newAPIHadoopFile at PathSplitSource.java:96
14:51:32.116 INFO FileInputFormat - Total input files to process : 1
14:51:32.131 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:32.131 INFO DAGScheduler - Got job 104 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:32.131 INFO DAGScheduler - Final stage: ResultStage 146 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:32.131 INFO DAGScheduler - Parents of final stage: List()
14:51:32.131 INFO DAGScheduler - Missing parents: List()
14:51:32.131 INFO DAGScheduler - Submitting ResultStage 146 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:32.137 INFO MemoryStore - Block broadcast_276 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
14:51:32.138 INFO MemoryStore - Block broadcast_276_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.8 MiB)
14:51:32.138 INFO BlockManagerInfo - Added broadcast_276_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.6 MiB)
14:51:32.139 INFO SparkContext - Created broadcast 276 from broadcast at DAGScheduler.scala:1580
14:51:32.139 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 146 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:32.139 INFO TaskSchedulerImpl - Adding task set 146.0 with 1 tasks resource profile 0
14:51:32.140 INFO TaskSetManager - Starting task 0.0 in stage 146.0 (TID 202) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:32.140 INFO Executor - Running task 0.0 in stage 146.0 (TID 202)
14:51:32.153 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest211196066682765412277.bam:0+235514
14:51:32.158 INFO Executor - Finished task 0.0 in stage 146.0 (TID 202). 650141 bytes result sent to driver
14:51:32.160 INFO TaskSetManager - Finished task 0.0 in stage 146.0 (TID 202) in 21 ms on localhost (executor driver) (1/1)
14:51:32.160 INFO TaskSchedulerImpl - Removed TaskSet 146.0, whose tasks have all completed, from pool
14:51:32.160 INFO DAGScheduler - ResultStage 146 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.029 s
14:51:32.160 INFO DAGScheduler - Job 104 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:32.160 INFO TaskSchedulerImpl - Killing all running tasks in stage 146: Stage finished
14:51:32.160 INFO DAGScheduler - Job 104 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.029325 s
14:51:32.170 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:32.170 INFO DAGScheduler - Got job 105 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:32.170 INFO DAGScheduler - Final stage: ResultStage 147 (count at ReadsSparkSinkUnitTest.java:185)
14:51:32.170 INFO DAGScheduler - Parents of final stage: List()
14:51:32.170 INFO DAGScheduler - Missing parents: List()
14:51:32.170 INFO DAGScheduler - Submitting ResultStage 147 (MapPartitionsRDD[663] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:32.187 INFO MemoryStore - Block broadcast_277 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
14:51:32.189 INFO MemoryStore - Block broadcast_277_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
14:51:32.189 INFO BlockManagerInfo - Added broadcast_277_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:32.189 INFO SparkContext - Created broadcast 277 from broadcast at DAGScheduler.scala:1580
14:51:32.189 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 147 (MapPartitionsRDD[663] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:32.189 INFO TaskSchedulerImpl - Adding task set 147.0 with 1 tasks resource profile 0
14:51:32.190 INFO TaskSetManager - Starting task 0.0 in stage 147.0 (TID 203) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
14:51:32.190 INFO Executor - Running task 0.0 in stage 147.0 (TID 203)
14:51:32.225 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
14:51:32.238 INFO Executor - Finished task 0.0 in stage 147.0 (TID 203). 989 bytes result sent to driver
14:51:32.239 INFO TaskSetManager - Finished task 0.0 in stage 147.0 (TID 203) in 49 ms on localhost (executor driver) (1/1)
14:51:32.239 INFO TaskSchedulerImpl - Removed TaskSet 147.0, whose tasks have all completed, from pool
14:51:32.239 INFO DAGScheduler - ResultStage 147 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.069 s
14:51:32.239 INFO DAGScheduler - Job 105 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:32.239 INFO TaskSchedulerImpl - Killing all running tasks in stage 147: Stage finished
14:51:32.239 INFO DAGScheduler - Job 105 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.069676 s
14:51:32.243 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:32.243 INFO DAGScheduler - Got job 106 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:32.243 INFO DAGScheduler - Final stage: ResultStage 148 (count at ReadsSparkSinkUnitTest.java:185)
14:51:32.243 INFO DAGScheduler - Parents of final stage: List()
14:51:32.243 INFO DAGScheduler - Missing parents: List()
14:51:32.243 INFO DAGScheduler - Submitting ResultStage 148 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:32.249 INFO MemoryStore - Block broadcast_278 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
14:51:32.250 INFO MemoryStore - Block broadcast_278_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
14:51:32.250 INFO BlockManagerInfo - Added broadcast_278_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.4 MiB)
14:51:32.250 INFO SparkContext - Created broadcast 278 from broadcast at DAGScheduler.scala:1580
14:51:32.251 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 148 (MapPartitionsRDD[681] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:32.251 INFO TaskSchedulerImpl - Adding task set 148.0 with 1 tasks resource profile 0
14:51:32.251 INFO TaskSetManager - Starting task 0.0 in stage 148.0 (TID 204) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:32.252 INFO Executor - Running task 0.0 in stage 148.0 (TID 204)
14:51:32.264 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest211196066682765412277.bam:0+235514
14:51:32.270 INFO Executor - Finished task 0.0 in stage 148.0 (TID 204). 989 bytes result sent to driver
14:51:32.270 INFO TaskSetManager - Finished task 0.0 in stage 148.0 (TID 204) in 19 ms on localhost (executor driver) (1/1)
14:51:32.270 INFO TaskSchedulerImpl - Removed TaskSet 148.0, whose tasks have all completed, from pool
14:51:32.270 INFO DAGScheduler - ResultStage 148 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.027 s
14:51:32.270 INFO DAGScheduler - Job 106 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:32.270 INFO TaskSchedulerImpl - Killing all running tasks in stage 148: Stage finished
14:51:32.271 INFO DAGScheduler - Job 106 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.027764 s
14:51:32.273 INFO MemoryStore - Block broadcast_279 stored as values in memory (estimated size 298.0 KiB, free 1916.8 MiB)
14:51:32.279 INFO MemoryStore - Block broadcast_279_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
14:51:32.280 INFO BlockManagerInfo - Added broadcast_279_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:32.280 INFO SparkContext - Created broadcast 279 from newAPIHadoopFile at PathSplitSource.java:96
14:51:32.303 INFO MemoryStore - Block broadcast_280 stored as values in memory (estimated size 298.0 KiB, free 1916.4 MiB)
14:51:32.309 INFO MemoryStore - Block broadcast_280_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:32.309 INFO BlockManagerInfo - Added broadcast_280_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:32.310 INFO SparkContext - Created broadcast 280 from newAPIHadoopFile at PathSplitSource.java:96
14:51:32.330 INFO FileInputFormat - Total input files to process : 1
14:51:32.332 INFO MemoryStore - Block broadcast_281 stored as values in memory (estimated size 19.6 KiB, free 1916.4 MiB)
14:51:32.332 INFO MemoryStore - Block broadcast_281_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.3 MiB)
14:51:32.332 INFO BlockManagerInfo - Added broadcast_281_piece0 in memory on localhost:44923 (size: 1890.0 B, free: 1919.3 MiB)
14:51:32.333 INFO SparkContext - Created broadcast 281 from broadcast at ReadsSparkSink.java:133
14:51:32.333 INFO MemoryStore - Block broadcast_282 stored as values in memory (estimated size 20.0 KiB, free 1916.3 MiB)
14:51:32.334 INFO MemoryStore - Block broadcast_282_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1916.3 MiB)
14:51:32.334 INFO BlockManagerInfo - Added broadcast_282_piece0 in memory on localhost:44923 (size: 1890.0 B, free: 1919.3 MiB)
14:51:32.334 INFO SparkContext - Created broadcast 282 from broadcast at BamSink.java:76
14:51:32.336 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:32.336 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:32.336 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:32.355 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:32.356 INFO DAGScheduler - Registering RDD 695 (mapToPair at SparkUtils.java:161) as input to shuffle 31
14:51:32.356 INFO DAGScheduler - Got job 107 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:32.356 INFO DAGScheduler - Final stage: ResultStage 150 (runJob at SparkHadoopWriter.scala:83)
14:51:32.356 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 149)
14:51:32.356 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 149)
14:51:32.356 INFO DAGScheduler - Submitting ShuffleMapStage 149 (MapPartitionsRDD[695] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:32.374 INFO MemoryStore - Block broadcast_283 stored as values in memory (estimated size 434.3 KiB, free 1915.9 MiB)
14:51:32.375 INFO MemoryStore - Block broadcast_283_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1915.8 MiB)
14:51:32.375 INFO BlockManagerInfo - Added broadcast_283_piece0 in memory on localhost:44923 (size: 157.6 KiB, free: 1919.1 MiB)
14:51:32.376 INFO SparkContext - Created broadcast 283 from broadcast at DAGScheduler.scala:1580
14:51:32.376 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 149 (MapPartitionsRDD[695] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:32.376 INFO TaskSchedulerImpl - Adding task set 149.0 with 1 tasks resource profile 0
14:51:32.377 INFO TaskSetManager - Starting task 0.0 in stage 149.0 (TID 205) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
14:51:32.377 INFO Executor - Running task 0.0 in stage 149.0 (TID 205)
14:51:32.411 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
14:51:32.426 INFO Executor - Finished task 0.0 in stage 149.0 (TID 205). 1148 bytes result sent to driver
14:51:32.426 INFO TaskSetManager - Finished task 0.0 in stage 149.0 (TID 205) in 49 ms on localhost (executor driver) (1/1)
14:51:32.426 INFO TaskSchedulerImpl - Removed TaskSet 149.0, whose tasks have all completed, from pool
14:51:32.426 INFO DAGScheduler - ShuffleMapStage 149 (mapToPair at SparkUtils.java:161) finished in 0.069 s
14:51:32.426 INFO DAGScheduler - looking for newly runnable stages
14:51:32.426 INFO DAGScheduler - running: HashSet()
14:51:32.426 INFO DAGScheduler - waiting: HashSet(ResultStage 150)
14:51:32.426 INFO DAGScheduler - failed: HashSet()
14:51:32.427 INFO DAGScheduler - Submitting ResultStage 150 (MapPartitionsRDD[700] at mapToPair at BamSink.java:91), which has no missing parents
14:51:32.437 INFO MemoryStore - Block broadcast_284 stored as values in memory (estimated size 155.3 KiB, free 1915.6 MiB)
14:51:32.438 INFO MemoryStore - Block broadcast_284_piece0 stored as bytes in memory (estimated size 58.4 KiB, free 1915.5 MiB)
14:51:32.439 INFO BlockManagerInfo - Added broadcast_284_piece0 in memory on localhost:44923 (size: 58.4 KiB, free: 1919.1 MiB)
14:51:32.439 INFO SparkContext - Created broadcast 284 from broadcast at DAGScheduler.scala:1580
14:51:32.439 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 150 (MapPartitionsRDD[700] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:32.439 INFO TaskSchedulerImpl - Adding task set 150.0 with 1 tasks resource profile 0
14:51:32.440 INFO TaskSetManager - Starting task 0.0 in stage 150.0 (TID 206) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:32.440 INFO Executor - Running task 0.0 in stage 150.0 (TID 206)
14:51:32.447 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:32.447 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:32.461 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:32.461 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:32.461 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:32.462 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:32.462 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:32.462 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:32.489 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451324082718054271517691_0700_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest39651934907865679162.bam.parts/_temporary/0/task_202603041451324082718054271517691_0700_r_000000
14:51:32.489 INFO SparkHadoopMapRedUtil - attempt_202603041451324082718054271517691_0700_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:32.490 INFO Executor - Finished task 0.0 in stage 150.0 (TID 206). 1858 bytes result sent to driver
14:51:32.490 INFO TaskSetManager - Finished task 0.0 in stage 150.0 (TID 206) in 50 ms on localhost (executor driver) (1/1)
14:51:32.490 INFO TaskSchedulerImpl - Removed TaskSet 150.0, whose tasks have all completed, from pool
14:51:32.490 INFO DAGScheduler - ResultStage 150 (runJob at SparkHadoopWriter.scala:83) finished in 0.063 s
14:51:32.491 INFO DAGScheduler - Job 107 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:32.491 INFO TaskSchedulerImpl - Killing all running tasks in stage 150: Stage finished
14:51:32.491 INFO DAGScheduler - Job 107 finished: runJob at SparkHadoopWriter.scala:83, took 0.135389 s
14:51:32.491 INFO SparkHadoopWriter - Start to commit write Job job_202603041451324082718054271517691_0700.
14:51:32.497 INFO SparkHadoopWriter - Write Job job_202603041451324082718054271517691_0700 committed. Elapsed time: 6 ms.
14:51:32.512 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest39651934907865679162.bam
14:51:32.518 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest39651934907865679162.bam done
14:51:32.518 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest39651934907865679162.bam.parts/ to /tmp/ReadsSparkSinkUnitTest39651934907865679162.bam.sbi
14:51:32.524 INFO IndexFileMerger - Done merging .sbi files
14:51:32.524 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest39651934907865679162.bam.parts/ to /tmp/ReadsSparkSinkUnitTest39651934907865679162.bam.bai
14:51:32.530 INFO IndexFileMerger - Done merging .bai files
14:51:32.532 INFO MemoryStore - Block broadcast_285 stored as values in memory (estimated size 312.0 B, free 1915.5 MiB)
14:51:32.537 INFO MemoryStore - Block broadcast_285_piece0 stored as bytes in memory (estimated size 231.0 B, free 1915.5 MiB)
14:51:32.537 INFO BlockManagerInfo - Added broadcast_285_piece0 in memory on localhost:44923 (size: 231.0 B, free: 1919.1 MiB)
14:51:32.537 INFO BlockManagerInfo - Removed broadcast_268_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.1 MiB)
14:51:32.538 INFO SparkContext - Created broadcast 285 from broadcast at BamSource.java:104
14:51:32.538 INFO BlockManagerInfo - Removed broadcast_277_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.3 MiB)
14:51:32.539 INFO MemoryStore - Block broadcast_286 stored as values in memory (estimated size 297.9 KiB, free 1916.2 MiB)
14:51:32.539 INFO BlockManagerInfo - Removed broadcast_280_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:32.540 INFO BlockManagerInfo - Removed broadcast_275_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:32.540 INFO BlockManagerInfo - Removed broadcast_283_piece0 on localhost:44923 in memory (size: 157.6 KiB, free: 1919.5 MiB)
14:51:32.541 INFO BlockManagerInfo - Removed broadcast_276_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.6 MiB)
14:51:32.541 INFO BlockManagerInfo - Removed broadcast_273_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.7 MiB)
14:51:32.542 INFO BlockManagerInfo - Removed broadcast_278_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.7 MiB)
14:51:32.542 INFO BlockManagerInfo - Removed broadcast_274_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1919.7 MiB)
14:51:32.543 INFO BlockManagerInfo - Removed broadcast_281_piece0 on localhost:44923 in memory (size: 1890.0 B, free: 1919.7 MiB)
14:51:32.543 INFO BlockManagerInfo - Removed broadcast_284_piece0 on localhost:44923 in memory (size: 58.4 KiB, free: 1919.8 MiB)
14:51:32.544 INFO BlockManagerInfo - Removed broadcast_272_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.9 MiB)
14:51:32.544 INFO BlockManagerInfo - Removed broadcast_282_piece0 on localhost:44923 in memory (size: 1890.0 B, free: 1919.9 MiB)
14:51:32.545 INFO BlockManagerInfo - Removed broadcast_270_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.9 MiB)
14:51:32.546 INFO BlockManagerInfo - Removed broadcast_271_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1920.0 MiB)
14:51:32.548 INFO MemoryStore - Block broadcast_286_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
14:51:32.548 INFO BlockManagerInfo - Added broadcast_286_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:32.549 INFO SparkContext - Created broadcast 286 from newAPIHadoopFile at PathSplitSource.java:96
14:51:32.558 INFO FileInputFormat - Total input files to process : 1
14:51:32.573 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:32.574 INFO DAGScheduler - Got job 108 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:32.574 INFO DAGScheduler - Final stage: ResultStage 151 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:32.574 INFO DAGScheduler - Parents of final stage: List()
14:51:32.574 INFO DAGScheduler - Missing parents: List()
14:51:32.574 INFO DAGScheduler - Submitting ResultStage 151 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:32.581 INFO MemoryStore - Block broadcast_287 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
14:51:32.582 INFO MemoryStore - Block broadcast_287_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
14:51:32.582 INFO BlockManagerInfo - Added broadcast_287_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.8 MiB)
14:51:32.582 INFO SparkContext - Created broadcast 287 from broadcast at DAGScheduler.scala:1580
14:51:32.582 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 151 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:32.582 INFO TaskSchedulerImpl - Adding task set 151.0 with 1 tasks resource profile 0
14:51:32.583 INFO TaskSetManager - Starting task 0.0 in stage 151.0 (TID 207) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:32.583 INFO Executor - Running task 0.0 in stage 151.0 (TID 207)
14:51:32.597 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest39651934907865679162.bam:0+236517
14:51:32.602 INFO Executor - Finished task 0.0 in stage 151.0 (TID 207). 749470 bytes result sent to driver
14:51:32.604 INFO TaskSetManager - Finished task 0.0 in stage 151.0 (TID 207) in 21 ms on localhost (executor driver) (1/1)
14:51:32.604 INFO TaskSchedulerImpl - Removed TaskSet 151.0, whose tasks have all completed, from pool
14:51:32.604 INFO DAGScheduler - ResultStage 151 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.030 s
14:51:32.604 INFO DAGScheduler - Job 108 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:32.604 INFO TaskSchedulerImpl - Killing all running tasks in stage 151: Stage finished
14:51:32.604 INFO DAGScheduler - Job 108 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.031118 s
14:51:32.615 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:32.616 INFO DAGScheduler - Got job 109 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:32.616 INFO DAGScheduler - Final stage: ResultStage 152 (count at ReadsSparkSinkUnitTest.java:185)
14:51:32.616 INFO DAGScheduler - Parents of final stage: List()
14:51:32.616 INFO DAGScheduler - Missing parents: List()
14:51:32.616 INFO DAGScheduler - Submitting ResultStage 152 (MapPartitionsRDD[688] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:32.633 INFO MemoryStore - Block broadcast_288 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
14:51:32.635 INFO MemoryStore - Block broadcast_288_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
14:51:32.635 INFO BlockManagerInfo - Added broadcast_288_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.7 MiB)
14:51:32.635 INFO SparkContext - Created broadcast 288 from broadcast at DAGScheduler.scala:1580
14:51:32.636 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 152 (MapPartitionsRDD[688] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:32.636 INFO TaskSchedulerImpl - Adding task set 152.0 with 1 tasks resource profile 0
14:51:32.636 INFO TaskSetManager - Starting task 0.0 in stage 152.0 (TID 208) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
14:51:32.636 INFO Executor - Running task 0.0 in stage 152.0 (TID 208)
14:51:32.673 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
14:51:32.680 INFO Executor - Finished task 0.0 in stage 152.0 (TID 208). 989 bytes result sent to driver
14:51:32.681 INFO TaskSetManager - Finished task 0.0 in stage 152.0 (TID 208) in 45 ms on localhost (executor driver) (1/1)
14:51:32.681 INFO TaskSchedulerImpl - Removed TaskSet 152.0, whose tasks have all completed, from pool
14:51:32.681 INFO DAGScheduler - ResultStage 152 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
14:51:32.681 INFO DAGScheduler - Job 109 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:32.681 INFO TaskSchedulerImpl - Killing all running tasks in stage 152: Stage finished
14:51:32.681 INFO DAGScheduler - Job 109 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.066117 s
14:51:32.685 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:32.686 INFO DAGScheduler - Got job 110 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:32.686 INFO DAGScheduler - Final stage: ResultStage 153 (count at ReadsSparkSinkUnitTest.java:185)
14:51:32.686 INFO DAGScheduler - Parents of final stage: List()
14:51:32.686 INFO DAGScheduler - Missing parents: List()
14:51:32.686 INFO DAGScheduler - Submitting ResultStage 153 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:32.692 INFO MemoryStore - Block broadcast_289 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
14:51:32.693 INFO MemoryStore - Block broadcast_289_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.4 MiB)
14:51:32.694 INFO BlockManagerInfo - Added broadcast_289_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.6 MiB)
14:51:32.694 INFO SparkContext - Created broadcast 289 from broadcast at DAGScheduler.scala:1580
14:51:32.694 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 153 (MapPartitionsRDD[706] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:32.694 INFO TaskSchedulerImpl - Adding task set 153.0 with 1 tasks resource profile 0
14:51:32.695 INFO TaskSetManager - Starting task 0.0 in stage 153.0 (TID 209) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:32.695 INFO Executor - Running task 0.0 in stage 153.0 (TID 209)
14:51:32.708 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest39651934907865679162.bam:0+236517
14:51:32.711 INFO Executor - Finished task 0.0 in stage 153.0 (TID 209). 989 bytes result sent to driver
14:51:32.712 INFO TaskSetManager - Finished task 0.0 in stage 153.0 (TID 209) in 17 ms on localhost (executor driver) (1/1)
14:51:32.712 INFO TaskSchedulerImpl - Removed TaskSet 153.0, whose tasks have all completed, from pool
14:51:32.712 INFO DAGScheduler - ResultStage 153 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.026 s
14:51:32.712 INFO DAGScheduler - Job 110 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:32.712 INFO TaskSchedulerImpl - Killing all running tasks in stage 153: Stage finished
14:51:32.712 INFO DAGScheduler - Job 110 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.026902 s
14:51:32.715 INFO MemoryStore - Block broadcast_290 stored as values in memory (estimated size 576.0 B, free 1918.4 MiB)
14:51:32.716 INFO MemoryStore - Block broadcast_290_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.4 MiB)
14:51:32.716 INFO BlockManagerInfo - Added broadcast_290_piece0 in memory on localhost:44923 (size: 228.0 B, free: 1919.6 MiB)
14:51:32.716 INFO SparkContext - Created broadcast 290 from broadcast at CramSource.java:114
14:51:32.717 INFO MemoryStore - Block broadcast_291 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:32.723 INFO MemoryStore - Block broadcast_291_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:32.723 INFO BlockManagerInfo - Added broadcast_291_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:32.724 INFO SparkContext - Created broadcast 291 from newAPIHadoopFile at PathSplitSource.java:96
14:51:32.740 INFO MemoryStore - Block broadcast_292 stored as values in memory (estimated size 576.0 B, free 1918.0 MiB)
14:51:32.741 INFO MemoryStore - Block broadcast_292_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.0 MiB)
14:51:32.741 INFO BlockManagerInfo - Added broadcast_292_piece0 in memory on localhost:44923 (size: 228.0 B, free: 1919.6 MiB)
14:51:32.741 INFO SparkContext - Created broadcast 292 from broadcast at CramSource.java:114
14:51:32.742 INFO MemoryStore - Block broadcast_293 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
14:51:32.749 INFO MemoryStore - Block broadcast_293_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
14:51:32.749 INFO BlockManagerInfo - Added broadcast_293_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:32.749 INFO SparkContext - Created broadcast 293 from newAPIHadoopFile at PathSplitSource.java:96
14:51:32.764 INFO FileInputFormat - Total input files to process : 1
14:51:32.766 INFO MemoryStore - Block broadcast_294 stored as values in memory (estimated size 6.0 KiB, free 1917.7 MiB)
14:51:32.766 INFO MemoryStore - Block broadcast_294_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
14:51:32.766 INFO BlockManagerInfo - Added broadcast_294_piece0 in memory on localhost:44923 (size: 1473.0 B, free: 1919.5 MiB)
14:51:32.767 INFO SparkContext - Created broadcast 294 from broadcast at ReadsSparkSink.java:133
14:51:32.767 INFO MemoryStore - Block broadcast_295 stored as values in memory (estimated size 6.2 KiB, free 1917.7 MiB)
14:51:32.768 INFO MemoryStore - Block broadcast_295_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
14:51:32.768 INFO BlockManagerInfo - Added broadcast_295_piece0 in memory on localhost:44923 (size: 1473.0 B, free: 1919.5 MiB)
14:51:32.768 INFO SparkContext - Created broadcast 295 from broadcast at CramSink.java:76
14:51:32.770 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:32.770 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:32.770 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:32.790 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:32.790 INFO DAGScheduler - Registering RDD 718 (mapToPair at SparkUtils.java:161) as input to shuffle 32
14:51:32.791 INFO DAGScheduler - Got job 111 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:32.791 INFO DAGScheduler - Final stage: ResultStage 155 (runJob at SparkHadoopWriter.scala:83)
14:51:32.791 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 154)
14:51:32.791 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 154)
14:51:32.791 INFO DAGScheduler - Submitting ShuffleMapStage 154 (MapPartitionsRDD[718] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:32.803 INFO MemoryStore - Block broadcast_296 stored as values in memory (estimated size 292.8 KiB, free 1917.4 MiB)
14:51:32.804 INFO MemoryStore - Block broadcast_296_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1917.3 MiB)
14:51:32.804 INFO BlockManagerInfo - Added broadcast_296_piece0 in memory on localhost:44923 (size: 107.3 KiB, free: 1919.4 MiB)
14:51:32.804 INFO SparkContext - Created broadcast 296 from broadcast at DAGScheduler.scala:1580
14:51:32.804 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 154 (MapPartitionsRDD[718] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:32.804 INFO TaskSchedulerImpl - Adding task set 154.0 with 1 tasks resource profile 0
14:51:32.805 INFO TaskSetManager - Starting task 0.0 in stage 154.0 (TID 210) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
14:51:32.805 INFO Executor - Running task 0.0 in stage 154.0 (TID 210)
14:51:32.830 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
14:51:32.846 INFO Executor - Finished task 0.0 in stage 154.0 (TID 210). 1148 bytes result sent to driver
14:51:32.846 INFO TaskSetManager - Finished task 0.0 in stage 154.0 (TID 210) in 41 ms on localhost (executor driver) (1/1)
14:51:32.846 INFO TaskSchedulerImpl - Removed TaskSet 154.0, whose tasks have all completed, from pool
14:51:32.847 INFO DAGScheduler - ShuffleMapStage 154 (mapToPair at SparkUtils.java:161) finished in 0.056 s
14:51:32.847 INFO DAGScheduler - looking for newly runnable stages
14:51:32.847 INFO DAGScheduler - running: HashSet()
14:51:32.847 INFO DAGScheduler - waiting: HashSet(ResultStage 155)
14:51:32.847 INFO DAGScheduler - failed: HashSet()
14:51:32.847 INFO DAGScheduler - Submitting ResultStage 155 (MapPartitionsRDD[723] at mapToPair at CramSink.java:89), which has no missing parents
14:51:32.858 INFO MemoryStore - Block broadcast_297 stored as values in memory (estimated size 153.2 KiB, free 1917.1 MiB)
14:51:32.859 INFO MemoryStore - Block broadcast_297_piece0 stored as bytes in memory (estimated size 58.0 KiB, free 1917.1 MiB)
14:51:32.859 INFO BlockManagerInfo - Added broadcast_297_piece0 in memory on localhost:44923 (size: 58.0 KiB, free: 1919.4 MiB)
14:51:32.859 INFO SparkContext - Created broadcast 297 from broadcast at DAGScheduler.scala:1580
14:51:32.859 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 155 (MapPartitionsRDD[723] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
14:51:32.859 INFO TaskSchedulerImpl - Adding task set 155.0 with 1 tasks resource profile 0
14:51:32.860 INFO TaskSetManager - Starting task 0.0 in stage 155.0 (TID 211) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:32.861 INFO Executor - Running task 0.0 in stage 155.0 (TID 211)
14:51:32.865 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:32.865 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:32.872 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:32.872 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:32.872 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:32.872 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:32.872 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:32.872 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:32.892 INFO BlockManagerInfo - Removed broadcast_286_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:32.893 INFO BlockManagerInfo - Removed broadcast_288_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.6 MiB)
14:51:32.894 INFO BlockManagerInfo - Removed broadcast_287_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.6 MiB)
14:51:32.895 INFO BlockManagerInfo - Removed broadcast_296_piece0 on localhost:44923 in memory (size: 107.3 KiB, free: 1919.7 MiB)
14:51:32.895 INFO BlockManagerInfo - Removed broadcast_289_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.8 MiB)
14:51:32.896 INFO BlockManagerInfo - Removed broadcast_279_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:32.896 INFO BlockManagerInfo - Removed broadcast_293_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.9 MiB)
14:51:32.897 INFO BlockManagerInfo - Removed broadcast_292_piece0 on localhost:44923 in memory (size: 228.0 B, free: 1919.9 MiB)
14:51:32.898 INFO BlockManagerInfo - Removed broadcast_285_piece0 on localhost:44923 in memory (size: 231.0 B, free: 1919.9 MiB)
14:51:32.940 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451328510748198146804006_0723_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest59674865046503962823.cram.parts/_temporary/0/task_202603041451328510748198146804006_0723_r_000000
14:51:32.940 INFO SparkHadoopMapRedUtil - attempt_202603041451328510748198146804006_0723_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:32.941 INFO Executor - Finished task 0.0 in stage 155.0 (TID 211). 1901 bytes result sent to driver
14:51:32.941 INFO TaskSetManager - Finished task 0.0 in stage 155.0 (TID 211) in 81 ms on localhost (executor driver) (1/1)
14:51:32.941 INFO TaskSchedulerImpl - Removed TaskSet 155.0, whose tasks have all completed, from pool
14:51:32.941 INFO DAGScheduler - ResultStage 155 (runJob at SparkHadoopWriter.scala:83) finished in 0.094 s
14:51:32.942 INFO DAGScheduler - Job 111 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:32.942 INFO TaskSchedulerImpl - Killing all running tasks in stage 155: Stage finished
14:51:32.942 INFO DAGScheduler - Job 111 finished: runJob at SparkHadoopWriter.scala:83, took 0.151811 s
14:51:32.942 INFO SparkHadoopWriter - Start to commit write Job job_202603041451328510748198146804006_0723.
14:51:32.948 INFO SparkHadoopWriter - Write Job job_202603041451328510748198146804006_0723 committed. Elapsed time: 5 ms.
14:51:32.962 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest59674865046503962823.cram
14:51:32.968 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest59674865046503962823.cram done
14:51:32.970 INFO MemoryStore - Block broadcast_298 stored as values in memory (estimated size 504.0 B, free 1919.4 MiB)
14:51:32.971 INFO MemoryStore - Block broadcast_298_piece0 stored as bytes in memory (estimated size 159.0 B, free 1919.4 MiB)
14:51:32.971 INFO BlockManagerInfo - Added broadcast_298_piece0 in memory on localhost:44923 (size: 159.0 B, free: 1919.9 MiB)
14:51:32.972 INFO SparkContext - Created broadcast 298 from broadcast at CramSource.java:114
14:51:32.973 INFO MemoryStore - Block broadcast_299 stored as values in memory (estimated size 297.9 KiB, free 1919.1 MiB)
14:51:32.984 INFO MemoryStore - Block broadcast_299_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.1 MiB)
14:51:32.984 INFO BlockManagerInfo - Added broadcast_299_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.8 MiB)
14:51:32.984 INFO SparkContext - Created broadcast 299 from newAPIHadoopFile at PathSplitSource.java:96
14:51:33.000 INFO FileInputFormat - Total input files to process : 1
14:51:33.026 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:33.026 INFO DAGScheduler - Got job 112 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:33.026 INFO DAGScheduler - Final stage: ResultStage 156 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:33.026 INFO DAGScheduler - Parents of final stage: List()
14:51:33.026 INFO DAGScheduler - Missing parents: List()
14:51:33.026 INFO DAGScheduler - Submitting ResultStage 156 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:33.038 INFO MemoryStore - Block broadcast_300 stored as values in memory (estimated size 286.8 KiB, free 1918.8 MiB)
14:51:33.039 INFO MemoryStore - Block broadcast_300_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.7 MiB)
14:51:33.039 INFO BlockManagerInfo - Added broadcast_300_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.7 MiB)
14:51:33.040 INFO SparkContext - Created broadcast 300 from broadcast at DAGScheduler.scala:1580
14:51:33.040 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 156 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:33.040 INFO TaskSchedulerImpl - Adding task set 156.0 with 1 tasks resource profile 0
14:51:33.041 INFO TaskSetManager - Starting task 0.0 in stage 156.0 (TID 212) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:33.041 INFO Executor - Running task 0.0 in stage 156.0 (TID 212)
14:51:33.065 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest59674865046503962823.cram:0+43713
14:51:33.088 INFO Executor - Finished task 0.0 in stage 156.0 (TID 212). 154101 bytes result sent to driver
14:51:33.089 INFO TaskSetManager - Finished task 0.0 in stage 156.0 (TID 212) in 48 ms on localhost (executor driver) (1/1)
14:51:33.089 INFO TaskSchedulerImpl - Removed TaskSet 156.0, whose tasks have all completed, from pool
14:51:33.089 INFO DAGScheduler - ResultStage 156 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.062 s
14:51:33.089 INFO DAGScheduler - Job 112 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.089 INFO TaskSchedulerImpl - Killing all running tasks in stage 156: Stage finished
14:51:33.089 INFO DAGScheduler - Job 112 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.063420 s
14:51:33.094 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:33.095 INFO DAGScheduler - Got job 113 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:33.095 INFO DAGScheduler - Final stage: ResultStage 157 (count at ReadsSparkSinkUnitTest.java:185)
14:51:33.095 INFO DAGScheduler - Parents of final stage: List()
14:51:33.095 INFO DAGScheduler - Missing parents: List()
14:51:33.095 INFO DAGScheduler - Submitting ResultStage 157 (MapPartitionsRDD[712] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:33.107 INFO MemoryStore - Block broadcast_301 stored as values in memory (estimated size 286.8 KiB, free 1918.4 MiB)
14:51:33.108 INFO MemoryStore - Block broadcast_301_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.3 MiB)
14:51:33.108 INFO BlockManagerInfo - Added broadcast_301_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.6 MiB)
14:51:33.108 INFO SparkContext - Created broadcast 301 from broadcast at DAGScheduler.scala:1580
14:51:33.109 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 157 (MapPartitionsRDD[712] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:33.109 INFO TaskSchedulerImpl - Adding task set 157.0 with 1 tasks resource profile 0
14:51:33.109 INFO TaskSetManager - Starting task 0.0 in stage 157.0 (TID 213) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
14:51:33.110 INFO Executor - Running task 0.0 in stage 157.0 (TID 213)
14:51:33.138 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
14:51:33.145 INFO Executor - Finished task 0.0 in stage 157.0 (TID 213). 989 bytes result sent to driver
14:51:33.146 INFO TaskSetManager - Finished task 0.0 in stage 157.0 (TID 213) in 37 ms on localhost (executor driver) (1/1)
14:51:33.146 INFO TaskSchedulerImpl - Removed TaskSet 157.0, whose tasks have all completed, from pool
14:51:33.146 INFO DAGScheduler - ResultStage 157 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.051 s
14:51:33.146 INFO DAGScheduler - Job 113 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.146 INFO TaskSchedulerImpl - Killing all running tasks in stage 157: Stage finished
14:51:33.146 INFO DAGScheduler - Job 113 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.051972 s
14:51:33.150 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:33.150 INFO DAGScheduler - Got job 114 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:33.150 INFO DAGScheduler - Final stage: ResultStage 158 (count at ReadsSparkSinkUnitTest.java:185)
14:51:33.150 INFO DAGScheduler - Parents of final stage: List()
14:51:33.151 INFO DAGScheduler - Missing parents: List()
14:51:33.151 INFO DAGScheduler - Submitting ResultStage 158 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:33.162 INFO MemoryStore - Block broadcast_302 stored as values in memory (estimated size 286.8 KiB, free 1918.1 MiB)
14:51:33.164 INFO MemoryStore - Block broadcast_302_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.0 MiB)
14:51:33.164 INFO BlockManagerInfo - Added broadcast_302_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.5 MiB)
14:51:33.164 INFO SparkContext - Created broadcast 302 from broadcast at DAGScheduler.scala:1580
14:51:33.164 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 158 (MapPartitionsRDD[729] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:33.164 INFO TaskSchedulerImpl - Adding task set 158.0 with 1 tasks resource profile 0
14:51:33.165 INFO TaskSetManager - Starting task 0.0 in stage 158.0 (TID 214) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:33.165 INFO Executor - Running task 0.0 in stage 158.0 (TID 214)
14:51:33.190 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest59674865046503962823.cram:0+43713
14:51:33.203 INFO Executor - Finished task 0.0 in stage 158.0 (TID 214). 989 bytes result sent to driver
14:51:33.204 INFO TaskSetManager - Finished task 0.0 in stage 158.0 (TID 214) in 38 ms on localhost (executor driver) (1/1)
14:51:33.204 INFO TaskSchedulerImpl - Removed TaskSet 158.0, whose tasks have all completed, from pool
14:51:33.204 INFO DAGScheduler - ResultStage 158 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.053 s
14:51:33.204 INFO DAGScheduler - Job 114 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.204 INFO TaskSchedulerImpl - Killing all running tasks in stage 158: Stage finished
14:51:33.204 INFO DAGScheduler - Job 114 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.053802 s
14:51:33.207 INFO MemoryStore - Block broadcast_303 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
14:51:33.217 INFO MemoryStore - Block broadcast_303_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.6 MiB)
14:51:33.218 INFO BlockManagerInfo - Added broadcast_303_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:33.218 INFO SparkContext - Created broadcast 303 from newAPIHadoopFile at PathSplitSource.java:96
14:51:33.250 INFO MemoryStore - Block broadcast_304 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
14:51:33.260 INFO MemoryStore - Block broadcast_304_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
14:51:33.261 INFO BlockManagerInfo - Added broadcast_304_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:33.261 INFO SparkContext - Created broadcast 304 from newAPIHadoopFile at PathSplitSource.java:96
14:51:33.289 INFO FileInputFormat - Total input files to process : 1
14:51:33.291 INFO MemoryStore - Block broadcast_305 stored as values in memory (estimated size 160.7 KiB, free 1917.1 MiB)
14:51:33.292 INFO MemoryStore - Block broadcast_305_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.1 MiB)
14:51:33.292 INFO BlockManagerInfo - Added broadcast_305_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:33.292 INFO SparkContext - Created broadcast 305 from broadcast at ReadsSparkSink.java:133
14:51:33.296 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
14:51:33.296 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:33.296 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:33.317 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:33.317 INFO DAGScheduler - Registering RDD 743 (mapToPair at SparkUtils.java:161) as input to shuffle 33
14:51:33.317 INFO DAGScheduler - Got job 115 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:33.317 INFO DAGScheduler - Final stage: ResultStage 160 (runJob at SparkHadoopWriter.scala:83)
14:51:33.317 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 159)
14:51:33.317 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 159)
14:51:33.318 INFO DAGScheduler - Submitting ShuffleMapStage 159 (MapPartitionsRDD[743] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:33.336 INFO MemoryStore - Block broadcast_306 stored as values in memory (estimated size 520.4 KiB, free 1916.6 MiB)
14:51:33.338 INFO MemoryStore - Block broadcast_306_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.4 MiB)
14:51:33.338 INFO BlockManagerInfo - Added broadcast_306_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.3 MiB)
14:51:33.338 INFO SparkContext - Created broadcast 306 from broadcast at DAGScheduler.scala:1580
14:51:33.338 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 159 (MapPartitionsRDD[743] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:33.338 INFO TaskSchedulerImpl - Adding task set 159.0 with 1 tasks resource profile 0
14:51:33.339 INFO TaskSetManager - Starting task 0.0 in stage 159.0 (TID 215) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:33.339 INFO Executor - Running task 0.0 in stage 159.0 (TID 215)
14:51:33.375 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:33.391 INFO Executor - Finished task 0.0 in stage 159.0 (TID 215). 1148 bytes result sent to driver
14:51:33.392 INFO TaskSetManager - Finished task 0.0 in stage 159.0 (TID 215) in 53 ms on localhost (executor driver) (1/1)
14:51:33.392 INFO TaskSchedulerImpl - Removed TaskSet 159.0, whose tasks have all completed, from pool
14:51:33.392 INFO DAGScheduler - ShuffleMapStage 159 (mapToPair at SparkUtils.java:161) finished in 0.074 s
14:51:33.392 INFO DAGScheduler - looking for newly runnable stages
14:51:33.392 INFO DAGScheduler - running: HashSet()
14:51:33.392 INFO DAGScheduler - waiting: HashSet(ResultStage 160)
14:51:33.392 INFO DAGScheduler - failed: HashSet()
14:51:33.392 INFO DAGScheduler - Submitting ResultStage 160 (MapPartitionsRDD[749] at saveAsTextFile at SamSink.java:65), which has no missing parents
14:51:33.400 INFO MemoryStore - Block broadcast_307 stored as values in memory (estimated size 241.1 KiB, free 1916.2 MiB)
14:51:33.400 INFO MemoryStore - Block broadcast_307_piece0 stored as bytes in memory (estimated size 66.9 KiB, free 1916.1 MiB)
14:51:33.401 INFO BlockManagerInfo - Added broadcast_307_piece0 in memory on localhost:44923 (size: 66.9 KiB, free: 1919.2 MiB)
14:51:33.401 INFO SparkContext - Created broadcast 307 from broadcast at DAGScheduler.scala:1580
14:51:33.401 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 160 (MapPartitionsRDD[749] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
14:51:33.401 INFO TaskSchedulerImpl - Adding task set 160.0 with 1 tasks resource profile 0
14:51:33.402 INFO TaskSetManager - Starting task 0.0 in stage 160.0 (TID 216) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:33.402 INFO Executor - Running task 0.0 in stage 160.0 (TID 216)
14:51:33.406 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:33.407 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:33.418 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
14:51:33.418 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:33.418 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:33.441 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451337013560041304641284_0749_m_000000_0' to file:/tmp/ReadsSparkSinkUnitTest63150524036880750548.sam.parts/_temporary/0/task_202603041451337013560041304641284_0749_m_000000
14:51:33.441 INFO SparkHadoopMapRedUtil - attempt_202603041451337013560041304641284_0749_m_000000_0: Committed. Elapsed time: 0 ms.
14:51:33.441 INFO Executor - Finished task 0.0 in stage 160.0 (TID 216). 1858 bytes result sent to driver
14:51:33.442 INFO TaskSetManager - Finished task 0.0 in stage 160.0 (TID 216) in 40 ms on localhost (executor driver) (1/1)
14:51:33.442 INFO TaskSchedulerImpl - Removed TaskSet 160.0, whose tasks have all completed, from pool
14:51:33.442 INFO DAGScheduler - ResultStage 160 (runJob at SparkHadoopWriter.scala:83) finished in 0.049 s
14:51:33.442 INFO DAGScheduler - Job 115 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.442 INFO TaskSchedulerImpl - Killing all running tasks in stage 160: Stage finished
14:51:33.442 INFO DAGScheduler - Job 115 finished: runJob at SparkHadoopWriter.scala:83, took 0.125436 s
14:51:33.442 INFO SparkHadoopWriter - Start to commit write Job job_202603041451337013560041304641284_0749.
14:51:33.448 INFO SparkHadoopWriter - Write Job job_202603041451337013560041304641284_0749 committed. Elapsed time: 5 ms.
14:51:33.460 INFO HadoopFileSystemWrapper - Concatenating 2 parts to /tmp/ReadsSparkSinkUnitTest63150524036880750548.sam
14:51:33.466 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest63150524036880750548.sam done
WARNING 2026-03-04 14:51:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2026-03-04 14:51:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
14:51:33.469 INFO MemoryStore - Block broadcast_308 stored as values in memory (estimated size 160.7 KiB, free 1916.0 MiB)
14:51:33.470 INFO MemoryStore - Block broadcast_308_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
14:51:33.471 INFO BlockManagerInfo - Added broadcast_308_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.2 MiB)
14:51:33.471 INFO SparkContext - Created broadcast 308 from broadcast at SamSource.java:78
14:51:33.472 INFO MemoryStore - Block broadcast_309 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
14:51:33.479 INFO BlockManagerInfo - Removed broadcast_305_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:33.480 INFO BlockManagerInfo - Removed broadcast_307_piece0 on localhost:44923 in memory (size: 66.9 KiB, free: 1919.3 MiB)
14:51:33.481 INFO BlockManagerInfo - Removed broadcast_300_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.4 MiB)
14:51:33.481 INFO BlockManagerInfo - Removed broadcast_302_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.5 MiB)
14:51:33.482 INFO BlockManagerInfo - Removed broadcast_306_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.6 MiB)
14:51:33.482 INFO BlockManagerInfo - Removed broadcast_291_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:33.483 INFO BlockManagerInfo - Removed broadcast_301_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.8 MiB)
14:51:33.483 INFO BlockManagerInfo - Removed broadcast_295_piece0 on localhost:44923 in memory (size: 1473.0 B, free: 1919.8 MiB)
14:51:33.484 INFO BlockManagerInfo - Removed broadcast_304_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:33.484 INFO BlockManagerInfo - Removed broadcast_299_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.9 MiB)
14:51:33.485 INFO BlockManagerInfo - Removed broadcast_298_piece0 on localhost:44923 in memory (size: 159.0 B, free: 1919.9 MiB)
14:51:33.486 INFO BlockManagerInfo - Removed broadcast_290_piece0 on localhost:44923 in memory (size: 228.0 B, free: 1919.9 MiB)
14:51:33.486 INFO BlockManagerInfo - Removed broadcast_297_piece0 on localhost:44923 in memory (size: 58.0 KiB, free: 1919.9 MiB)
14:51:33.487 INFO MemoryStore - Block broadcast_309_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.2 MiB)
14:51:33.487 INFO BlockManagerInfo - Removed broadcast_294_piece0 on localhost:44923 in memory (size: 1473.0 B, free: 1919.9 MiB)
14:51:33.487 INFO BlockManagerInfo - Added broadcast_309_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:33.488 INFO SparkContext - Created broadcast 309 from newAPIHadoopFile at SamSource.java:108
14:51:33.491 INFO FileInputFormat - Total input files to process : 1
14:51:33.496 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:33.496 INFO DAGScheduler - Got job 116 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:33.496 INFO DAGScheduler - Final stage: ResultStage 161 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:33.496 INFO DAGScheduler - Parents of final stage: List()
14:51:33.496 INFO DAGScheduler - Missing parents: List()
14:51:33.497 INFO DAGScheduler - Submitting ResultStage 161 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:33.497 INFO MemoryStore - Block broadcast_310 stored as values in memory (estimated size 7.5 KiB, free 1919.1 MiB)
14:51:33.498 INFO MemoryStore - Block broadcast_310_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1919.1 MiB)
14:51:33.498 INFO BlockManagerInfo - Added broadcast_310_piece0 in memory on localhost:44923 (size: 3.8 KiB, free: 1919.9 MiB)
14:51:33.498 INFO SparkContext - Created broadcast 310 from broadcast at DAGScheduler.scala:1580
14:51:33.498 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 161 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:33.498 INFO TaskSchedulerImpl - Adding task set 161.0 with 1 tasks resource profile 0
14:51:33.499 INFO TaskSetManager - Starting task 0.0 in stage 161.0 (TID 217) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:33.500 INFO Executor - Running task 0.0 in stage 161.0 (TID 217)
14:51:33.501 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest63150524036880750548.sam:0+847558
14:51:33.520 INFO Executor - Finished task 0.0 in stage 161.0 (TID 217). 651526 bytes result sent to driver
14:51:33.522 INFO TaskSetManager - Finished task 0.0 in stage 161.0 (TID 217) in 23 ms on localhost (executor driver) (1/1)
14:51:33.522 INFO TaskSchedulerImpl - Removed TaskSet 161.0, whose tasks have all completed, from pool
14:51:33.523 INFO DAGScheduler - ResultStage 161 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.026 s
14:51:33.523 INFO DAGScheduler - Job 116 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.523 INFO TaskSchedulerImpl - Killing all running tasks in stage 161: Stage finished
14:51:33.523 INFO DAGScheduler - Job 116 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.026760 s
14:51:33.538 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:33.538 INFO DAGScheduler - Got job 117 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:33.539 INFO DAGScheduler - Final stage: ResultStage 162 (count at ReadsSparkSinkUnitTest.java:185)
14:51:33.539 INFO DAGScheduler - Parents of final stage: List()
14:51:33.539 INFO DAGScheduler - Missing parents: List()
14:51:33.539 INFO DAGScheduler - Submitting ResultStage 162 (MapPartitionsRDD[736] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:33.556 INFO MemoryStore - Block broadcast_311 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
14:51:33.557 INFO MemoryStore - Block broadcast_311_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
14:51:33.558 INFO BlockManagerInfo - Added broadcast_311_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.7 MiB)
14:51:33.558 INFO SparkContext - Created broadcast 311 from broadcast at DAGScheduler.scala:1580
14:51:33.558 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 162 (MapPartitionsRDD[736] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:33.558 INFO TaskSchedulerImpl - Adding task set 162.0 with 1 tasks resource profile 0
14:51:33.559 INFO TaskSetManager - Starting task 0.0 in stage 162.0 (TID 218) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:33.559 INFO Executor - Running task 0.0 in stage 162.0 (TID 218)
14:51:33.606 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:33.617 INFO Executor - Finished task 0.0 in stage 162.0 (TID 218). 989 bytes result sent to driver
14:51:33.617 INFO TaskSetManager - Finished task 0.0 in stage 162.0 (TID 218) in 58 ms on localhost (executor driver) (1/1)
14:51:33.617 INFO TaskSchedulerImpl - Removed TaskSet 162.0, whose tasks have all completed, from pool
14:51:33.617 INFO DAGScheduler - ResultStage 162 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.078 s
14:51:33.618 INFO DAGScheduler - Job 117 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.618 INFO TaskSchedulerImpl - Killing all running tasks in stage 162: Stage finished
14:51:33.618 INFO DAGScheduler - Job 117 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.079528 s
14:51:33.621 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:33.622 INFO DAGScheduler - Got job 118 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:33.622 INFO DAGScheduler - Final stage: ResultStage 163 (count at ReadsSparkSinkUnitTest.java:185)
14:51:33.622 INFO DAGScheduler - Parents of final stage: List()
14:51:33.622 INFO DAGScheduler - Missing parents: List()
14:51:33.622 INFO DAGScheduler - Submitting ResultStage 163 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:33.622 INFO MemoryStore - Block broadcast_312 stored as values in memory (estimated size 7.4 KiB, free 1918.6 MiB)
14:51:33.623 INFO MemoryStore - Block broadcast_312_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.6 MiB)
14:51:33.623 INFO BlockManagerInfo - Added broadcast_312_piece0 in memory on localhost:44923 (size: 3.8 KiB, free: 1919.7 MiB)
14:51:33.623 INFO SparkContext - Created broadcast 312 from broadcast at DAGScheduler.scala:1580
14:51:33.624 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 163 (MapPartitionsRDD[754] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:33.624 INFO TaskSchedulerImpl - Adding task set 163.0 with 1 tasks resource profile 0
14:51:33.624 INFO TaskSetManager - Starting task 0.0 in stage 163.0 (TID 219) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:33.624 INFO Executor - Running task 0.0 in stage 163.0 (TID 219)
14:51:33.626 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest63150524036880750548.sam:0+847558
14:51:33.633 INFO Executor - Finished task 0.0 in stage 163.0 (TID 219). 946 bytes result sent to driver
14:51:33.633 INFO TaskSetManager - Finished task 0.0 in stage 163.0 (TID 219) in 9 ms on localhost (executor driver) (1/1)
14:51:33.633 INFO TaskSchedulerImpl - Removed TaskSet 163.0, whose tasks have all completed, from pool
14:51:33.633 INFO DAGScheduler - ResultStage 163 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.011 s
14:51:33.634 INFO DAGScheduler - Job 118 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.634 INFO TaskSchedulerImpl - Killing all running tasks in stage 163: Stage finished
14:51:33.634 INFO DAGScheduler - Job 118 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.012440 s
WARNING 2026-03-04 14:51:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2026-03-04 14:51:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
14:51:33.637 INFO MemoryStore - Block broadcast_313 stored as values in memory (estimated size 21.0 KiB, free 1918.5 MiB)
14:51:33.638 INFO MemoryStore - Block broadcast_313_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1918.5 MiB)
14:51:33.638 INFO BlockManagerInfo - Added broadcast_313_piece0 in memory on localhost:44923 (size: 2.4 KiB, free: 1919.7 MiB)
14:51:33.638 INFO SparkContext - Created broadcast 313 from broadcast at SamSource.java:78
14:51:33.640 INFO MemoryStore - Block broadcast_314 stored as values in memory (estimated size 298.0 KiB, free 1918.3 MiB)
14:51:33.650 INFO MemoryStore - Block broadcast_314_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.2 MiB)
14:51:33.650 INFO BlockManagerInfo - Added broadcast_314_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.7 MiB)
14:51:33.651 INFO SparkContext - Created broadcast 314 from newAPIHadoopFile at SamSource.java:108
14:51:33.657 INFO FileInputFormat - Total input files to process : 1
14:51:33.662 INFO SparkContext - Starting job: collect at SparkUtils.java:205
14:51:33.662 INFO DAGScheduler - Got job 119 (collect at SparkUtils.java:205) with 1 output partitions
14:51:33.662 INFO DAGScheduler - Final stage: ResultStage 164 (collect at SparkUtils.java:205)
14:51:33.662 INFO DAGScheduler - Parents of final stage: List()
14:51:33.662 INFO DAGScheduler - Missing parents: List()
14:51:33.662 INFO DAGScheduler - Submitting ResultStage 164 (MapPartitionsRDD[760] at mapPartitions at SparkUtils.java:188), which has no missing parents
14:51:33.663 INFO MemoryStore - Block broadcast_315 stored as values in memory (estimated size 7.9 KiB, free 1918.2 MiB)
14:51:33.663 INFO MemoryStore - Block broadcast_315_piece0 stored as bytes in memory (estimated size 3.9 KiB, free 1918.2 MiB)
14:51:33.664 INFO BlockManagerInfo - Added broadcast_315_piece0 in memory on localhost:44923 (size: 3.9 KiB, free: 1919.7 MiB)
14:51:33.664 INFO SparkContext - Created broadcast 315 from broadcast at DAGScheduler.scala:1580
14:51:33.664 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 164 (MapPartitionsRDD[760] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
14:51:33.664 INFO TaskSchedulerImpl - Adding task set 164.0 with 1 tasks resource profile 0
14:51:33.665 INFO TaskSetManager - Starting task 0.0 in stage 164.0 (TID 220) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7936 bytes)
14:51:33.665 INFO Executor - Running task 0.0 in stage 164.0 (TID 220)
14:51:33.666 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
14:51:33.669 INFO Executor - Finished task 0.0 in stage 164.0 (TID 220). 1700 bytes result sent to driver
14:51:33.670 INFO TaskSetManager - Finished task 0.0 in stage 164.0 (TID 220) in 6 ms on localhost (executor driver) (1/1)
14:51:33.671 INFO TaskSchedulerImpl - Removed TaskSet 164.0, whose tasks have all completed, from pool
14:51:33.671 INFO DAGScheduler - ResultStage 164 (collect at SparkUtils.java:205) finished in 0.009 s
14:51:33.671 INFO DAGScheduler - Job 119 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.671 INFO TaskSchedulerImpl - Killing all running tasks in stage 164: Stage finished
14:51:33.671 INFO DAGScheduler - Job 119 finished: collect at SparkUtils.java:205, took 0.009169 s
WARNING 2026-03-04 14:51:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2026-03-04 14:51:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
14:51:33.677 INFO MemoryStore - Block broadcast_316 stored as values in memory (estimated size 21.0 KiB, free 1918.2 MiB)
14:51:33.678 INFO MemoryStore - Block broadcast_316_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1918.2 MiB)
14:51:33.678 INFO BlockManagerInfo - Added broadcast_316_piece0 in memory on localhost:44923 (size: 2.4 KiB, free: 1919.7 MiB)
14:51:33.678 INFO SparkContext - Created broadcast 316 from broadcast at SamSource.java:78
14:51:33.679 INFO MemoryStore - Block broadcast_317 stored as values in memory (estimated size 298.0 KiB, free 1917.9 MiB)
14:51:33.685 INFO MemoryStore - Block broadcast_317_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.8 MiB)
14:51:33.686 INFO BlockManagerInfo - Added broadcast_317_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.6 MiB)
14:51:33.686 INFO SparkContext - Created broadcast 317 from newAPIHadoopFile at SamSource.java:108
14:51:33.688 INFO MemoryStore - Block broadcast_318 stored as values in memory (estimated size 21.0 KiB, free 1917.8 MiB)
14:51:33.689 INFO MemoryStore - Block broadcast_318_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1917.8 MiB)
14:51:33.689 INFO BlockManagerInfo - Added broadcast_318_piece0 in memory on localhost:44923 (size: 2.4 KiB, free: 1919.6 MiB)
14:51:33.689 INFO SparkContext - Created broadcast 318 from broadcast at ReadsSparkSink.java:133
14:51:33.690 INFO MemoryStore - Block broadcast_319 stored as values in memory (estimated size 21.5 KiB, free 1917.8 MiB)
14:51:33.691 INFO MemoryStore - Block broadcast_319_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1917.8 MiB)
14:51:33.691 INFO BlockManagerInfo - Added broadcast_319_piece0 in memory on localhost:44923 (size: 2.4 KiB, free: 1919.6 MiB)
14:51:33.691 INFO SparkContext - Created broadcast 319 from broadcast at BamSink.java:76
14:51:33.693 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:33.693 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:33.693 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:33.713 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:33.713 INFO DAGScheduler - Got job 120 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:33.713 INFO DAGScheduler - Final stage: ResultStage 165 (runJob at SparkHadoopWriter.scala:83)
14:51:33.713 INFO DAGScheduler - Parents of final stage: List()
14:51:33.714 INFO DAGScheduler - Missing parents: List()
14:51:33.714 INFO DAGScheduler - Submitting ResultStage 165 (MapPartitionsRDD[770] at mapToPair at BamSink.java:91), which has no missing parents
14:51:33.720 INFO MemoryStore - Block broadcast_320 stored as values in memory (estimated size 152.3 KiB, free 1917.6 MiB)
14:51:33.721 INFO MemoryStore - Block broadcast_320_piece0 stored as bytes in memory (estimated size 56.4 KiB, free 1917.6 MiB)
14:51:33.721 INFO BlockManagerInfo - Added broadcast_320_piece0 in memory on localhost:44923 (size: 56.4 KiB, free: 1919.6 MiB)
14:51:33.722 INFO SparkContext - Created broadcast 320 from broadcast at DAGScheduler.scala:1580
14:51:33.722 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 165 (MapPartitionsRDD[770] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:33.722 INFO TaskSchedulerImpl - Adding task set 165.0 with 1 tasks resource profile 0
14:51:33.723 INFO TaskSetManager - Starting task 0.0 in stage 165.0 (TID 221) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8561 bytes)
14:51:33.723 INFO Executor - Running task 0.0 in stage 165.0 (TID 221)
14:51:33.727 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
14:51:33.731 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:33.731 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:33.731 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:33.731 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:33.731 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:33.731 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:33.767 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451333180527784169619816_0770_r_000000_0' to file:/tmp/ReadsSparkSinkNotSorting13003069890460476492.bam.parts/_temporary/0/task_202603041451333180527784169619816_0770_r_000000
14:51:33.767 INFO SparkHadoopMapRedUtil - attempt_202603041451333180527784169619816_0770_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:33.768 INFO Executor - Finished task 0.0 in stage 165.0 (TID 221). 1084 bytes result sent to driver
14:51:33.768 INFO TaskSetManager - Finished task 0.0 in stage 165.0 (TID 221) in 46 ms on localhost (executor driver) (1/1)
14:51:33.768 INFO TaskSchedulerImpl - Removed TaskSet 165.0, whose tasks have all completed, from pool
14:51:33.768 INFO DAGScheduler - ResultStage 165 (runJob at SparkHadoopWriter.scala:83) finished in 0.054 s
14:51:33.768 INFO DAGScheduler - Job 120 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.768 INFO TaskSchedulerImpl - Killing all running tasks in stage 165: Stage finished
14:51:33.768 INFO DAGScheduler - Job 120 finished: runJob at SparkHadoopWriter.scala:83, took 0.055562 s
14:51:33.769 INFO SparkHadoopWriter - Start to commit write Job job_202603041451333180527784169619816_0770.
14:51:33.775 INFO SparkHadoopWriter - Write Job job_202603041451333180527784169619816_0770 committed. Elapsed time: 6 ms.
14:51:33.795 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkNotSorting13003069890460476492.bam
14:51:33.802 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkNotSorting13003069890460476492.bam done
14:51:33.802 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkNotSorting13003069890460476492.bam.parts/ to /tmp/ReadsSparkSinkNotSorting13003069890460476492.bam.sbi
14:51:33.811 INFO IndexFileMerger - Done merging .sbi files
14:51:33.812 INFO MemoryStore - Block broadcast_321 stored as values in memory (estimated size 192.0 B, free 1917.6 MiB)
14:51:33.818 INFO MemoryStore - Block broadcast_321_piece0 stored as bytes in memory (estimated size 127.0 B, free 1917.6 MiB)
14:51:33.818 INFO BlockManagerInfo - Added broadcast_321_piece0 in memory on localhost:44923 (size: 127.0 B, free: 1919.6 MiB)
14:51:33.818 INFO SparkContext - Created broadcast 321 from broadcast at BamSource.java:104
14:51:33.819 INFO BlockManagerInfo - Removed broadcast_318_piece0 on localhost:44923 in memory (size: 2.4 KiB, free: 1919.6 MiB)
14:51:33.819 INFO BlockManagerInfo - Removed broadcast_309_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:33.820 INFO BlockManagerInfo - Removed broadcast_320_piece0 on localhost:44923 in memory (size: 56.4 KiB, free: 1919.7 MiB)
14:51:33.820 INFO MemoryStore - Block broadcast_322 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
14:51:33.820 INFO BlockManagerInfo - Removed broadcast_316_piece0 on localhost:44923 in memory (size: 2.4 KiB, free: 1919.7 MiB)
14:51:33.821 INFO BlockManagerInfo - Removed broadcast_303_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:33.821 INFO BlockManagerInfo - Removed broadcast_319_piece0 on localhost:44923 in memory (size: 2.4 KiB, free: 1919.7 MiB)
14:51:33.822 INFO BlockManagerInfo - Removed broadcast_310_piece0 on localhost:44923 in memory (size: 3.8 KiB, free: 1919.7 MiB)
14:51:33.822 INFO BlockManagerInfo - Removed broadcast_312_piece0 on localhost:44923 in memory (size: 3.8 KiB, free: 1919.7 MiB)
14:51:33.823 INFO BlockManagerInfo - Removed broadcast_317_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.8 MiB)
14:51:33.823 INFO BlockManagerInfo - Removed broadcast_308_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:33.824 INFO BlockManagerInfo - Removed broadcast_311_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.9 MiB)
14:51:33.824 INFO BlockManagerInfo - Removed broadcast_315_piece0 on localhost:44923 in memory (size: 3.9 KiB, free: 1919.9 MiB)
14:51:33.830 INFO MemoryStore - Block broadcast_322_piece0 stored as bytes in memory (estimated size 50.1 KiB, free 1919.3 MiB)
14:51:33.831 INFO BlockManagerInfo - Added broadcast_322_piece0 in memory on localhost:44923 (size: 50.1 KiB, free: 1919.9 MiB)
14:51:33.831 INFO SparkContext - Created broadcast 322 from newAPIHadoopFile at PathSplitSource.java:96
14:51:33.847 INFO FileInputFormat - Total input files to process : 1
14:51:33.862 INFO SparkContext - Starting job: collect at SparkUtils.java:205
14:51:33.862 INFO DAGScheduler - Got job 121 (collect at SparkUtils.java:205) with 1 output partitions
14:51:33.862 INFO DAGScheduler - Final stage: ResultStage 166 (collect at SparkUtils.java:205)
14:51:33.862 INFO DAGScheduler - Parents of final stage: List()
14:51:33.862 INFO DAGScheduler - Missing parents: List()
14:51:33.862 INFO DAGScheduler - Submitting ResultStage 166 (MapPartitionsRDD[777] at mapPartitions at SparkUtils.java:188), which has no missing parents
14:51:33.869 INFO MemoryStore - Block broadcast_323 stored as values in memory (estimated size 148.6 KiB, free 1919.2 MiB)
14:51:33.870 INFO MemoryStore - Block broadcast_323_piece0 stored as bytes in memory (estimated size 54.7 KiB, free 1919.1 MiB)
14:51:33.870 INFO BlockManagerInfo - Added broadcast_323_piece0 in memory on localhost:44923 (size: 54.7 KiB, free: 1919.8 MiB)
14:51:33.870 INFO SparkContext - Created broadcast 323 from broadcast at DAGScheduler.scala:1580
14:51:33.870 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 166 (MapPartitionsRDD[777] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
14:51:33.870 INFO TaskSchedulerImpl - Adding task set 166.0 with 1 tasks resource profile 0
14:51:33.871 INFO TaskSetManager - Starting task 0.0 in stage 166.0 (TID 222) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7811 bytes)
14:51:33.871 INFO Executor - Running task 0.0 in stage 166.0 (TID 222)
14:51:33.885 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkNotSorting13003069890460476492.bam:0+59395
14:51:33.887 INFO Executor - Finished task 0.0 in stage 166.0 (TID 222). 1700 bytes result sent to driver
14:51:33.887 INFO TaskSetManager - Finished task 0.0 in stage 166.0 (TID 222) in 16 ms on localhost (executor driver) (1/1)
14:51:33.887 INFO TaskSchedulerImpl - Removed TaskSet 166.0, whose tasks have all completed, from pool
14:51:33.888 INFO DAGScheduler - ResultStage 166 (collect at SparkUtils.java:205) finished in 0.025 s
14:51:33.888 INFO DAGScheduler - Job 121 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.888 INFO TaskSchedulerImpl - Killing all running tasks in stage 166: Stage finished
14:51:33.888 INFO DAGScheduler - Job 121 finished: collect at SparkUtils.java:205, took 0.026225 s
14:51:33.914 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:91
14:51:33.915 INFO DAGScheduler - Got job 122 (collect at ReadsSparkSinkUnitTest.java:91) with 1 output partitions
14:51:33.915 INFO DAGScheduler - Final stage: ResultStage 167 (collect at ReadsSparkSinkUnitTest.java:91)
14:51:33.915 INFO DAGScheduler - Parents of final stage: List()
14:51:33.915 INFO DAGScheduler - Missing parents: List()
14:51:33.915 INFO DAGScheduler - Submitting ResultStage 167 (ZippedPartitionsRDD2[780] at zipPartitions at SparkUtils.java:244), which has no missing parents
14:51:33.921 INFO MemoryStore - Block broadcast_324 stored as values in memory (estimated size 149.8 KiB, free 1919.0 MiB)
14:51:33.922 INFO MemoryStore - Block broadcast_324_piece0 stored as bytes in memory (estimated size 55.2 KiB, free 1918.9 MiB)
14:51:33.922 INFO BlockManagerInfo - Added broadcast_324_piece0 in memory on localhost:44923 (size: 55.2 KiB, free: 1919.8 MiB)
14:51:33.923 INFO SparkContext - Created broadcast 324 from broadcast at DAGScheduler.scala:1580
14:51:33.923 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 167 (ZippedPartitionsRDD2[780] at zipPartitions at SparkUtils.java:244) (first 15 tasks are for partitions Vector(0))
14:51:33.923 INFO TaskSchedulerImpl - Adding task set 167.0 with 1 tasks resource profile 0
14:51:33.923 INFO TaskSetManager - Starting task 0.0 in stage 167.0 (TID 223) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8436 bytes)
14:51:33.924 INFO Executor - Running task 0.0 in stage 167.0 (TID 223)
14:51:33.937 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkNotSorting13003069890460476492.bam:0+59395
14:51:33.939 INFO Executor - Finished task 0.0 in stage 167.0 (TID 223). 192451 bytes result sent to driver
14:51:33.940 INFO TaskSetManager - Finished task 0.0 in stage 167.0 (TID 223) in 17 ms on localhost (executor driver) (1/1)
14:51:33.940 INFO TaskSchedulerImpl - Removed TaskSet 167.0, whose tasks have all completed, from pool
14:51:33.940 INFO DAGScheduler - ResultStage 167 (collect at ReadsSparkSinkUnitTest.java:91) finished in 0.025 s
14:51:33.940 INFO DAGScheduler - Job 122 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.940 INFO TaskSchedulerImpl - Killing all running tasks in stage 167: Stage finished
14:51:33.941 INFO DAGScheduler - Job 122 finished: collect at ReadsSparkSinkUnitTest.java:91, took 0.026292 s
WARNING 2026-03-04 14:51:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2026-03-04 14:51:33 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
14:51:33.943 INFO MemoryStore - Block broadcast_325 stored as values in memory (estimated size 21.0 KiB, free 1918.9 MiB)
14:51:33.943 INFO MemoryStore - Block broadcast_325_piece0 stored as bytes in memory (estimated size 2.4 KiB, free 1918.9 MiB)
14:51:33.943 INFO BlockManagerInfo - Added broadcast_325_piece0 in memory on localhost:44923 (size: 2.4 KiB, free: 1919.8 MiB)
14:51:33.944 INFO SparkContext - Created broadcast 325 from broadcast at SamSource.java:78
14:51:33.945 INFO MemoryStore - Block broadcast_326 stored as values in memory (estimated size 298.0 KiB, free 1918.6 MiB)
14:51:33.951 INFO MemoryStore - Block broadcast_326_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.5 MiB)
14:51:33.951 INFO BlockManagerInfo - Added broadcast_326_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.7 MiB)
14:51:33.952 INFO SparkContext - Created broadcast 326 from newAPIHadoopFile at SamSource.java:108
14:51:33.955 INFO FileInputFormat - Total input files to process : 1
14:51:33.959 INFO SparkContext - Starting job: collect at SparkUtils.java:205
14:51:33.960 INFO DAGScheduler - Got job 123 (collect at SparkUtils.java:205) with 1 output partitions
14:51:33.960 INFO DAGScheduler - Final stage: ResultStage 168 (collect at SparkUtils.java:205)
14:51:33.960 INFO DAGScheduler - Parents of final stage: List()
14:51:33.960 INFO DAGScheduler - Missing parents: List()
14:51:33.960 INFO DAGScheduler - Submitting ResultStage 168 (MapPartitionsRDD[786] at mapPartitions at SparkUtils.java:188), which has no missing parents
14:51:33.961 INFO MemoryStore - Block broadcast_327 stored as values in memory (estimated size 7.9 KiB, free 1918.5 MiB)
14:51:33.961 INFO MemoryStore - Block broadcast_327_piece0 stored as bytes in memory (estimated size 3.9 KiB, free 1918.5 MiB)
14:51:33.961 INFO BlockManagerInfo - Added broadcast_327_piece0 in memory on localhost:44923 (size: 3.9 KiB, free: 1919.7 MiB)
14:51:33.962 INFO SparkContext - Created broadcast 327 from broadcast at DAGScheduler.scala:1580
14:51:33.962 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 168 (MapPartitionsRDD[786] at mapPartitions at SparkUtils.java:188) (first 15 tasks are for partitions Vector(0))
14:51:33.962 INFO TaskSchedulerImpl - Adding task set 168.0 with 1 tasks resource profile 0
14:51:33.962 INFO TaskSetManager - Starting task 0.0 in stage 168.0 (TID 224) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7936 bytes)
14:51:33.963 INFO Executor - Running task 0.0 in stage 168.0 (TID 224)
14:51:33.964 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
14:51:33.967 INFO Executor - Finished task 0.0 in stage 168.0 (TID 224). 1700 bytes result sent to driver
14:51:33.967 INFO TaskSetManager - Finished task 0.0 in stage 168.0 (TID 224) in 5 ms on localhost (executor driver) (1/1)
14:51:33.967 INFO TaskSchedulerImpl - Removed TaskSet 168.0, whose tasks have all completed, from pool
14:51:33.967 INFO DAGScheduler - ResultStage 168 (collect at SparkUtils.java:205) finished in 0.007 s
14:51:33.967 INFO DAGScheduler - Job 123 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.967 INFO TaskSchedulerImpl - Killing all running tasks in stage 168: Stage finished
14:51:33.968 INFO DAGScheduler - Job 123 finished: collect at SparkUtils.java:205, took 0.008190 s
14:51:33.974 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:94
14:51:33.974 INFO DAGScheduler - Got job 124 (collect at ReadsSparkSinkUnitTest.java:94) with 1 output partitions
14:51:33.974 INFO DAGScheduler - Final stage: ResultStage 169 (collect at ReadsSparkSinkUnitTest.java:94)
14:51:33.974 INFO DAGScheduler - Parents of final stage: List()
14:51:33.974 INFO DAGScheduler - Missing parents: List()
14:51:33.974 INFO DAGScheduler - Submitting ResultStage 169 (ZippedPartitionsRDD2[789] at zipPartitions at SparkUtils.java:244), which has no missing parents
14:51:33.975 INFO MemoryStore - Block broadcast_328 stored as values in memory (estimated size 9.6 KiB, free 1918.5 MiB)
14:51:33.976 INFO MemoryStore - Block broadcast_328_piece0 stored as bytes in memory (estimated size 4.4 KiB, free 1918.5 MiB)
14:51:33.976 INFO BlockManagerInfo - Added broadcast_328_piece0 in memory on localhost:44923 (size: 4.4 KiB, free: 1919.7 MiB)
14:51:33.976 INFO SparkContext - Created broadcast 328 from broadcast at DAGScheduler.scala:1580
14:51:33.976 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 169 (ZippedPartitionsRDD2[789] at zipPartitions at SparkUtils.java:244) (first 15 tasks are for partitions Vector(0))
14:51:33.976 INFO TaskSchedulerImpl - Adding task set 169.0 with 1 tasks resource profile 0
14:51:33.977 INFO TaskSetManager - Starting task 0.0 in stage 169.0 (TID 225) (localhost, executor driver, partition 0, PROCESS_LOCAL, 8561 bytes)
14:51:33.977 INFO Executor - Running task 0.0 in stage 169.0 (TID 225)
14:51:33.979 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/engine/CEUTrio.HiSeq.WGS.b37.NA12878.20.21.10000000-10000020.with.unmapped.queryname.samtools.sam:0+224884
14:51:33.986 INFO Executor - Finished task 0.0 in stage 169.0 (TID 225). 192494 bytes result sent to driver
14:51:33.986 INFO TaskSetManager - Finished task 0.0 in stage 169.0 (TID 225) in 10 ms on localhost (executor driver) (1/1)
14:51:33.986 INFO TaskSchedulerImpl - Removed TaskSet 169.0, whose tasks have all completed, from pool
14:51:33.987 INFO DAGScheduler - ResultStage 169 (collect at ReadsSparkSinkUnitTest.java:94) finished in 0.013 s
14:51:33.987 INFO DAGScheduler - Job 124 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:33.987 INFO TaskSchedulerImpl - Killing all running tasks in stage 169: Stage finished
14:51:33.987 INFO DAGScheduler - Job 124 finished: collect at ReadsSparkSinkUnitTest.java:94, took 0.013396 s
14:51:33.997 INFO MemoryStore - Block broadcast_329 stored as values in memory (estimated size 297.9 KiB, free 1918.2 MiB)
14:51:34.004 INFO MemoryStore - Block broadcast_329_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
14:51:34.004 INFO BlockManagerInfo - Added broadcast_329_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:34.004 INFO SparkContext - Created broadcast 329 from newAPIHadoopFile at PathSplitSource.java:96
14:51:34.036 INFO MemoryStore - Block broadcast_330 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
14:51:34.043 INFO MemoryStore - Block broadcast_330_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.8 MiB)
14:51:34.043 INFO BlockManagerInfo - Added broadcast_330_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:34.043 INFO SparkContext - Created broadcast 330 from newAPIHadoopFile at PathSplitSource.java:96
14:51:34.065 INFO FileInputFormat - Total input files to process : 1
14:51:34.067 INFO MemoryStore - Block broadcast_331 stored as values in memory (estimated size 160.7 KiB, free 1917.7 MiB)
14:51:34.068 INFO MemoryStore - Block broadcast_331_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
14:51:34.068 INFO BlockManagerInfo - Added broadcast_331_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:34.069 INFO SparkContext - Created broadcast 331 from broadcast at ReadsSparkSink.java:133
14:51:34.070 INFO MemoryStore - Block broadcast_332 stored as values in memory (estimated size 163.2 KiB, free 1917.5 MiB)
14:51:34.071 INFO MemoryStore - Block broadcast_332_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.5 MiB)
14:51:34.071 INFO BlockManagerInfo - Added broadcast_332_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:34.071 INFO SparkContext - Created broadcast 332 from broadcast at BamSink.java:76
14:51:34.074 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:34.074 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:34.074 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:34.096 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:34.097 INFO DAGScheduler - Registering RDD 803 (mapToPair at SparkUtils.java:161) as input to shuffle 34
14:51:34.097 INFO DAGScheduler - Got job 125 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:34.097 INFO DAGScheduler - Final stage: ResultStage 171 (runJob at SparkHadoopWriter.scala:83)
14:51:34.097 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 170)
14:51:34.097 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 170)
14:51:34.097 INFO DAGScheduler - Submitting ShuffleMapStage 170 (MapPartitionsRDD[803] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:34.115 INFO MemoryStore - Block broadcast_333 stored as values in memory (estimated size 520.4 KiB, free 1917.0 MiB)
14:51:34.117 INFO MemoryStore - Block broadcast_333_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.8 MiB)
14:51:34.117 INFO BlockManagerInfo - Added broadcast_333_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.5 MiB)
14:51:34.117 INFO SparkContext - Created broadcast 333 from broadcast at DAGScheduler.scala:1580
14:51:34.117 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 170 (MapPartitionsRDD[803] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:34.117 INFO TaskSchedulerImpl - Adding task set 170.0 with 1 tasks resource profile 0
14:51:34.118 INFO TaskSetManager - Starting task 0.0 in stage 170.0 (TID 226) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:34.118 INFO Executor - Running task 0.0 in stage 170.0 (TID 226)
14:51:34.155 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:34.176 INFO Executor - Finished task 0.0 in stage 170.0 (TID 226). 1234 bytes result sent to driver
14:51:34.176 INFO BlockManagerInfo - Removed broadcast_326_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.5 MiB)
14:51:34.176 INFO TaskSetManager - Finished task 0.0 in stage 170.0 (TID 226) in 58 ms on localhost (executor driver) (1/1)
14:51:34.176 INFO TaskSchedulerImpl - Removed TaskSet 170.0, whose tasks have all completed, from pool
14:51:34.177 INFO DAGScheduler - ShuffleMapStage 170 (mapToPair at SparkUtils.java:161) finished in 0.079 s
14:51:34.177 INFO DAGScheduler - looking for newly runnable stages
14:51:34.177 INFO DAGScheduler - running: HashSet()
14:51:34.177 INFO DAGScheduler - waiting: HashSet(ResultStage 171)
14:51:34.177 INFO DAGScheduler - failed: HashSet()
14:51:34.177 INFO DAGScheduler - Submitting ResultStage 171 (MapPartitionsRDD[808] at mapToPair at BamSink.java:91), which has no missing parents
14:51:34.177 INFO BlockManagerInfo - Removed broadcast_325_piece0 on localhost:44923 in memory (size: 2.4 KiB, free: 1919.5 MiB)
14:51:34.178 INFO BlockManagerInfo - Removed broadcast_322_piece0 on localhost:44923 in memory (size: 50.1 KiB, free: 1919.6 MiB)
14:51:34.178 INFO BlockManagerInfo - Removed broadcast_314_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.6 MiB)
14:51:34.179 INFO BlockManagerInfo - Removed broadcast_330_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:34.180 INFO BlockManagerInfo - Removed broadcast_327_piece0 on localhost:44923 in memory (size: 3.9 KiB, free: 1919.7 MiB)
14:51:34.182 INFO BlockManagerInfo - Removed broadcast_328_piece0 on localhost:44923 in memory (size: 4.4 KiB, free: 1919.7 MiB)
14:51:34.182 INFO BlockManagerInfo - Removed broadcast_324_piece0 on localhost:44923 in memory (size: 55.2 KiB, free: 1919.7 MiB)
14:51:34.183 INFO BlockManagerInfo - Removed broadcast_321_piece0 on localhost:44923 in memory (size: 127.0 B, free: 1919.7 MiB)
14:51:34.183 INFO BlockManagerInfo - Removed broadcast_313_piece0 on localhost:44923 in memory (size: 2.4 KiB, free: 1919.7 MiB)
14:51:34.183 INFO BlockManagerInfo - Removed broadcast_323_piece0 on localhost:44923 in memory (size: 54.7 KiB, free: 1919.8 MiB)
14:51:34.187 INFO MemoryStore - Block broadcast_334 stored as values in memory (estimated size 241.4 KiB, free 1918.4 MiB)
14:51:34.188 INFO MemoryStore - Block broadcast_334_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1918.4 MiB)
14:51:34.188 INFO BlockManagerInfo - Added broadcast_334_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.7 MiB)
14:51:34.188 INFO SparkContext - Created broadcast 334 from broadcast at DAGScheduler.scala:1580
14:51:34.188 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 171 (MapPartitionsRDD[808] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:34.188 INFO TaskSchedulerImpl - Adding task set 171.0 with 1 tasks resource profile 0
14:51:34.189 INFO TaskSetManager - Starting task 0.0 in stage 171.0 (TID 227) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:34.189 INFO Executor - Running task 0.0 in stage 171.0 (TID 227)
14:51:34.194 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:34.194 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:34.205 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:34.205 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:34.205 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:34.206 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:34.206 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:34.206 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:34.235 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451349137800412412884587_0808_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace14877317509620510707/_temporary/0/task_202603041451349137800412412884587_0808_r_000000
14:51:34.235 INFO SparkHadoopMapRedUtil - attempt_202603041451349137800412412884587_0808_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:34.236 INFO Executor - Finished task 0.0 in stage 171.0 (TID 227). 1858 bytes result sent to driver
14:51:34.236 INFO TaskSetManager - Finished task 0.0 in stage 171.0 (TID 227) in 47 ms on localhost (executor driver) (1/1)
14:51:34.236 INFO TaskSchedulerImpl - Removed TaskSet 171.0, whose tasks have all completed, from pool
14:51:34.237 INFO DAGScheduler - ResultStage 171 (runJob at SparkHadoopWriter.scala:83) finished in 0.060 s
14:51:34.237 INFO DAGScheduler - Job 125 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:34.237 INFO TaskSchedulerImpl - Killing all running tasks in stage 171: Stage finished
14:51:34.237 INFO DAGScheduler - Job 125 finished: runJob at SparkHadoopWriter.scala:83, took 0.140602 s
14:51:34.237 INFO SparkHadoopWriter - Start to commit write Job job_202603041451349137800412412884587_0808.
14:51:34.244 INFO SparkHadoopWriter - Write Job job_202603041451349137800412412884587_0808 committed. Elapsed time: 6 ms.
14:51:34.258 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest111723401712165003973.bam
14:51:34.264 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest111723401712165003973.bam done
14:51:34.264 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace14877317509620510707 to /tmp/ReadsSparkSinkUnitTest111723401712165003973.bam.sbi
14:51:34.271 INFO IndexFileMerger - Done merging .sbi files
14:51:34.271 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace14877317509620510707 to /tmp/ReadsSparkSinkUnitTest111723401712165003973.bam.bai
14:51:34.277 INFO IndexFileMerger - Done merging .bai files
14:51:34.279 INFO MemoryStore - Block broadcast_335 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
14:51:34.280 INFO MemoryStore - Block broadcast_335_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
14:51:34.280 INFO BlockManagerInfo - Added broadcast_335_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.7 MiB)
14:51:34.280 INFO SparkContext - Created broadcast 335 from broadcast at BamSource.java:104
14:51:34.281 INFO MemoryStore - Block broadcast_336 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:34.292 INFO MemoryStore - Block broadcast_336_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:34.292 INFO BlockManagerInfo - Added broadcast_336_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:34.292 INFO SparkContext - Created broadcast 336 from newAPIHadoopFile at PathSplitSource.java:96
14:51:34.307 INFO FileInputFormat - Total input files to process : 1
14:51:34.331 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:34.331 INFO DAGScheduler - Got job 126 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:34.331 INFO DAGScheduler - Final stage: ResultStage 172 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:34.331 INFO DAGScheduler - Parents of final stage: List()
14:51:34.331 INFO DAGScheduler - Missing parents: List()
14:51:34.331 INFO DAGScheduler - Submitting ResultStage 172 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:34.337 INFO MemoryStore - Block broadcast_337 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
14:51:34.338 INFO MemoryStore - Block broadcast_337_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
14:51:34.338 INFO BlockManagerInfo - Added broadcast_337_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.6 MiB)
14:51:34.339 INFO SparkContext - Created broadcast 337 from broadcast at DAGScheduler.scala:1580
14:51:34.339 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 172 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:34.339 INFO TaskSchedulerImpl - Adding task set 172.0 with 1 tasks resource profile 0
14:51:34.340 INFO TaskSetManager - Starting task 0.0 in stage 172.0 (TID 228) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:34.340 INFO Executor - Running task 0.0 in stage 172.0 (TID 228)
14:51:34.353 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest111723401712165003973.bam:0+237038
14:51:34.358 INFO Executor - Finished task 0.0 in stage 172.0 (TID 228). 651526 bytes result sent to driver
14:51:34.360 INFO TaskSetManager - Finished task 0.0 in stage 172.0 (TID 228) in 21 ms on localhost (executor driver) (1/1)
14:51:34.360 INFO TaskSchedulerImpl - Removed TaskSet 172.0, whose tasks have all completed, from pool
14:51:34.360 INFO DAGScheduler - ResultStage 172 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.029 s
14:51:34.360 INFO DAGScheduler - Job 126 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:34.360 INFO TaskSchedulerImpl - Killing all running tasks in stage 172: Stage finished
14:51:34.360 INFO DAGScheduler - Job 126 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.029451 s
14:51:34.371 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:34.372 INFO DAGScheduler - Got job 127 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:34.372 INFO DAGScheduler - Final stage: ResultStage 173 (count at ReadsSparkSinkUnitTest.java:185)
14:51:34.372 INFO DAGScheduler - Parents of final stage: List()
14:51:34.372 INFO DAGScheduler - Missing parents: List()
14:51:34.372 INFO DAGScheduler - Submitting ResultStage 173 (MapPartitionsRDD[796] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:34.391 INFO MemoryStore - Block broadcast_338 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
14:51:34.393 INFO MemoryStore - Block broadcast_338_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
14:51:34.393 INFO BlockManagerInfo - Added broadcast_338_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:34.393 INFO SparkContext - Created broadcast 338 from broadcast at DAGScheduler.scala:1580
14:51:34.394 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 173 (MapPartitionsRDD[796] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:34.394 INFO TaskSchedulerImpl - Adding task set 173.0 with 1 tasks resource profile 0
14:51:34.394 INFO TaskSetManager - Starting task 0.0 in stage 173.0 (TID 229) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:34.395 INFO Executor - Running task 0.0 in stage 173.0 (TID 229)
14:51:34.427 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:34.437 INFO Executor - Finished task 0.0 in stage 173.0 (TID 229). 989 bytes result sent to driver
14:51:34.437 INFO TaskSetManager - Finished task 0.0 in stage 173.0 (TID 229) in 43 ms on localhost (executor driver) (1/1)
14:51:34.437 INFO TaskSchedulerImpl - Removed TaskSet 173.0, whose tasks have all completed, from pool
14:51:34.437 INFO DAGScheduler - ResultStage 173 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
14:51:34.438 INFO DAGScheduler - Job 127 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:34.438 INFO TaskSchedulerImpl - Killing all running tasks in stage 173: Stage finished
14:51:34.438 INFO DAGScheduler - Job 127 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.066346 s
14:51:34.441 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:34.442 INFO DAGScheduler - Got job 128 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:34.442 INFO DAGScheduler - Final stage: ResultStage 174 (count at ReadsSparkSinkUnitTest.java:185)
14:51:34.442 INFO DAGScheduler - Parents of final stage: List()
14:51:34.442 INFO DAGScheduler - Missing parents: List()
14:51:34.442 INFO DAGScheduler - Submitting ResultStage 174 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:34.448 INFO MemoryStore - Block broadcast_339 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
14:51:34.449 INFO MemoryStore - Block broadcast_339_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.1 MiB)
14:51:34.449 INFO BlockManagerInfo - Added broadcast_339_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.4 MiB)
14:51:34.449 INFO SparkContext - Created broadcast 339 from broadcast at DAGScheduler.scala:1580
14:51:34.449 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 174 (MapPartitionsRDD[814] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:34.449 INFO TaskSchedulerImpl - Adding task set 174.0 with 1 tasks resource profile 0
14:51:34.450 INFO TaskSetManager - Starting task 0.0 in stage 174.0 (TID 230) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:34.450 INFO Executor - Running task 0.0 in stage 174.0 (TID 230)
14:51:34.463 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest111723401712165003973.bam:0+237038
14:51:34.467 INFO Executor - Finished task 0.0 in stage 174.0 (TID 230). 989 bytes result sent to driver
14:51:34.467 INFO TaskSetManager - Finished task 0.0 in stage 174.0 (TID 230) in 17 ms on localhost (executor driver) (1/1)
14:51:34.467 INFO TaskSchedulerImpl - Removed TaskSet 174.0, whose tasks have all completed, from pool
14:51:34.467 INFO DAGScheduler - ResultStage 174 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.025 s
14:51:34.467 INFO DAGScheduler - Job 128 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:34.467 INFO TaskSchedulerImpl - Killing all running tasks in stage 174: Stage finished
14:51:34.467 INFO DAGScheduler - Job 128 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.026128 s
14:51:34.477 INFO MemoryStore - Block broadcast_340 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
14:51:34.483 INFO MemoryStore - Block broadcast_340_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
14:51:34.484 INFO BlockManagerInfo - Added broadcast_340_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:34.484 INFO SparkContext - Created broadcast 340 from newAPIHadoopFile at PathSplitSource.java:96
14:51:34.507 INFO MemoryStore - Block broadcast_341 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
14:51:34.514 INFO MemoryStore - Block broadcast_341_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:34.514 INFO BlockManagerInfo - Added broadcast_341_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:34.514 INFO SparkContext - Created broadcast 341 from newAPIHadoopFile at PathSplitSource.java:96
14:51:34.536 INFO FileInputFormat - Total input files to process : 1
14:51:34.538 INFO MemoryStore - Block broadcast_342 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
14:51:34.539 INFO MemoryStore - Block broadcast_342_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
14:51:34.539 INFO BlockManagerInfo - Added broadcast_342_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:34.539 INFO SparkContext - Created broadcast 342 from broadcast at ReadsSparkSink.java:133
14:51:34.541 INFO MemoryStore - Block broadcast_343 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
14:51:34.541 INFO MemoryStore - Block broadcast_343_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
14:51:34.542 INFO BlockManagerInfo - Added broadcast_343_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:34.542 INFO SparkContext - Created broadcast 343 from broadcast at BamSink.java:76
14:51:34.544 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:34.544 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:34.544 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:34.565 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:34.565 INFO DAGScheduler - Registering RDD 828 (mapToPair at SparkUtils.java:161) as input to shuffle 35
14:51:34.565 INFO DAGScheduler - Got job 129 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:34.565 INFO DAGScheduler - Final stage: ResultStage 176 (runJob at SparkHadoopWriter.scala:83)
14:51:34.565 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 175)
14:51:34.565 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 175)
14:51:34.566 INFO DAGScheduler - Submitting ShuffleMapStage 175 (MapPartitionsRDD[828] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:34.583 INFO MemoryStore - Block broadcast_344 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
14:51:34.585 INFO MemoryStore - Block broadcast_344_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.4 MiB)
14:51:34.585 INFO BlockManagerInfo - Added broadcast_344_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.1 MiB)
14:51:34.586 INFO SparkContext - Created broadcast 344 from broadcast at DAGScheduler.scala:1580
14:51:34.586 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 175 (MapPartitionsRDD[828] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:34.586 INFO TaskSchedulerImpl - Adding task set 175.0 with 1 tasks resource profile 0
14:51:34.587 INFO TaskSetManager - Starting task 0.0 in stage 175.0 (TID 231) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:34.587 INFO Executor - Running task 0.0 in stage 175.0 (TID 231)
14:51:34.626 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:34.643 INFO Executor - Finished task 0.0 in stage 175.0 (TID 231). 1148 bytes result sent to driver
14:51:34.643 INFO TaskSetManager - Finished task 0.0 in stage 175.0 (TID 231) in 57 ms on localhost (executor driver) (1/1)
14:51:34.643 INFO TaskSchedulerImpl - Removed TaskSet 175.0, whose tasks have all completed, from pool
14:51:34.643 INFO DAGScheduler - ShuffleMapStage 175 (mapToPair at SparkUtils.java:161) finished in 0.077 s
14:51:34.643 INFO DAGScheduler - looking for newly runnable stages
14:51:34.643 INFO DAGScheduler - running: HashSet()
14:51:34.643 INFO DAGScheduler - waiting: HashSet(ResultStage 176)
14:51:34.643 INFO DAGScheduler - failed: HashSet()
14:51:34.644 INFO DAGScheduler - Submitting ResultStage 176 (MapPartitionsRDD[833] at mapToPair at BamSink.java:91), which has no missing parents
14:51:34.655 INFO MemoryStore - Block broadcast_345 stored as values in memory (estimated size 241.4 KiB, free 1915.1 MiB)
14:51:34.656 INFO MemoryStore - Block broadcast_345_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.1 MiB)
14:51:34.657 INFO BlockManagerInfo - Added broadcast_345_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.1 MiB)
14:51:34.657 INFO SparkContext - Created broadcast 345 from broadcast at DAGScheduler.scala:1580
14:51:34.657 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 176 (MapPartitionsRDD[833] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:34.657 INFO TaskSchedulerImpl - Adding task set 176.0 with 1 tasks resource profile 0
14:51:34.658 INFO TaskSetManager - Starting task 0.0 in stage 176.0 (TID 232) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:34.658 INFO Executor - Running task 0.0 in stage 176.0 (TID 232)
14:51:34.663 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:34.663 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:34.675 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:34.675 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:34.675 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:34.676 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:34.676 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:34.676 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:34.706 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451346149169587361845121_0833_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace9760781724973800963/_temporary/0/task_202603041451346149169587361845121_0833_r_000000
14:51:34.706 INFO SparkHadoopMapRedUtil - attempt_202603041451346149169587361845121_0833_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:34.711 INFO Executor - Finished task 0.0 in stage 176.0 (TID 232). 1944 bytes result sent to driver
14:51:34.712 INFO BlockManagerInfo - Removed broadcast_336_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.1 MiB)
14:51:34.712 INFO TaskSetManager - Finished task 0.0 in stage 176.0 (TID 232) in 54 ms on localhost (executor driver) (1/1)
14:51:34.712 INFO TaskSchedulerImpl - Removed TaskSet 176.0, whose tasks have all completed, from pool
14:51:34.712 INFO BlockManagerInfo - Removed broadcast_334_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.2 MiB)
14:51:34.713 INFO DAGScheduler - ResultStage 176 (runJob at SparkHadoopWriter.scala:83) finished in 0.069 s
14:51:34.713 INFO DAGScheduler - Job 129 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:34.713 INFO TaskSchedulerImpl - Killing all running tasks in stage 176: Stage finished
14:51:34.713 INFO BlockManagerInfo - Removed broadcast_339_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.2 MiB)
14:51:34.713 INFO DAGScheduler - Job 129 finished: runJob at SparkHadoopWriter.scala:83, took 0.148348 s
14:51:34.713 INFO SparkHadoopWriter - Start to commit write Job job_202603041451346149169587361845121_0833.
14:51:34.714 INFO BlockManagerInfo - Removed broadcast_331_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:34.714 INFO BlockManagerInfo - Removed broadcast_337_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.3 MiB)
14:51:34.715 INFO BlockManagerInfo - Removed broadcast_332_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.3 MiB)
14:51:34.715 INFO BlockManagerInfo - Removed broadcast_338_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.4 MiB)
14:51:34.716 INFO BlockManagerInfo - Removed broadcast_333_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.6 MiB)
14:51:34.716 INFO BlockManagerInfo - Removed broadcast_341_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:34.717 INFO BlockManagerInfo - Removed broadcast_335_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1919.7 MiB)
14:51:34.717 INFO BlockManagerInfo - Removed broadcast_329_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:34.717 INFO BlockManagerInfo - Removed broadcast_344_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.9 MiB)
14:51:34.721 INFO SparkHadoopWriter - Write Job job_202603041451346149169587361845121_0833 committed. Elapsed time: 7 ms.
14:51:34.736 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest117429404460610509007.bam
14:51:34.741 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest117429404460610509007.bam done
14:51:34.741 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace9760781724973800963 to /tmp/ReadsSparkSinkUnitTest117429404460610509007.bam.sbi
14:51:34.748 INFO IndexFileMerger - Done merging .sbi files
14:51:34.748 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace9760781724973800963 to /tmp/ReadsSparkSinkUnitTest117429404460610509007.bam.bai
14:51:34.754 INFO IndexFileMerger - Done merging .bai files
14:51:34.757 INFO MemoryStore - Block broadcast_346 stored as values in memory (estimated size 13.3 KiB, free 1919.0 MiB)
14:51:34.757 INFO MemoryStore - Block broadcast_346_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1919.0 MiB)
14:51:34.757 INFO BlockManagerInfo - Added broadcast_346_piece0 in memory on localhost:44923 (size: 8.3 KiB, free: 1919.9 MiB)
14:51:34.758 INFO SparkContext - Created broadcast 346 from broadcast at BamSource.java:104
14:51:34.759 INFO MemoryStore - Block broadcast_347 stored as values in memory (estimated size 297.9 KiB, free 1918.7 MiB)
14:51:34.765 INFO MemoryStore - Block broadcast_347_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
14:51:34.765 INFO BlockManagerInfo - Added broadcast_347_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.8 MiB)
14:51:34.766 INFO SparkContext - Created broadcast 347 from newAPIHadoopFile at PathSplitSource.java:96
14:51:34.775 INFO FileInputFormat - Total input files to process : 1
14:51:34.790 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:34.790 INFO DAGScheduler - Got job 130 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:34.790 INFO DAGScheduler - Final stage: ResultStage 177 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:34.790 INFO DAGScheduler - Parents of final stage: List()
14:51:34.790 INFO DAGScheduler - Missing parents: List()
14:51:34.791 INFO DAGScheduler - Submitting ResultStage 177 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:34.800 INFO MemoryStore - Block broadcast_348 stored as values in memory (estimated size 148.2 KiB, free 1918.5 MiB)
14:51:34.800 INFO MemoryStore - Block broadcast_348_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.5 MiB)
14:51:34.801 INFO BlockManagerInfo - Added broadcast_348_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.8 MiB)
14:51:34.801 INFO SparkContext - Created broadcast 348 from broadcast at DAGScheduler.scala:1580
14:51:34.801 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 177 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:34.801 INFO TaskSchedulerImpl - Adding task set 177.0 with 1 tasks resource profile 0
14:51:34.802 INFO TaskSetManager - Starting task 0.0 in stage 177.0 (TID 233) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:34.802 INFO Executor - Running task 0.0 in stage 177.0 (TID 233)
14:51:34.815 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117429404460610509007.bam:0+237038
14:51:34.820 INFO Executor - Finished task 0.0 in stage 177.0 (TID 233). 651483 bytes result sent to driver
14:51:34.822 INFO TaskSetManager - Finished task 0.0 in stage 177.0 (TID 233) in 21 ms on localhost (executor driver) (1/1)
14:51:34.822 INFO TaskSchedulerImpl - Removed TaskSet 177.0, whose tasks have all completed, from pool
14:51:34.822 INFO DAGScheduler - ResultStage 177 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.031 s
14:51:34.822 INFO DAGScheduler - Job 130 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:34.822 INFO TaskSchedulerImpl - Killing all running tasks in stage 177: Stage finished
14:51:34.823 INFO DAGScheduler - Job 130 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.032791 s
14:51:34.836 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:34.837 INFO DAGScheduler - Got job 131 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:34.837 INFO DAGScheduler - Final stage: ResultStage 178 (count at ReadsSparkSinkUnitTest.java:185)
14:51:34.837 INFO DAGScheduler - Parents of final stage: List()
14:51:34.837 INFO DAGScheduler - Missing parents: List()
14:51:34.837 INFO DAGScheduler - Submitting ResultStage 178 (MapPartitionsRDD[821] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:34.854 INFO MemoryStore - Block broadcast_349 stored as values in memory (estimated size 426.1 KiB, free 1918.0 MiB)
14:51:34.856 INFO MemoryStore - Block broadcast_349_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.9 MiB)
14:51:34.856 INFO BlockManagerInfo - Added broadcast_349_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.6 MiB)
14:51:34.856 INFO SparkContext - Created broadcast 349 from broadcast at DAGScheduler.scala:1580
14:51:34.856 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 178 (MapPartitionsRDD[821] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:34.856 INFO TaskSchedulerImpl - Adding task set 178.0 with 1 tasks resource profile 0
14:51:34.857 INFO TaskSetManager - Starting task 0.0 in stage 178.0 (TID 234) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:34.857 INFO Executor - Running task 0.0 in stage 178.0 (TID 234)
14:51:34.893 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:34.903 INFO Executor - Finished task 0.0 in stage 178.0 (TID 234). 989 bytes result sent to driver
14:51:34.904 INFO TaskSetManager - Finished task 0.0 in stage 178.0 (TID 234) in 47 ms on localhost (executor driver) (1/1)
14:51:34.904 INFO TaskSchedulerImpl - Removed TaskSet 178.0, whose tasks have all completed, from pool
14:51:34.904 INFO DAGScheduler - ResultStage 178 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.067 s
14:51:34.904 INFO DAGScheduler - Job 131 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:34.904 INFO TaskSchedulerImpl - Killing all running tasks in stage 178: Stage finished
14:51:34.904 INFO DAGScheduler - Job 131 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.067747 s
14:51:34.908 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:34.908 INFO DAGScheduler - Got job 132 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:34.908 INFO DAGScheduler - Final stage: ResultStage 179 (count at ReadsSparkSinkUnitTest.java:185)
14:51:34.908 INFO DAGScheduler - Parents of final stage: List()
14:51:34.908 INFO DAGScheduler - Missing parents: List()
14:51:34.908 INFO DAGScheduler - Submitting ResultStage 179 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:34.914 INFO MemoryStore - Block broadcast_350 stored as values in memory (estimated size 148.1 KiB, free 1917.8 MiB)
14:51:34.915 INFO MemoryStore - Block broadcast_350_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.7 MiB)
14:51:34.915 INFO BlockManagerInfo - Added broadcast_350_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.6 MiB)
14:51:34.916 INFO SparkContext - Created broadcast 350 from broadcast at DAGScheduler.scala:1580
14:51:34.916 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 179 (MapPartitionsRDD[839] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:34.916 INFO TaskSchedulerImpl - Adding task set 179.0 with 1 tasks resource profile 0
14:51:34.916 INFO TaskSetManager - Starting task 0.0 in stage 179.0 (TID 235) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:34.917 INFO Executor - Running task 0.0 in stage 179.0 (TID 235)
14:51:34.929 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest117429404460610509007.bam:0+237038
14:51:34.933 INFO Executor - Finished task 0.0 in stage 179.0 (TID 235). 989 bytes result sent to driver
14:51:34.933 INFO TaskSetManager - Finished task 0.0 in stage 179.0 (TID 235) in 17 ms on localhost (executor driver) (1/1)
14:51:34.934 INFO TaskSchedulerImpl - Removed TaskSet 179.0, whose tasks have all completed, from pool
14:51:34.934 INFO DAGScheduler - ResultStage 179 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.025 s
14:51:34.934 INFO DAGScheduler - Job 132 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:34.934 INFO TaskSchedulerImpl - Killing all running tasks in stage 179: Stage finished
14:51:34.934 INFO DAGScheduler - Job 132 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.025958 s
14:51:34.943 INFO MemoryStore - Block broadcast_351 stored as values in memory (estimated size 297.9 KiB, free 1917.4 MiB)
14:51:34.950 INFO MemoryStore - Block broadcast_351_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.4 MiB)
14:51:34.950 INFO BlockManagerInfo - Added broadcast_351_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:34.950 INFO SparkContext - Created broadcast 351 from newAPIHadoopFile at PathSplitSource.java:96
14:51:34.973 INFO MemoryStore - Block broadcast_352 stored as values in memory (estimated size 297.9 KiB, free 1917.1 MiB)
14:51:34.984 INFO MemoryStore - Block broadcast_352_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
14:51:34.984 INFO BlockManagerInfo - Added broadcast_352_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:34.985 INFO SparkContext - Created broadcast 352 from newAPIHadoopFile at PathSplitSource.java:96
14:51:35.009 INFO FileInputFormat - Total input files to process : 1
14:51:35.011 INFO MemoryStore - Block broadcast_353 stored as values in memory (estimated size 160.7 KiB, free 1916.9 MiB)
14:51:35.011 INFO MemoryStore - Block broadcast_353_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.9 MiB)
14:51:35.011 INFO BlockManagerInfo - Added broadcast_353_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:35.012 INFO SparkContext - Created broadcast 353 from broadcast at ReadsSparkSink.java:133
14:51:35.012 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:35.013 INFO MemoryStore - Block broadcast_354 stored as values in memory (estimated size 163.2 KiB, free 1916.7 MiB)
14:51:35.014 INFO MemoryStore - Block broadcast_354_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.7 MiB)
14:51:35.014 INFO BlockManagerInfo - Added broadcast_354_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:35.014 INFO SparkContext - Created broadcast 354 from broadcast at BamSink.java:76
14:51:35.016 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:35.016 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:35.016 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:35.035 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:35.036 INFO DAGScheduler - Registering RDD 853 (mapToPair at SparkUtils.java:161) as input to shuffle 36
14:51:35.036 INFO DAGScheduler - Got job 133 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:35.036 INFO DAGScheduler - Final stage: ResultStage 181 (runJob at SparkHadoopWriter.scala:83)
14:51:35.036 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 180)
14:51:35.036 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 180)
14:51:35.036 INFO DAGScheduler - Submitting ShuffleMapStage 180 (MapPartitionsRDD[853] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:35.054 INFO MemoryStore - Block broadcast_355 stored as values in memory (estimated size 520.4 KiB, free 1916.2 MiB)
14:51:35.055 INFO MemoryStore - Block broadcast_355_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.0 MiB)
14:51:35.055 INFO BlockManagerInfo - Added broadcast_355_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.3 MiB)
14:51:35.056 INFO SparkContext - Created broadcast 355 from broadcast at DAGScheduler.scala:1580
14:51:35.056 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 180 (MapPartitionsRDD[853] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:35.056 INFO TaskSchedulerImpl - Adding task set 180.0 with 1 tasks resource profile 0
14:51:35.056 INFO TaskSetManager - Starting task 0.0 in stage 180.0 (TID 236) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:35.057 INFO Executor - Running task 0.0 in stage 180.0 (TID 236)
14:51:35.093 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:35.109 INFO Executor - Finished task 0.0 in stage 180.0 (TID 236). 1148 bytes result sent to driver
14:51:35.109 INFO TaskSetManager - Finished task 0.0 in stage 180.0 (TID 236) in 53 ms on localhost (executor driver) (1/1)
14:51:35.109 INFO TaskSchedulerImpl - Removed TaskSet 180.0, whose tasks have all completed, from pool
14:51:35.110 INFO DAGScheduler - ShuffleMapStage 180 (mapToPair at SparkUtils.java:161) finished in 0.074 s
14:51:35.110 INFO DAGScheduler - looking for newly runnable stages
14:51:35.110 INFO DAGScheduler - running: HashSet()
14:51:35.110 INFO DAGScheduler - waiting: HashSet(ResultStage 181)
14:51:35.110 INFO DAGScheduler - failed: HashSet()
14:51:35.110 INFO DAGScheduler - Submitting ResultStage 181 (MapPartitionsRDD[858] at mapToPair at BamSink.java:91), which has no missing parents
14:51:35.117 INFO MemoryStore - Block broadcast_356 stored as values in memory (estimated size 241.4 KiB, free 1915.8 MiB)
14:51:35.118 INFO MemoryStore - Block broadcast_356_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.7 MiB)
14:51:35.118 INFO BlockManagerInfo - Added broadcast_356_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.2 MiB)
14:51:35.118 INFO SparkContext - Created broadcast 356 from broadcast at DAGScheduler.scala:1580
14:51:35.118 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 181 (MapPartitionsRDD[858] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:35.118 INFO TaskSchedulerImpl - Adding task set 181.0 with 1 tasks resource profile 0
14:51:35.119 INFO TaskSetManager - Starting task 0.0 in stage 181.0 (TID 237) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:35.120 INFO Executor - Running task 0.0 in stage 181.0 (TID 237)
14:51:35.124 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:35.124 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:35.136 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:35.136 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:35.136 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:35.136 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:35.136 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:35.136 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:35.158 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451357127027696941623097_0858_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace4534331730913351225/_temporary/0/task_202603041451357127027696941623097_0858_r_000000
14:51:35.158 INFO SparkHadoopMapRedUtil - attempt_202603041451357127027696941623097_0858_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:35.159 INFO Executor - Finished task 0.0 in stage 181.0 (TID 237). 1858 bytes result sent to driver
14:51:35.159 INFO TaskSetManager - Finished task 0.0 in stage 181.0 (TID 237) in 40 ms on localhost (executor driver) (1/1)
14:51:35.160 INFO TaskSchedulerImpl - Removed TaskSet 181.0, whose tasks have all completed, from pool
14:51:35.160 INFO DAGScheduler - ResultStage 181 (runJob at SparkHadoopWriter.scala:83) finished in 0.050 s
14:51:35.160 INFO DAGScheduler - Job 133 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:35.160 INFO TaskSchedulerImpl - Killing all running tasks in stage 181: Stage finished
14:51:35.160 INFO DAGScheduler - Job 133 finished: runJob at SparkHadoopWriter.scala:83, took 0.125006 s
14:51:35.160 INFO SparkHadoopWriter - Start to commit write Job job_202603041451357127027696941623097_0858.
14:51:35.167 INFO SparkHadoopWriter - Write Job job_202603041451357127027696941623097_0858 committed. Elapsed time: 6 ms.
14:51:35.182 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest110243289822368755951.bam
14:51:35.188 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest110243289822368755951.bam done
14:51:35.188 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace4534331730913351225 to /tmp/ReadsSparkSinkUnitTest110243289822368755951.bam.bai
14:51:35.193 INFO IndexFileMerger - Done merging .bai files
14:51:35.196 INFO MemoryStore - Block broadcast_357 stored as values in memory (estimated size 297.9 KiB, free 1915.4 MiB)
14:51:35.203 INFO MemoryStore - Block broadcast_357_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.4 MiB)
14:51:35.203 INFO BlockManagerInfo - Added broadcast_357_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.2 MiB)
14:51:35.203 INFO SparkContext - Created broadcast 357 from newAPIHadoopFile at PathSplitSource.java:96
14:51:35.224 INFO FileInputFormat - Total input files to process : 1
14:51:35.260 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:35.260 INFO DAGScheduler - Got job 134 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:35.260 INFO DAGScheduler - Final stage: ResultStage 182 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:35.260 INFO DAGScheduler - Parents of final stage: List()
14:51:35.260 INFO DAGScheduler - Missing parents: List()
14:51:35.260 INFO DAGScheduler - Submitting ResultStage 182 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:35.277 INFO MemoryStore - Block broadcast_358 stored as values in memory (estimated size 426.2 KiB, free 1915.0 MiB)
14:51:35.283 INFO BlockManagerInfo - Removed broadcast_356_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.2 MiB)
14:51:35.283 INFO MemoryStore - Block broadcast_358_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.1 MiB)
14:51:35.283 INFO BlockManagerInfo - Added broadcast_358_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.1 MiB)
14:51:35.283 INFO BlockManagerInfo - Removed broadcast_354_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.1 MiB)
14:51:35.284 INFO SparkContext - Created broadcast 358 from broadcast at DAGScheduler.scala:1580
14:51:35.284 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 182 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:35.284 INFO TaskSchedulerImpl - Adding task set 182.0 with 1 tasks resource profile 0
14:51:35.284 INFO BlockManagerInfo - Removed broadcast_349_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.2 MiB)
14:51:35.284 INFO TaskSetManager - Starting task 0.0 in stage 182.0 (TID 238) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:35.285 INFO Executor - Running task 0.0 in stage 182.0 (TID 238)
14:51:35.286 INFO BlockManagerInfo - Removed broadcast_352_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:35.287 INFO BlockManagerInfo - Removed broadcast_355_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.4 MiB)
14:51:35.287 INFO BlockManagerInfo - Removed broadcast_342_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:35.288 INFO BlockManagerInfo - Removed broadcast_345_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.5 MiB)
14:51:35.289 INFO BlockManagerInfo - Removed broadcast_353_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:35.289 INFO BlockManagerInfo - Removed broadcast_343_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:35.290 INFO BlockManagerInfo - Removed broadcast_347_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:35.290 INFO BlockManagerInfo - Removed broadcast_340_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:35.291 INFO BlockManagerInfo - Removed broadcast_348_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.7 MiB)
14:51:35.291 INFO BlockManagerInfo - Removed broadcast_350_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.7 MiB)
14:51:35.291 INFO BlockManagerInfo - Removed broadcast_346_piece0 on localhost:44923 in memory (size: 8.3 KiB, free: 1919.8 MiB)
14:51:35.322 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest110243289822368755951.bam:0+237038
14:51:35.335 INFO Executor - Finished task 0.0 in stage 182.0 (TID 238). 651483 bytes result sent to driver
14:51:35.337 INFO TaskSetManager - Finished task 0.0 in stage 182.0 (TID 238) in 53 ms on localhost (executor driver) (1/1)
14:51:35.337 INFO TaskSchedulerImpl - Removed TaskSet 182.0, whose tasks have all completed, from pool
14:51:35.338 INFO DAGScheduler - ResultStage 182 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.076 s
14:51:35.338 INFO DAGScheduler - Job 134 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:35.338 INFO TaskSchedulerImpl - Killing all running tasks in stage 182: Stage finished
14:51:35.338 INFO DAGScheduler - Job 134 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.078094 s
14:51:35.347 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:35.348 INFO DAGScheduler - Got job 135 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:35.348 INFO DAGScheduler - Final stage: ResultStage 183 (count at ReadsSparkSinkUnitTest.java:185)
14:51:35.348 INFO DAGScheduler - Parents of final stage: List()
14:51:35.348 INFO DAGScheduler - Missing parents: List()
14:51:35.348 INFO DAGScheduler - Submitting ResultStage 183 (MapPartitionsRDD[846] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:35.365 INFO MemoryStore - Block broadcast_359 stored as values in memory (estimated size 426.1 KiB, free 1918.3 MiB)
14:51:35.366 INFO MemoryStore - Block broadcast_359_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.2 MiB)
14:51:35.367 INFO BlockManagerInfo - Added broadcast_359_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.6 MiB)
14:51:35.367 INFO SparkContext - Created broadcast 359 from broadcast at DAGScheduler.scala:1580
14:51:35.367 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 183 (MapPartitionsRDD[846] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:35.367 INFO TaskSchedulerImpl - Adding task set 183.0 with 1 tasks resource profile 0
14:51:35.368 INFO TaskSetManager - Starting task 0.0 in stage 183.0 (TID 239) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:35.368 INFO Executor - Running task 0.0 in stage 183.0 (TID 239)
14:51:35.402 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:35.412 INFO Executor - Finished task 0.0 in stage 183.0 (TID 239). 989 bytes result sent to driver
14:51:35.413 INFO TaskSetManager - Finished task 0.0 in stage 183.0 (TID 239) in 46 ms on localhost (executor driver) (1/1)
14:51:35.413 INFO TaskSchedulerImpl - Removed TaskSet 183.0, whose tasks have all completed, from pool
14:51:35.413 INFO DAGScheduler - ResultStage 183 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
14:51:35.413 INFO DAGScheduler - Job 135 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:35.413 INFO TaskSchedulerImpl - Killing all running tasks in stage 183: Stage finished
14:51:35.413 INFO DAGScheduler - Job 135 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.065880 s
14:51:35.417 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:35.417 INFO DAGScheduler - Got job 136 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:35.417 INFO DAGScheduler - Final stage: ResultStage 184 (count at ReadsSparkSinkUnitTest.java:185)
14:51:35.417 INFO DAGScheduler - Parents of final stage: List()
14:51:35.417 INFO DAGScheduler - Missing parents: List()
14:51:35.417 INFO DAGScheduler - Submitting ResultStage 184 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:35.434 INFO MemoryStore - Block broadcast_360 stored as values in memory (estimated size 426.1 KiB, free 1917.8 MiB)
14:51:35.435 INFO MemoryStore - Block broadcast_360_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
14:51:35.435 INFO BlockManagerInfo - Added broadcast_360_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:35.436 INFO SparkContext - Created broadcast 360 from broadcast at DAGScheduler.scala:1580
14:51:35.436 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 184 (MapPartitionsRDD[865] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:35.436 INFO TaskSchedulerImpl - Adding task set 184.0 with 1 tasks resource profile 0
14:51:35.436 INFO TaskSetManager - Starting task 0.0 in stage 184.0 (TID 240) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:35.437 INFO Executor - Running task 0.0 in stage 184.0 (TID 240)
14:51:35.469 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest110243289822368755951.bam:0+237038
14:51:35.481 INFO Executor - Finished task 0.0 in stage 184.0 (TID 240). 989 bytes result sent to driver
14:51:35.481 INFO TaskSetManager - Finished task 0.0 in stage 184.0 (TID 240) in 45 ms on localhost (executor driver) (1/1)
14:51:35.482 INFO TaskSchedulerImpl - Removed TaskSet 184.0, whose tasks have all completed, from pool
14:51:35.482 INFO DAGScheduler - ResultStage 184 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
14:51:35.482 INFO DAGScheduler - Job 136 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:35.482 INFO TaskSchedulerImpl - Killing all running tasks in stage 184: Stage finished
14:51:35.482 INFO DAGScheduler - Job 136 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.065088 s
14:51:35.491 INFO MemoryStore - Block broadcast_361 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
14:51:35.498 INFO MemoryStore - Block broadcast_361_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
14:51:35.498 INFO BlockManagerInfo - Added broadcast_361_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:35.498 INFO SparkContext - Created broadcast 361 from newAPIHadoopFile at PathSplitSource.java:96
14:51:35.522 INFO MemoryStore - Block broadcast_362 stored as values in memory (estimated size 297.9 KiB, free 1917.0 MiB)
14:51:35.528 INFO MemoryStore - Block broadcast_362_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
14:51:35.528 INFO BlockManagerInfo - Added broadcast_362_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:35.529 INFO SparkContext - Created broadcast 362 from newAPIHadoopFile at PathSplitSource.java:96
14:51:35.550 INFO FileInputFormat - Total input files to process : 1
14:51:35.552 INFO MemoryStore - Block broadcast_363 stored as values in memory (estimated size 160.7 KiB, free 1916.8 MiB)
14:51:35.553 INFO MemoryStore - Block broadcast_363_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.8 MiB)
14:51:35.553 INFO BlockManagerInfo - Added broadcast_363_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:35.553 INFO SparkContext - Created broadcast 363 from broadcast at ReadsSparkSink.java:133
14:51:35.554 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:35.555 INFO MemoryStore - Block broadcast_364 stored as values in memory (estimated size 163.2 KiB, free 1916.6 MiB)
14:51:35.555 INFO MemoryStore - Block broadcast_364_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.6 MiB)
14:51:35.556 INFO BlockManagerInfo - Added broadcast_364_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:35.556 INFO SparkContext - Created broadcast 364 from broadcast at BamSink.java:76
14:51:35.558 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:35.558 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:35.558 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:35.576 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:35.577 INFO DAGScheduler - Registering RDD 879 (mapToPair at SparkUtils.java:161) as input to shuffle 37
14:51:35.577 INFO DAGScheduler - Got job 137 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:35.577 INFO DAGScheduler - Final stage: ResultStage 186 (runJob at SparkHadoopWriter.scala:83)
14:51:35.577 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 185)
14:51:35.577 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 185)
14:51:35.577 INFO DAGScheduler - Submitting ShuffleMapStage 185 (MapPartitionsRDD[879] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:35.596 INFO MemoryStore - Block broadcast_365 stored as values in memory (estimated size 520.4 KiB, free 1916.1 MiB)
14:51:35.597 INFO MemoryStore - Block broadcast_365_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
14:51:35.598 INFO BlockManagerInfo - Added broadcast_365_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.2 MiB)
14:51:35.598 INFO SparkContext - Created broadcast 365 from broadcast at DAGScheduler.scala:1580
14:51:35.598 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 185 (MapPartitionsRDD[879] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:35.598 INFO TaskSchedulerImpl - Adding task set 185.0 with 1 tasks resource profile 0
14:51:35.599 INFO TaskSetManager - Starting task 0.0 in stage 185.0 (TID 241) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:35.599 INFO Executor - Running task 0.0 in stage 185.0 (TID 241)
14:51:35.633 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:35.650 INFO Executor - Finished task 0.0 in stage 185.0 (TID 241). 1148 bytes result sent to driver
14:51:35.651 INFO TaskSetManager - Finished task 0.0 in stage 185.0 (TID 241) in 52 ms on localhost (executor driver) (1/1)
14:51:35.651 INFO TaskSchedulerImpl - Removed TaskSet 185.0, whose tasks have all completed, from pool
14:51:35.651 INFO DAGScheduler - ShuffleMapStage 185 (mapToPair at SparkUtils.java:161) finished in 0.074 s
14:51:35.651 INFO DAGScheduler - looking for newly runnable stages
14:51:35.651 INFO DAGScheduler - running: HashSet()
14:51:35.651 INFO DAGScheduler - waiting: HashSet(ResultStage 186)
14:51:35.651 INFO DAGScheduler - failed: HashSet()
14:51:35.651 INFO DAGScheduler - Submitting ResultStage 186 (MapPartitionsRDD[884] at mapToPair at BamSink.java:91), which has no missing parents
14:51:35.659 INFO MemoryStore - Block broadcast_366 stored as values in memory (estimated size 241.4 KiB, free 1915.7 MiB)
14:51:35.659 INFO MemoryStore - Block broadcast_366_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1915.6 MiB)
14:51:35.660 INFO BlockManagerInfo - Added broadcast_366_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.1 MiB)
14:51:35.660 INFO SparkContext - Created broadcast 366 from broadcast at DAGScheduler.scala:1580
14:51:35.660 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 186 (MapPartitionsRDD[884] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:35.660 INFO TaskSchedulerImpl - Adding task set 186.0 with 1 tasks resource profile 0
14:51:35.661 INFO TaskSetManager - Starting task 0.0 in stage 186.0 (TID 242) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:35.661 INFO Executor - Running task 0.0 in stage 186.0 (TID 242)
14:51:35.666 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:35.666 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:35.679 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:35.679 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:35.679 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:35.679 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:35.679 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:35.679 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:35.702 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451359212397425150553542_0884_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace18312553957083968231/_temporary/0/task_202603041451359212397425150553542_0884_r_000000
14:51:35.702 INFO SparkHadoopMapRedUtil - attempt_202603041451359212397425150553542_0884_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:35.702 INFO Executor - Finished task 0.0 in stage 186.0 (TID 242). 1858 bytes result sent to driver
14:51:35.703 INFO TaskSetManager - Finished task 0.0 in stage 186.0 (TID 242) in 42 ms on localhost (executor driver) (1/1)
14:51:35.703 INFO TaskSchedulerImpl - Removed TaskSet 186.0, whose tasks have all completed, from pool
14:51:35.703 INFO DAGScheduler - ResultStage 186 (runJob at SparkHadoopWriter.scala:83) finished in 0.051 s
14:51:35.703 INFO DAGScheduler - Job 137 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:35.703 INFO TaskSchedulerImpl - Killing all running tasks in stage 186: Stage finished
14:51:35.703 INFO DAGScheduler - Job 137 finished: runJob at SparkHadoopWriter.scala:83, took 0.127199 s
14:51:35.704 INFO SparkHadoopWriter - Start to commit write Job job_202603041451359212397425150553542_0884.
14:51:35.710 INFO SparkHadoopWriter - Write Job job_202603041451359212397425150553542_0884 committed. Elapsed time: 5 ms.
14:51:35.724 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest12614682851569572773.bam
14:51:35.730 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest12614682851569572773.bam done
14:51:35.730 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest1.someOtherPlace18312553957083968231 to /tmp/ReadsSparkSinkUnitTest12614682851569572773.bam.sbi
14:51:35.737 INFO IndexFileMerger - Done merging .sbi files
14:51:35.739 INFO MemoryStore - Block broadcast_367 stored as values in memory (estimated size 320.0 B, free 1915.6 MiB)
14:51:35.739 INFO MemoryStore - Block broadcast_367_piece0 stored as bytes in memory (estimated size 233.0 B, free 1915.6 MiB)
14:51:35.739 INFO BlockManagerInfo - Added broadcast_367_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.1 MiB)
14:51:35.740 INFO SparkContext - Created broadcast 367 from broadcast at BamSource.java:104
14:51:35.741 INFO MemoryStore - Block broadcast_368 stored as values in memory (estimated size 297.9 KiB, free 1915.3 MiB)
14:51:35.747 INFO MemoryStore - Block broadcast_368_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.3 MiB)
14:51:35.748 INFO BlockManagerInfo - Added broadcast_368_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.1 MiB)
14:51:35.748 INFO SparkContext - Created broadcast 368 from newAPIHadoopFile at PathSplitSource.java:96
14:51:35.758 INFO FileInputFormat - Total input files to process : 1
14:51:35.773 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:35.773 INFO DAGScheduler - Got job 138 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:35.773 INFO DAGScheduler - Final stage: ResultStage 187 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:35.773 INFO DAGScheduler - Parents of final stage: List()
14:51:35.773 INFO DAGScheduler - Missing parents: List()
14:51:35.774 INFO DAGScheduler - Submitting ResultStage 187 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:35.780 INFO MemoryStore - Block broadcast_369 stored as values in memory (estimated size 148.2 KiB, free 1915.1 MiB)
14:51:35.781 INFO MemoryStore - Block broadcast_369_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1915.1 MiB)
14:51:35.781 INFO BlockManagerInfo - Added broadcast_369_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.0 MiB)
14:51:35.781 INFO SparkContext - Created broadcast 369 from broadcast at DAGScheduler.scala:1580
14:51:35.781 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 187 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:35.781 INFO TaskSchedulerImpl - Adding task set 187.0 with 1 tasks resource profile 0
14:51:35.782 INFO TaskSetManager - Starting task 0.0 in stage 187.0 (TID 243) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:35.782 INFO Executor - Running task 0.0 in stage 187.0 (TID 243)
14:51:35.794 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest12614682851569572773.bam:0+237038
14:51:35.799 INFO Executor - Finished task 0.0 in stage 187.0 (TID 243). 651483 bytes result sent to driver
14:51:35.801 INFO TaskSetManager - Finished task 0.0 in stage 187.0 (TID 243) in 19 ms on localhost (executor driver) (1/1)
14:51:35.801 INFO TaskSchedulerImpl - Removed TaskSet 187.0, whose tasks have all completed, from pool
14:51:35.801 INFO DAGScheduler - ResultStage 187 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.027 s
14:51:35.801 INFO DAGScheduler - Job 138 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:35.801 INFO TaskSchedulerImpl - Killing all running tasks in stage 187: Stage finished
14:51:35.801 INFO DAGScheduler - Job 138 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.028209 s
14:51:35.811 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:35.811 INFO DAGScheduler - Got job 139 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:35.811 INFO DAGScheduler - Final stage: ResultStage 188 (count at ReadsSparkSinkUnitTest.java:185)
14:51:35.811 INFO DAGScheduler - Parents of final stage: List()
14:51:35.811 INFO DAGScheduler - Missing parents: List()
14:51:35.811 INFO DAGScheduler - Submitting ResultStage 188 (MapPartitionsRDD[872] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:35.829 INFO MemoryStore - Block broadcast_370 stored as values in memory (estimated size 426.1 KiB, free 1914.7 MiB)
14:51:35.834 INFO BlockManagerInfo - Removed broadcast_366_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.1 MiB)
14:51:35.835 INFO BlockManagerInfo - Removed broadcast_360_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.2 MiB)
14:51:35.835 INFO MemoryStore - Block broadcast_370_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.4 MiB)
14:51:35.835 INFO BlockManagerInfo - Added broadcast_370_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.1 MiB)
14:51:35.835 INFO SparkContext - Created broadcast 370 from broadcast at DAGScheduler.scala:1580
14:51:35.836 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 188 (MapPartitionsRDD[872] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:35.836 INFO TaskSchedulerImpl - Adding task set 188.0 with 1 tasks resource profile 0
14:51:35.836 INFO BlockManagerInfo - Removed broadcast_358_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.2 MiB)
14:51:35.837 INFO TaskSetManager - Starting task 0.0 in stage 188.0 (TID 244) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:35.837 INFO BlockManagerInfo - Removed broadcast_363_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:35.837 INFO Executor - Running task 0.0 in stage 188.0 (TID 244)
14:51:35.837 INFO BlockManagerInfo - Removed broadcast_357_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:35.838 INFO BlockManagerInfo - Removed broadcast_362_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:35.838 INFO BlockManagerInfo - Removed broadcast_364_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.3 MiB)
14:51:35.839 INFO BlockManagerInfo - Removed broadcast_365_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.5 MiB)
14:51:35.839 INFO BlockManagerInfo - Removed broadcast_369_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.6 MiB)
14:51:35.840 INFO BlockManagerInfo - Removed broadcast_359_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:35.840 INFO BlockManagerInfo - Removed broadcast_351_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:35.872 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:35.882 INFO Executor - Finished task 0.0 in stage 188.0 (TID 244). 989 bytes result sent to driver
14:51:35.883 INFO TaskSetManager - Finished task 0.0 in stage 188.0 (TID 244) in 47 ms on localhost (executor driver) (1/1)
14:51:35.883 INFO TaskSchedulerImpl - Removed TaskSet 188.0, whose tasks have all completed, from pool
14:51:35.883 INFO DAGScheduler - ResultStage 188 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.071 s
14:51:35.883 INFO DAGScheduler - Job 139 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:35.883 INFO TaskSchedulerImpl - Killing all running tasks in stage 188: Stage finished
14:51:35.883 INFO DAGScheduler - Job 139 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.072618 s
14:51:35.887 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:35.887 INFO DAGScheduler - Got job 140 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:35.887 INFO DAGScheduler - Final stage: ResultStage 189 (count at ReadsSparkSinkUnitTest.java:185)
14:51:35.887 INFO DAGScheduler - Parents of final stage: List()
14:51:35.887 INFO DAGScheduler - Missing parents: List()
14:51:35.887 INFO DAGScheduler - Submitting ResultStage 189 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:35.894 INFO MemoryStore - Block broadcast_371 stored as values in memory (estimated size 148.1 KiB, free 1918.6 MiB)
14:51:35.895 INFO MemoryStore - Block broadcast_371_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.6 MiB)
14:51:35.895 INFO BlockManagerInfo - Added broadcast_371_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.7 MiB)
14:51:35.895 INFO SparkContext - Created broadcast 371 from broadcast at DAGScheduler.scala:1580
14:51:35.895 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 189 (MapPartitionsRDD[890] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:35.895 INFO TaskSchedulerImpl - Adding task set 189.0 with 1 tasks resource profile 0
14:51:35.896 INFO TaskSetManager - Starting task 0.0 in stage 189.0 (TID 245) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:35.896 INFO Executor - Running task 0.0 in stage 189.0 (TID 245)
14:51:35.909 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest12614682851569572773.bam:0+237038
14:51:35.912 INFO Executor - Finished task 0.0 in stage 189.0 (TID 245). 989 bytes result sent to driver
14:51:35.913 INFO TaskSetManager - Finished task 0.0 in stage 189.0 (TID 245) in 17 ms on localhost (executor driver) (1/1)
14:51:35.913 INFO TaskSchedulerImpl - Removed TaskSet 189.0, whose tasks have all completed, from pool
14:51:35.913 INFO DAGScheduler - ResultStage 189 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.025 s
14:51:35.913 INFO DAGScheduler - Job 140 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:35.913 INFO TaskSchedulerImpl - Killing all running tasks in stage 189: Stage finished
14:51:35.913 INFO DAGScheduler - Job 140 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.026182 s
14:51:35.923 INFO MemoryStore - Block broadcast_372 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
14:51:35.931 INFO MemoryStore - Block broadcast_372_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
14:51:35.931 INFO BlockManagerInfo - Added broadcast_372_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:35.932 INFO SparkContext - Created broadcast 372 from newAPIHadoopFile at PathSplitSource.java:96
14:51:35.963 INFO MemoryStore - Block broadcast_373 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
14:51:35.970 INFO MemoryStore - Block broadcast_373_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
14:51:35.970 INFO BlockManagerInfo - Added broadcast_373_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:35.970 INFO SparkContext - Created broadcast 373 from newAPIHadoopFile at PathSplitSource.java:96
14:51:35.992 INFO FileInputFormat - Total input files to process : 1
14:51:35.994 INFO MemoryStore - Block broadcast_374 stored as values in memory (estimated size 160.7 KiB, free 1917.7 MiB)
14:51:35.995 INFO MemoryStore - Block broadcast_374_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
14:51:35.995 INFO BlockManagerInfo - Added broadcast_374_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:35.995 INFO SparkContext - Created broadcast 374 from broadcast at ReadsSparkSink.java:133
14:51:35.996 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:35.996 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:35.997 INFO MemoryStore - Block broadcast_375 stored as values in memory (estimated size 163.2 KiB, free 1917.5 MiB)
14:51:35.997 INFO MemoryStore - Block broadcast_375_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.5 MiB)
14:51:35.997 INFO BlockManagerInfo - Added broadcast_375_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:35.998 INFO SparkContext - Created broadcast 375 from broadcast at BamSink.java:76
14:51:36.000 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:36.000 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:36.000 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:36.020 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:36.021 INFO DAGScheduler - Registering RDD 904 (mapToPair at SparkUtils.java:161) as input to shuffle 38
14:51:36.021 INFO DAGScheduler - Got job 141 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:36.021 INFO DAGScheduler - Final stage: ResultStage 191 (runJob at SparkHadoopWriter.scala:83)
14:51:36.021 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 190)
14:51:36.021 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 190)
14:51:36.022 INFO DAGScheduler - Submitting ShuffleMapStage 190 (MapPartitionsRDD[904] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:36.040 INFO MemoryStore - Block broadcast_376 stored as values in memory (estimated size 520.4 KiB, free 1917.0 MiB)
14:51:36.041 INFO MemoryStore - Block broadcast_376_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.9 MiB)
14:51:36.041 INFO BlockManagerInfo - Added broadcast_376_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.4 MiB)
14:51:36.042 INFO SparkContext - Created broadcast 376 from broadcast at DAGScheduler.scala:1580
14:51:36.042 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 190 (MapPartitionsRDD[904] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:36.042 INFO TaskSchedulerImpl - Adding task set 190.0 with 1 tasks resource profile 0
14:51:36.042 INFO TaskSetManager - Starting task 0.0 in stage 190.0 (TID 246) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:36.043 INFO Executor - Running task 0.0 in stage 190.0 (TID 246)
14:51:36.079 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:36.095 INFO Executor - Finished task 0.0 in stage 190.0 (TID 246). 1148 bytes result sent to driver
14:51:36.096 INFO TaskSetManager - Finished task 0.0 in stage 190.0 (TID 246) in 54 ms on localhost (executor driver) (1/1)
14:51:36.096 INFO TaskSchedulerImpl - Removed TaskSet 190.0, whose tasks have all completed, from pool
14:51:36.096 INFO DAGScheduler - ShuffleMapStage 190 (mapToPair at SparkUtils.java:161) finished in 0.074 s
14:51:36.096 INFO DAGScheduler - looking for newly runnable stages
14:51:36.096 INFO DAGScheduler - running: HashSet()
14:51:36.096 INFO DAGScheduler - waiting: HashSet(ResultStage 191)
14:51:36.096 INFO DAGScheduler - failed: HashSet()
14:51:36.096 INFO DAGScheduler - Submitting ResultStage 191 (MapPartitionsRDD[909] at mapToPair at BamSink.java:91), which has no missing parents
14:51:36.103 INFO MemoryStore - Block broadcast_377 stored as values in memory (estimated size 241.4 KiB, free 1916.6 MiB)
14:51:36.104 INFO MemoryStore - Block broadcast_377_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.6 MiB)
14:51:36.105 INFO BlockManagerInfo - Added broadcast_377_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.4 MiB)
14:51:36.105 INFO SparkContext - Created broadcast 377 from broadcast at DAGScheduler.scala:1580
14:51:36.105 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 191 (MapPartitionsRDD[909] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:36.105 INFO TaskSchedulerImpl - Adding task set 191.0 with 1 tasks resource profile 0
14:51:36.106 INFO TaskSetManager - Starting task 0.0 in stage 191.0 (TID 247) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:36.106 INFO Executor - Running task 0.0 in stage 191.0 (TID 247)
14:51:36.111 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:36.111 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:36.122 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:36.122 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:36.122 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:36.123 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:36.123 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:36.123 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:36.141 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451353361217414611718685_0909_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest1.someOtherPlace12017073750189107217/_temporary/0/task_202603041451353361217414611718685_0909_r_000000
14:51:36.141 INFO SparkHadoopMapRedUtil - attempt_202603041451353361217414611718685_0909_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:36.142 INFO Executor - Finished task 0.0 in stage 191.0 (TID 247). 1858 bytes result sent to driver
14:51:36.142 INFO TaskSetManager - Finished task 0.0 in stage 191.0 (TID 247) in 36 ms on localhost (executor driver) (1/1)
14:51:36.142 INFO TaskSchedulerImpl - Removed TaskSet 191.0, whose tasks have all completed, from pool
14:51:36.142 INFO DAGScheduler - ResultStage 191 (runJob at SparkHadoopWriter.scala:83) finished in 0.045 s
14:51:36.142 INFO DAGScheduler - Job 141 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:36.142 INFO TaskSchedulerImpl - Killing all running tasks in stage 191: Stage finished
14:51:36.143 INFO DAGScheduler - Job 141 finished: runJob at SparkHadoopWriter.scala:83, took 0.122156 s
14:51:36.143 INFO SparkHadoopWriter - Start to commit write Job job_202603041451353361217414611718685_0909.
14:51:36.148 INFO SparkHadoopWriter - Write Job job_202603041451353361217414611718685_0909 committed. Elapsed time: 5 ms.
14:51:36.163 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest115827931374258641133.bam
14:51:36.168 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest115827931374258641133.bam done
14:51:36.170 INFO MemoryStore - Block broadcast_378 stored as values in memory (estimated size 297.9 KiB, free 1916.3 MiB)
14:51:36.177 INFO MemoryStore - Block broadcast_378_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.2 MiB)
14:51:36.177 INFO BlockManagerInfo - Added broadcast_378_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:36.177 INFO SparkContext - Created broadcast 378 from newAPIHadoopFile at PathSplitSource.java:96
14:51:36.198 INFO FileInputFormat - Total input files to process : 1
14:51:36.234 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:36.235 INFO DAGScheduler - Got job 142 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:36.235 INFO DAGScheduler - Final stage: ResultStage 192 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:36.235 INFO DAGScheduler - Parents of final stage: List()
14:51:36.235 INFO DAGScheduler - Missing parents: List()
14:51:36.235 INFO DAGScheduler - Submitting ResultStage 192 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:36.252 INFO MemoryStore - Block broadcast_379 stored as values in memory (estimated size 426.2 KiB, free 1915.8 MiB)
14:51:36.253 INFO MemoryStore - Block broadcast_379_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1915.7 MiB)
14:51:36.253 INFO BlockManagerInfo - Added broadcast_379_piece0 in memory on localhost:44923 (size: 153.7 KiB, free: 1919.2 MiB)
14:51:36.254 INFO SparkContext - Created broadcast 379 from broadcast at DAGScheduler.scala:1580
14:51:36.254 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 192 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:36.254 INFO TaskSchedulerImpl - Adding task set 192.0 with 1 tasks resource profile 0
14:51:36.255 INFO TaskSetManager - Starting task 0.0 in stage 192.0 (TID 248) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:36.255 INFO Executor - Running task 0.0 in stage 192.0 (TID 248)
14:51:36.288 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115827931374258641133.bam:0+237038
14:51:36.301 INFO Executor - Finished task 0.0 in stage 192.0 (TID 248). 651483 bytes result sent to driver
14:51:36.303 INFO TaskSetManager - Finished task 0.0 in stage 192.0 (TID 248) in 49 ms on localhost (executor driver) (1/1)
14:51:36.303 INFO TaskSchedulerImpl - Removed TaskSet 192.0, whose tasks have all completed, from pool
14:51:36.304 INFO DAGScheduler - ResultStage 192 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.068 s
14:51:36.304 INFO DAGScheduler - Job 142 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:36.304 INFO TaskSchedulerImpl - Killing all running tasks in stage 192: Stage finished
14:51:36.304 INFO DAGScheduler - Job 142 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.069530 s
14:51:36.319 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:36.319 INFO DAGScheduler - Got job 143 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:36.319 INFO DAGScheduler - Final stage: ResultStage 193 (count at ReadsSparkSinkUnitTest.java:185)
14:51:36.319 INFO DAGScheduler - Parents of final stage: List()
14:51:36.319 INFO DAGScheduler - Missing parents: List()
14:51:36.319 INFO DAGScheduler - Submitting ResultStage 193 (MapPartitionsRDD[897] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:36.338 INFO MemoryStore - Block broadcast_380 stored as values in memory (estimated size 426.1 KiB, free 1915.2 MiB)
14:51:36.340 INFO MemoryStore - Block broadcast_380_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.1 MiB)
14:51:36.340 INFO BlockManagerInfo - Added broadcast_380_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.0 MiB)
14:51:36.340 INFO SparkContext - Created broadcast 380 from broadcast at DAGScheduler.scala:1580
14:51:36.340 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 193 (MapPartitionsRDD[897] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:36.340 INFO TaskSchedulerImpl - Adding task set 193.0 with 1 tasks resource profile 0
14:51:36.341 INFO TaskSetManager - Starting task 0.0 in stage 193.0 (TID 249) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:36.341 INFO Executor - Running task 0.0 in stage 193.0 (TID 249)
14:51:36.373 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:36.383 INFO Executor - Finished task 0.0 in stage 193.0 (TID 249). 989 bytes result sent to driver
14:51:36.384 INFO TaskSetManager - Finished task 0.0 in stage 193.0 (TID 249) in 43 ms on localhost (executor driver) (1/1)
14:51:36.384 INFO TaskSchedulerImpl - Removed TaskSet 193.0, whose tasks have all completed, from pool
14:51:36.384 INFO DAGScheduler - ResultStage 193 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.064 s
14:51:36.384 INFO DAGScheduler - Job 143 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:36.384 INFO TaskSchedulerImpl - Killing all running tasks in stage 193: Stage finished
14:51:36.384 INFO DAGScheduler - Job 143 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.065219 s
14:51:36.387 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:36.388 INFO DAGScheduler - Got job 144 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:36.388 INFO DAGScheduler - Final stage: ResultStage 194 (count at ReadsSparkSinkUnitTest.java:185)
14:51:36.388 INFO DAGScheduler - Parents of final stage: List()
14:51:36.388 INFO DAGScheduler - Missing parents: List()
14:51:36.388 INFO DAGScheduler - Submitting ResultStage 194 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:36.405 INFO MemoryStore - Block broadcast_381 stored as values in memory (estimated size 426.1 KiB, free 1914.7 MiB)
14:51:36.407 INFO MemoryStore - Block broadcast_381_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1914.5 MiB)
14:51:36.407 INFO BlockManagerInfo - Added broadcast_381_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1918.9 MiB)
14:51:36.407 INFO SparkContext - Created broadcast 381 from broadcast at DAGScheduler.scala:1580
14:51:36.407 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 194 (MapPartitionsRDD[916] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:36.407 INFO TaskSchedulerImpl - Adding task set 194.0 with 1 tasks resource profile 0
14:51:36.408 INFO TaskSetManager - Starting task 0.0 in stage 194.0 (TID 250) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:36.408 INFO Executor - Running task 0.0 in stage 194.0 (TID 250)
14:51:36.442 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest115827931374258641133.bam:0+237038
14:51:36.458 INFO Executor - Finished task 0.0 in stage 194.0 (TID 250). 1075 bytes result sent to driver
14:51:36.459 INFO BlockManagerInfo - Removed broadcast_374_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1918.9 MiB)
14:51:36.459 INFO TaskSetManager - Finished task 0.0 in stage 194.0 (TID 250) in 51 ms on localhost (executor driver) (1/1)
14:51:36.459 INFO TaskSchedulerImpl - Removed TaskSet 194.0, whose tasks have all completed, from pool
14:51:36.459 INFO DAGScheduler - ResultStage 194 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.071 s
14:51:36.459 INFO DAGScheduler - Job 144 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:36.459 INFO TaskSchedulerImpl - Killing all running tasks in stage 194: Stage finished
14:51:36.459 INFO BlockManagerInfo - Removed broadcast_367_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1918.9 MiB)
14:51:36.459 INFO DAGScheduler - Job 144 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.072100 s
14:51:36.460 INFO BlockManagerInfo - Removed broadcast_379_piece0 on localhost:44923 in memory (size: 153.7 KiB, free: 1919.0 MiB)
14:51:36.460 INFO BlockManagerInfo - Removed broadcast_380_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.2 MiB)
14:51:36.461 INFO BlockManagerInfo - Removed broadcast_377_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.2 MiB)
14:51:36.461 INFO BlockManagerInfo - Removed broadcast_368_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:36.462 INFO BlockManagerInfo - Removed broadcast_361_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:36.463 INFO BlockManagerInfo - Removed broadcast_375_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.3 MiB)
14:51:36.463 INFO BlockManagerInfo - Removed broadcast_376_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.5 MiB)
14:51:36.464 INFO BlockManagerInfo - Removed broadcast_373_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:36.465 INFO BlockManagerInfo - Removed broadcast_370_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:36.466 INFO BlockManagerInfo - Removed broadcast_371_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.8 MiB)
14:51:36.472 INFO MemoryStore - Block broadcast_382 stored as values in memory (estimated size 298.0 KiB, free 1918.5 MiB)
14:51:36.482 INFO MemoryStore - Block broadcast_382_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.4 MiB)
14:51:36.482 INFO BlockManagerInfo - Added broadcast_382_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.7 MiB)
14:51:36.482 INFO SparkContext - Created broadcast 382 from newAPIHadoopFile at PathSplitSource.java:96
14:51:36.505 INFO MemoryStore - Block broadcast_383 stored as values in memory (estimated size 298.0 KiB, free 1918.1 MiB)
14:51:36.512 INFO MemoryStore - Block broadcast_383_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.1 MiB)
14:51:36.512 INFO BlockManagerInfo - Added broadcast_383_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.7 MiB)
14:51:36.512 INFO SparkContext - Created broadcast 383 from newAPIHadoopFile at PathSplitSource.java:96
14:51:36.534 INFO FileInputFormat - Total input files to process : 1
14:51:36.537 INFO MemoryStore - Block broadcast_384 stored as values in memory (estimated size 160.7 KiB, free 1917.9 MiB)
14:51:36.538 INFO MemoryStore - Block broadcast_384_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.9 MiB)
14:51:36.538 INFO BlockManagerInfo - Added broadcast_384_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:36.539 INFO SparkContext - Created broadcast 384 from broadcast at ReadsSparkSink.java:133
14:51:36.541 INFO MemoryStore - Block broadcast_385 stored as values in memory (estimated size 163.2 KiB, free 1917.7 MiB)
14:51:36.542 INFO MemoryStore - Block broadcast_385_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
14:51:36.542 INFO BlockManagerInfo - Added broadcast_385_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:36.542 INFO SparkContext - Created broadcast 385 from broadcast at BamSink.java:76
14:51:36.545 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:36.545 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:36.545 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:36.565 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:36.566 INFO DAGScheduler - Registering RDD 930 (mapToPair at SparkUtils.java:161) as input to shuffle 39
14:51:36.566 INFO DAGScheduler - Got job 145 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:36.566 INFO DAGScheduler - Final stage: ResultStage 196 (runJob at SparkHadoopWriter.scala:83)
14:51:36.566 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 195)
14:51:36.566 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 195)
14:51:36.566 INFO DAGScheduler - Submitting ShuffleMapStage 195 (MapPartitionsRDD[930] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:36.591 INFO MemoryStore - Block broadcast_386 stored as values in memory (estimated size 520.4 KiB, free 1917.2 MiB)
14:51:36.592 INFO MemoryStore - Block broadcast_386_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1917.1 MiB)
14:51:36.592 INFO BlockManagerInfo - Added broadcast_386_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.5 MiB)
14:51:36.593 INFO SparkContext - Created broadcast 386 from broadcast at DAGScheduler.scala:1580
14:51:36.593 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 195 (MapPartitionsRDD[930] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:36.593 INFO TaskSchedulerImpl - Adding task set 195.0 with 1 tasks resource profile 0
14:51:36.594 INFO TaskSetManager - Starting task 0.0 in stage 195.0 (TID 251) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
14:51:36.594 INFO Executor - Running task 0.0 in stage 195.0 (TID 251)
14:51:36.634 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
14:51:36.653 INFO Executor - Finished task 0.0 in stage 195.0 (TID 251). 1148 bytes result sent to driver
14:51:36.653 INFO TaskSetManager - Finished task 0.0 in stage 195.0 (TID 251) in 60 ms on localhost (executor driver) (1/1)
14:51:36.654 INFO TaskSchedulerImpl - Removed TaskSet 195.0, whose tasks have all completed, from pool
14:51:36.654 INFO DAGScheduler - ShuffleMapStage 195 (mapToPair at SparkUtils.java:161) finished in 0.087 s
14:51:36.654 INFO DAGScheduler - looking for newly runnable stages
14:51:36.654 INFO DAGScheduler - running: HashSet()
14:51:36.654 INFO DAGScheduler - waiting: HashSet(ResultStage 196)
14:51:36.654 INFO DAGScheduler - failed: HashSet()
14:51:36.654 INFO DAGScheduler - Submitting ResultStage 196 (MapPartitionsRDD[935] at mapToPair at BamSink.java:91), which has no missing parents
14:51:36.666 INFO MemoryStore - Block broadcast_387 stored as values in memory (estimated size 241.4 KiB, free 1916.8 MiB)
14:51:36.667 INFO MemoryStore - Block broadcast_387_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.8 MiB)
14:51:36.667 INFO BlockManagerInfo - Added broadcast_387_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.4 MiB)
14:51:36.667 INFO SparkContext - Created broadcast 387 from broadcast at DAGScheduler.scala:1580
14:51:36.667 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 196 (MapPartitionsRDD[935] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:36.667 INFO TaskSchedulerImpl - Adding task set 196.0 with 1 tasks resource profile 0
14:51:36.668 INFO TaskSetManager - Starting task 0.0 in stage 196.0 (TID 252) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:36.669 INFO Executor - Running task 0.0 in stage 196.0 (TID 252)
14:51:36.674 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:36.674 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:36.686 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:36.686 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:36.686 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:36.686 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:36.686 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:36.686 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:36.717 INFO FileOutputCommitter - Saved output of task 'attempt_20260304145136356872023946675753_0935_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest2.someOtherPlace8954253468682874451/_temporary/0/task_20260304145136356872023946675753_0935_r_000000
14:51:36.717 INFO SparkHadoopMapRedUtil - attempt_20260304145136356872023946675753_0935_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:36.718 INFO Executor - Finished task 0.0 in stage 196.0 (TID 252). 1858 bytes result sent to driver
14:51:36.718 INFO TaskSetManager - Finished task 0.0 in stage 196.0 (TID 252) in 50 ms on localhost (executor driver) (1/1)
14:51:36.718 INFO TaskSchedulerImpl - Removed TaskSet 196.0, whose tasks have all completed, from pool
14:51:36.718 INFO DAGScheduler - ResultStage 196 (runJob at SparkHadoopWriter.scala:83) finished in 0.064 s
14:51:36.718 INFO DAGScheduler - Job 145 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:36.718 INFO TaskSchedulerImpl - Killing all running tasks in stage 196: Stage finished
14:51:36.719 INFO DAGScheduler - Job 145 finished: runJob at SparkHadoopWriter.scala:83, took 0.153251 s
14:51:36.719 INFO SparkHadoopWriter - Start to commit write Job job_20260304145136356872023946675753_0935.
14:51:36.725 INFO SparkHadoopWriter - Write Job job_20260304145136356872023946675753_0935 committed. Elapsed time: 6 ms.
14:51:36.740 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest213024386687850283292.bam
14:51:36.746 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest213024386687850283292.bam done
14:51:36.746 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest2.someOtherPlace8954253468682874451 to /tmp/ReadsSparkSinkUnitTest213024386687850283292.bam.sbi
14:51:36.752 INFO IndexFileMerger - Done merging .sbi files
14:51:36.752 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest2.someOtherPlace8954253468682874451 to /tmp/ReadsSparkSinkUnitTest213024386687850283292.bam.bai
14:51:36.759 INFO IndexFileMerger - Done merging .bai files
14:51:36.761 INFO MemoryStore - Block broadcast_388 stored as values in memory (estimated size 320.0 B, free 1916.8 MiB)
14:51:36.762 INFO MemoryStore - Block broadcast_388_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.8 MiB)
14:51:36.762 INFO BlockManagerInfo - Added broadcast_388_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.4 MiB)
14:51:36.762 INFO SparkContext - Created broadcast 388 from broadcast at BamSource.java:104
14:51:36.763 INFO MemoryStore - Block broadcast_389 stored as values in memory (estimated size 297.9 KiB, free 1916.5 MiB)
14:51:36.769 INFO MemoryStore - Block broadcast_389_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:36.770 INFO BlockManagerInfo - Added broadcast_389_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:36.770 INFO SparkContext - Created broadcast 389 from newAPIHadoopFile at PathSplitSource.java:96
14:51:36.779 INFO FileInputFormat - Total input files to process : 1
14:51:36.795 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:36.795 INFO DAGScheduler - Got job 146 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:36.795 INFO DAGScheduler - Final stage: ResultStage 197 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:36.795 INFO DAGScheduler - Parents of final stage: List()
14:51:36.795 INFO DAGScheduler - Missing parents: List()
14:51:36.796 INFO DAGScheduler - Submitting ResultStage 197 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:36.802 INFO MemoryStore - Block broadcast_390 stored as values in memory (estimated size 148.2 KiB, free 1916.3 MiB)
14:51:36.803 INFO MemoryStore - Block broadcast_390_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1916.2 MiB)
14:51:36.803 INFO BlockManagerInfo - Added broadcast_390_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.3 MiB)
14:51:36.803 INFO SparkContext - Created broadcast 390 from broadcast at DAGScheduler.scala:1580
14:51:36.804 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 197 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:36.804 INFO TaskSchedulerImpl - Adding task set 197.0 with 1 tasks resource profile 0
14:51:36.804 INFO TaskSetManager - Starting task 0.0 in stage 197.0 (TID 253) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:36.804 INFO Executor - Running task 0.0 in stage 197.0 (TID 253)
14:51:36.817 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest213024386687850283292.bam:0+235514
14:51:36.822 INFO Executor - Finished task 0.0 in stage 197.0 (TID 253). 650184 bytes result sent to driver
14:51:36.824 INFO TaskSetManager - Finished task 0.0 in stage 197.0 (TID 253) in 20 ms on localhost (executor driver) (1/1)
14:51:36.824 INFO TaskSchedulerImpl - Removed TaskSet 197.0, whose tasks have all completed, from pool
14:51:36.824 INFO DAGScheduler - ResultStage 197 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.028 s
14:51:36.824 INFO DAGScheduler - Job 146 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:36.824 INFO TaskSchedulerImpl - Killing all running tasks in stage 197: Stage finished
14:51:36.824 INFO DAGScheduler - Job 146 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.029457 s
14:51:36.835 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:36.835 INFO DAGScheduler - Got job 147 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:36.835 INFO DAGScheduler - Final stage: ResultStage 198 (count at ReadsSparkSinkUnitTest.java:185)
14:51:36.835 INFO DAGScheduler - Parents of final stage: List()
14:51:36.835 INFO DAGScheduler - Missing parents: List()
14:51:36.835 INFO DAGScheduler - Submitting ResultStage 198 (MapPartitionsRDD[923] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:36.852 INFO MemoryStore - Block broadcast_391 stored as values in memory (estimated size 426.1 KiB, free 1915.8 MiB)
14:51:36.853 INFO MemoryStore - Block broadcast_391_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1915.7 MiB)
14:51:36.854 INFO BlockManagerInfo - Added broadcast_391_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.2 MiB)
14:51:36.854 INFO SparkContext - Created broadcast 391 from broadcast at DAGScheduler.scala:1580
14:51:36.854 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 198 (MapPartitionsRDD[923] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:36.854 INFO TaskSchedulerImpl - Adding task set 198.0 with 1 tasks resource profile 0
14:51:36.855 INFO TaskSetManager - Starting task 0.0 in stage 198.0 (TID 254) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
14:51:36.855 INFO Executor - Running task 0.0 in stage 198.0 (TID 254)
14:51:36.888 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
14:51:36.900 INFO Executor - Finished task 0.0 in stage 198.0 (TID 254). 989 bytes result sent to driver
14:51:36.901 INFO TaskSetManager - Finished task 0.0 in stage 198.0 (TID 254) in 47 ms on localhost (executor driver) (1/1)
14:51:36.901 INFO TaskSchedulerImpl - Removed TaskSet 198.0, whose tasks have all completed, from pool
14:51:36.901 INFO DAGScheduler - ResultStage 198 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.066 s
14:51:36.901 INFO DAGScheduler - Job 147 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:36.901 INFO TaskSchedulerImpl - Killing all running tasks in stage 198: Stage finished
14:51:36.901 INFO DAGScheduler - Job 147 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.066449 s
14:51:36.905 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:36.905 INFO DAGScheduler - Got job 148 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:36.905 INFO DAGScheduler - Final stage: ResultStage 199 (count at ReadsSparkSinkUnitTest.java:185)
14:51:36.905 INFO DAGScheduler - Parents of final stage: List()
14:51:36.905 INFO DAGScheduler - Missing parents: List()
14:51:36.905 INFO DAGScheduler - Submitting ResultStage 199 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:36.911 INFO MemoryStore - Block broadcast_392 stored as values in memory (estimated size 148.1 KiB, free 1915.5 MiB)
14:51:36.912 INFO MemoryStore - Block broadcast_392_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1915.5 MiB)
14:51:36.912 INFO BlockManagerInfo - Added broadcast_392_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.1 MiB)
14:51:36.912 INFO SparkContext - Created broadcast 392 from broadcast at DAGScheduler.scala:1580
14:51:36.912 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 199 (MapPartitionsRDD[941] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:36.912 INFO TaskSchedulerImpl - Adding task set 199.0 with 1 tasks resource profile 0
14:51:36.913 INFO TaskSetManager - Starting task 0.0 in stage 199.0 (TID 255) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:36.913 INFO Executor - Running task 0.0 in stage 199.0 (TID 255)
14:51:36.927 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest213024386687850283292.bam:0+235514
14:51:36.931 INFO Executor - Finished task 0.0 in stage 199.0 (TID 255). 989 bytes result sent to driver
14:51:36.932 INFO TaskSetManager - Finished task 0.0 in stage 199.0 (TID 255) in 18 ms on localhost (executor driver) (1/1)
14:51:36.932 INFO TaskSchedulerImpl - Removed TaskSet 199.0, whose tasks have all completed, from pool
14:51:36.932 INFO DAGScheduler - ResultStage 199 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.027 s
14:51:36.932 INFO DAGScheduler - Job 148 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:36.932 INFO TaskSchedulerImpl - Killing all running tasks in stage 199: Stage finished
14:51:36.932 INFO DAGScheduler - Job 148 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.027453 s
14:51:36.943 INFO MemoryStore - Block broadcast_393 stored as values in memory (estimated size 298.0 KiB, free 1915.2 MiB)
14:51:36.951 INFO MemoryStore - Block broadcast_393_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.1 MiB)
14:51:36.951 INFO BlockManagerInfo - Added broadcast_393_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.1 MiB)
14:51:36.951 INFO SparkContext - Created broadcast 393 from newAPIHadoopFile at PathSplitSource.java:96
14:51:36.973 INFO MemoryStore - Block broadcast_394 stored as values in memory (estimated size 298.0 KiB, free 1914.8 MiB)
14:51:36.979 INFO BlockManagerInfo - Removed broadcast_387_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.1 MiB)
14:51:36.980 INFO BlockManagerInfo - Removed broadcast_382_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.2 MiB)
14:51:36.980 INFO BlockManagerInfo - Removed broadcast_390_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.2 MiB)
14:51:36.980 INFO BlockManagerInfo - Removed broadcast_381_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.4 MiB)
14:51:36.981 INFO BlockManagerInfo - Removed broadcast_372_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:36.981 INFO BlockManagerInfo - Removed broadcast_391_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.6 MiB)
14:51:36.982 INFO BlockManagerInfo - Removed broadcast_385_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.6 MiB)
14:51:36.983 INFO BlockManagerInfo - Removed broadcast_386_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.7 MiB)
14:51:36.983 INFO BlockManagerInfo - Removed broadcast_383_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.8 MiB)
14:51:36.984 INFO BlockManagerInfo - Removed broadcast_384_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:36.984 INFO BlockManagerInfo - Removed broadcast_389_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:36.984 INFO BlockManagerInfo - Removed broadcast_392_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.9 MiB)
14:51:36.985 INFO BlockManagerInfo - Removed broadcast_378_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1920.0 MiB)
14:51:36.985 INFO BlockManagerInfo - Removed broadcast_388_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1920.0 MiB)
14:51:36.986 INFO MemoryStore - Block broadcast_394_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
14:51:36.986 INFO BlockManagerInfo - Added broadcast_394_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:36.986 INFO SparkContext - Created broadcast 394 from newAPIHadoopFile at PathSplitSource.java:96
14:51:37.007 INFO FileInputFormat - Total input files to process : 1
14:51:37.009 INFO MemoryStore - Block broadcast_395 stored as values in memory (estimated size 19.6 KiB, free 1919.3 MiB)
14:51:37.009 INFO MemoryStore - Block broadcast_395_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1919.3 MiB)
14:51:37.009 INFO BlockManagerInfo - Added broadcast_395_piece0 in memory on localhost:44923 (size: 1890.0 B, free: 1919.9 MiB)
14:51:37.010 INFO SparkContext - Created broadcast 395 from broadcast at ReadsSparkSink.java:133
14:51:37.010 INFO MemoryStore - Block broadcast_396 stored as values in memory (estimated size 20.0 KiB, free 1919.3 MiB)
14:51:37.011 INFO MemoryStore - Block broadcast_396_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1919.3 MiB)
14:51:37.011 INFO BlockManagerInfo - Added broadcast_396_piece0 in memory on localhost:44923 (size: 1890.0 B, free: 1919.9 MiB)
14:51:37.011 INFO SparkContext - Created broadcast 396 from broadcast at BamSink.java:76
14:51:37.013 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:37.013 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:37.013 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:37.033 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:37.033 INFO DAGScheduler - Registering RDD 955 (mapToPair at SparkUtils.java:161) as input to shuffle 40
14:51:37.033 INFO DAGScheduler - Got job 149 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:37.033 INFO DAGScheduler - Final stage: ResultStage 201 (runJob at SparkHadoopWriter.scala:83)
14:51:37.034 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 200)
14:51:37.034 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 200)
14:51:37.034 INFO DAGScheduler - Submitting ShuffleMapStage 200 (MapPartitionsRDD[955] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:37.051 INFO MemoryStore - Block broadcast_397 stored as values in memory (estimated size 434.3 KiB, free 1918.9 MiB)
14:51:37.053 INFO MemoryStore - Block broadcast_397_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1918.7 MiB)
14:51:37.053 INFO BlockManagerInfo - Added broadcast_397_piece0 in memory on localhost:44923 (size: 157.6 KiB, free: 1919.7 MiB)
14:51:37.053 INFO SparkContext - Created broadcast 397 from broadcast at DAGScheduler.scala:1580
14:51:37.053 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 200 (MapPartitionsRDD[955] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:37.053 INFO TaskSchedulerImpl - Adding task set 200.0 with 1 tasks resource profile 0
14:51:37.054 INFO TaskSetManager - Starting task 0.0 in stage 200.0 (TID 256) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
14:51:37.054 INFO Executor - Running task 0.0 in stage 200.0 (TID 256)
14:51:37.091 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
14:51:37.106 INFO Executor - Finished task 0.0 in stage 200.0 (TID 256). 1148 bytes result sent to driver
14:51:37.106 INFO TaskSetManager - Finished task 0.0 in stage 200.0 (TID 256) in 52 ms on localhost (executor driver) (1/1)
14:51:37.106 INFO TaskSchedulerImpl - Removed TaskSet 200.0, whose tasks have all completed, from pool
14:51:37.107 INFO DAGScheduler - ShuffleMapStage 200 (mapToPair at SparkUtils.java:161) finished in 0.073 s
14:51:37.107 INFO DAGScheduler - looking for newly runnable stages
14:51:37.107 INFO DAGScheduler - running: HashSet()
14:51:37.107 INFO DAGScheduler - waiting: HashSet(ResultStage 201)
14:51:37.107 INFO DAGScheduler - failed: HashSet()
14:51:37.107 INFO DAGScheduler - Submitting ResultStage 201 (MapPartitionsRDD[960] at mapToPair at BamSink.java:91), which has no missing parents
14:51:37.114 INFO MemoryStore - Block broadcast_398 stored as values in memory (estimated size 155.3 KiB, free 1918.5 MiB)
14:51:37.115 INFO MemoryStore - Block broadcast_398_piece0 stored as bytes in memory (estimated size 58.5 KiB, free 1918.5 MiB)
14:51:37.115 INFO BlockManagerInfo - Added broadcast_398_piece0 in memory on localhost:44923 (size: 58.5 KiB, free: 1919.7 MiB)
14:51:37.116 INFO SparkContext - Created broadcast 398 from broadcast at DAGScheduler.scala:1580
14:51:37.116 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 201 (MapPartitionsRDD[960] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:37.116 INFO TaskSchedulerImpl - Adding task set 201.0 with 1 tasks resource profile 0
14:51:37.117 INFO TaskSetManager - Starting task 0.0 in stage 201.0 (TID 257) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:37.117 INFO Executor - Running task 0.0 in stage 201.0 (TID 257)
14:51:37.121 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:37.122 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:37.133 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:37.133 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:37.133 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:37.133 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:37.133 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:37.133 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:37.161 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451379073028060993393427_0960_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest3.someOtherPlace6252672379052248995/_temporary/0/task_202603041451379073028060993393427_0960_r_000000
14:51:37.161 INFO SparkHadoopMapRedUtil - attempt_202603041451379073028060993393427_0960_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:37.162 INFO Executor - Finished task 0.0 in stage 201.0 (TID 257). 1858 bytes result sent to driver
14:51:37.162 INFO TaskSetManager - Finished task 0.0 in stage 201.0 (TID 257) in 46 ms on localhost (executor driver) (1/1)
14:51:37.162 INFO TaskSchedulerImpl - Removed TaskSet 201.0, whose tasks have all completed, from pool
14:51:37.162 INFO DAGScheduler - ResultStage 201 (runJob at SparkHadoopWriter.scala:83) finished in 0.055 s
14:51:37.163 INFO DAGScheduler - Job 149 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:37.163 INFO TaskSchedulerImpl - Killing all running tasks in stage 201: Stage finished
14:51:37.163 INFO DAGScheduler - Job 149 finished: runJob at SparkHadoopWriter.scala:83, took 0.129833 s
14:51:37.163 INFO SparkHadoopWriter - Start to commit write Job job_202603041451379073028060993393427_0960.
14:51:37.169 INFO SparkHadoopWriter - Write Job job_202603041451379073028060993393427_0960 committed. Elapsed time: 5 ms.
14:51:37.183 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest317188921660358231010.bam
14:51:37.190 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest317188921660358231010.bam done
14:51:37.191 INFO IndexFileMerger - Merging .sbi files in temp directory /tmp/ReadsSparkSinkUnitTest3.someOtherPlace6252672379052248995 to /tmp/ReadsSparkSinkUnitTest317188921660358231010.bam.sbi
14:51:37.197 INFO IndexFileMerger - Done merging .sbi files
14:51:37.197 INFO IndexFileMerger - Merging .bai files in temp directory /tmp/ReadsSparkSinkUnitTest3.someOtherPlace6252672379052248995 to /tmp/ReadsSparkSinkUnitTest317188921660358231010.bam.bai
14:51:37.204 INFO IndexFileMerger - Done merging .bai files
14:51:37.206 INFO MemoryStore - Block broadcast_399 stored as values in memory (estimated size 312.0 B, free 1918.5 MiB)
14:51:37.207 INFO MemoryStore - Block broadcast_399_piece0 stored as bytes in memory (estimated size 231.0 B, free 1918.5 MiB)
14:51:37.207 INFO BlockManagerInfo - Added broadcast_399_piece0 in memory on localhost:44923 (size: 231.0 B, free: 1919.7 MiB)
14:51:37.207 INFO SparkContext - Created broadcast 399 from broadcast at BamSource.java:104
14:51:37.208 INFO MemoryStore - Block broadcast_400 stored as values in memory (estimated size 297.9 KiB, free 1918.2 MiB)
14:51:37.219 INFO MemoryStore - Block broadcast_400_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
14:51:37.220 INFO BlockManagerInfo - Added broadcast_400_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:37.220 INFO SparkContext - Created broadcast 400 from newAPIHadoopFile at PathSplitSource.java:96
14:51:37.234 INFO FileInputFormat - Total input files to process : 1
14:51:37.249 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:37.249 INFO DAGScheduler - Got job 150 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:37.249 INFO DAGScheduler - Final stage: ResultStage 202 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:37.249 INFO DAGScheduler - Parents of final stage: List()
14:51:37.249 INFO DAGScheduler - Missing parents: List()
14:51:37.249 INFO DAGScheduler - Submitting ResultStage 202 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:37.256 INFO MemoryStore - Block broadcast_401 stored as values in memory (estimated size 148.2 KiB, free 1918.0 MiB)
14:51:37.257 INFO MemoryStore - Block broadcast_401_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.0 MiB)
14:51:37.257 INFO BlockManagerInfo - Added broadcast_401_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.6 MiB)
14:51:37.257 INFO SparkContext - Created broadcast 401 from broadcast at DAGScheduler.scala:1580
14:51:37.257 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 202 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:37.257 INFO TaskSchedulerImpl - Adding task set 202.0 with 1 tasks resource profile 0
14:51:37.258 INFO TaskSetManager - Starting task 0.0 in stage 202.0 (TID 258) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:37.258 INFO Executor - Running task 0.0 in stage 202.0 (TID 258)
14:51:37.272 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest317188921660358231010.bam:0+236517
14:51:37.276 INFO Executor - Finished task 0.0 in stage 202.0 (TID 258). 749470 bytes result sent to driver
14:51:37.278 INFO TaskSetManager - Finished task 0.0 in stage 202.0 (TID 258) in 20 ms on localhost (executor driver) (1/1)
14:51:37.278 INFO TaskSchedulerImpl - Removed TaskSet 202.0, whose tasks have all completed, from pool
14:51:37.278 INFO DAGScheduler - ResultStage 202 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.028 s
14:51:37.278 INFO DAGScheduler - Job 150 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:37.278 INFO TaskSchedulerImpl - Killing all running tasks in stage 202: Stage finished
14:51:37.278 INFO DAGScheduler - Job 150 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.029416 s
14:51:37.289 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:37.289 INFO DAGScheduler - Got job 151 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:37.289 INFO DAGScheduler - Final stage: ResultStage 203 (count at ReadsSparkSinkUnitTest.java:185)
14:51:37.289 INFO DAGScheduler - Parents of final stage: List()
14:51:37.289 INFO DAGScheduler - Missing parents: List()
14:51:37.289 INFO DAGScheduler - Submitting ResultStage 203 (MapPartitionsRDD[948] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:37.308 INFO MemoryStore - Block broadcast_402 stored as values in memory (estimated size 426.1 KiB, free 1917.5 MiB)
14:51:37.309 INFO MemoryStore - Block broadcast_402_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.4 MiB)
14:51:37.309 INFO BlockManagerInfo - Added broadcast_402_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.4 MiB)
14:51:37.310 INFO SparkContext - Created broadcast 402 from broadcast at DAGScheduler.scala:1580
14:51:37.310 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 203 (MapPartitionsRDD[948] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:37.310 INFO TaskSchedulerImpl - Adding task set 203.0 with 1 tasks resource profile 0
14:51:37.310 INFO TaskSetManager - Starting task 0.0 in stage 203.0 (TID 259) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
14:51:37.311 INFO Executor - Running task 0.0 in stage 203.0 (TID 259)
14:51:37.346 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
14:51:37.353 INFO Executor - Finished task 0.0 in stage 203.0 (TID 259). 989 bytes result sent to driver
14:51:37.354 INFO TaskSetManager - Finished task 0.0 in stage 203.0 (TID 259) in 44 ms on localhost (executor driver) (1/1)
14:51:37.354 INFO TaskSchedulerImpl - Removed TaskSet 203.0, whose tasks have all completed, from pool
14:51:37.354 INFO DAGScheduler - ResultStage 203 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.064 s
14:51:37.354 INFO DAGScheduler - Job 151 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:37.354 INFO TaskSchedulerImpl - Killing all running tasks in stage 203: Stage finished
14:51:37.354 INFO DAGScheduler - Job 151 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.065463 s
14:51:37.358 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:37.358 INFO DAGScheduler - Got job 152 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:37.358 INFO DAGScheduler - Final stage: ResultStage 204 (count at ReadsSparkSinkUnitTest.java:185)
14:51:37.358 INFO DAGScheduler - Parents of final stage: List()
14:51:37.358 INFO DAGScheduler - Missing parents: List()
14:51:37.358 INFO DAGScheduler - Submitting ResultStage 204 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:37.364 INFO MemoryStore - Block broadcast_403 stored as values in memory (estimated size 148.1 KiB, free 1917.2 MiB)
14:51:37.365 INFO MemoryStore - Block broadcast_403_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1917.2 MiB)
14:51:37.365 INFO BlockManagerInfo - Added broadcast_403_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.4 MiB)
14:51:37.365 INFO SparkContext - Created broadcast 403 from broadcast at DAGScheduler.scala:1580
14:51:37.366 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 204 (MapPartitionsRDD[966] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:37.366 INFO TaskSchedulerImpl - Adding task set 204.0 with 1 tasks resource profile 0
14:51:37.366 INFO TaskSetManager - Starting task 0.0 in stage 204.0 (TID 260) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:37.366 INFO Executor - Running task 0.0 in stage 204.0 (TID 260)
14:51:37.379 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest317188921660358231010.bam:0+236517
14:51:37.382 INFO Executor - Finished task 0.0 in stage 204.0 (TID 260). 989 bytes result sent to driver
14:51:37.382 INFO TaskSetManager - Finished task 0.0 in stage 204.0 (TID 260) in 16 ms on localhost (executor driver) (1/1)
14:51:37.382 INFO TaskSchedulerImpl - Removed TaskSet 204.0, whose tasks have all completed, from pool
14:51:37.382 INFO DAGScheduler - ResultStage 204 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.024 s
14:51:37.383 INFO DAGScheduler - Job 152 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:37.383 INFO TaskSchedulerImpl - Killing all running tasks in stage 204: Stage finished
14:51:37.383 INFO DAGScheduler - Job 152 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.024920 s
14:51:37.391 INFO MemoryStore - Block broadcast_404 stored as values in memory (estimated size 576.0 B, free 1917.2 MiB)
14:51:37.391 INFO MemoryStore - Block broadcast_404_piece0 stored as bytes in memory (estimated size 228.0 B, free 1917.2 MiB)
14:51:37.391 INFO BlockManagerInfo - Added broadcast_404_piece0 in memory on localhost:44923 (size: 228.0 B, free: 1919.4 MiB)
14:51:37.392 INFO SparkContext - Created broadcast 404 from broadcast at CramSource.java:114
14:51:37.393 INFO MemoryStore - Block broadcast_405 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
14:51:37.404 INFO MemoryStore - Block broadcast_405_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.8 MiB)
14:51:37.404 INFO BlockManagerInfo - Added broadcast_405_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:37.405 INFO SparkContext - Created broadcast 405 from newAPIHadoopFile at PathSplitSource.java:96
14:51:37.422 INFO MemoryStore - Block broadcast_406 stored as values in memory (estimated size 576.0 B, free 1916.8 MiB)
14:51:37.426 INFO MemoryStore - Block broadcast_406_piece0 stored as bytes in memory (estimated size 228.0 B, free 1916.8 MiB)
14:51:37.426 INFO BlockManagerInfo - Added broadcast_406_piece0 in memory on localhost:44923 (size: 228.0 B, free: 1919.3 MiB)
14:51:37.426 INFO BlockManagerInfo - Removed broadcast_400_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:37.426 INFO SparkContext - Created broadcast 406 from broadcast at CramSource.java:114
14:51:37.426 INFO BlockManagerInfo - Removed broadcast_398_piece0 on localhost:44923 in memory (size: 58.5 KiB, free: 1919.4 MiB)
14:51:37.427 INFO BlockManagerInfo - Removed broadcast_396_piece0 on localhost:44923 in memory (size: 1890.0 B, free: 1919.4 MiB)
14:51:37.427 INFO BlockManagerInfo - Removed broadcast_401_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.5 MiB)
14:51:37.428 INFO MemoryStore - Block broadcast_407 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
14:51:37.428 INFO BlockManagerInfo - Removed broadcast_395_piece0 on localhost:44923 in memory (size: 1890.0 B, free: 1919.5 MiB)
14:51:37.428 INFO BlockManagerInfo - Removed broadcast_394_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:37.429 INFO BlockManagerInfo - Removed broadcast_402_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:37.429 INFO BlockManagerInfo - Removed broadcast_403_piece0 on localhost:44923 in memory (size: 54.5 KiB, free: 1919.7 MiB)
14:51:37.430 INFO BlockManagerInfo - Removed broadcast_393_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:37.430 INFO BlockManagerInfo - Removed broadcast_397_piece0 on localhost:44923 in memory (size: 157.6 KiB, free: 1920.0 MiB)
14:51:37.430 INFO BlockManagerInfo - Removed broadcast_399_piece0 on localhost:44923 in memory (size: 231.0 B, free: 1920.0 MiB)
14:51:37.435 INFO MemoryStore - Block broadcast_407_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
14:51:37.435 INFO BlockManagerInfo - Added broadcast_407_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:37.435 INFO SparkContext - Created broadcast 407 from newAPIHadoopFile at PathSplitSource.java:96
14:51:37.450 INFO FileInputFormat - Total input files to process : 1
14:51:37.452 INFO MemoryStore - Block broadcast_408 stored as values in memory (estimated size 6.0 KiB, free 1919.3 MiB)
14:51:37.452 INFO MemoryStore - Block broadcast_408_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
14:51:37.452 INFO BlockManagerInfo - Added broadcast_408_piece0 in memory on localhost:44923 (size: 1473.0 B, free: 1919.9 MiB)
14:51:37.453 INFO SparkContext - Created broadcast 408 from broadcast at ReadsSparkSink.java:133
14:51:37.453 INFO MemoryStore - Block broadcast_409 stored as values in memory (estimated size 6.2 KiB, free 1919.3 MiB)
14:51:37.454 INFO MemoryStore - Block broadcast_409_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1919.3 MiB)
14:51:37.454 INFO BlockManagerInfo - Added broadcast_409_piece0 in memory on localhost:44923 (size: 1473.0 B, free: 1919.9 MiB)
14:51:37.454 INFO SparkContext - Created broadcast 409 from broadcast at CramSink.java:76
14:51:37.456 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:37.456 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:37.456 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:37.477 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:37.477 INFO DAGScheduler - Registering RDD 978 (mapToPair at SparkUtils.java:161) as input to shuffle 41
14:51:37.477 INFO DAGScheduler - Got job 153 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:37.477 INFO DAGScheduler - Final stage: ResultStage 206 (runJob at SparkHadoopWriter.scala:83)
14:51:37.478 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 205)
14:51:37.478 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 205)
14:51:37.478 INFO DAGScheduler - Submitting ShuffleMapStage 205 (MapPartitionsRDD[978] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:37.490 INFO MemoryStore - Block broadcast_410 stored as values in memory (estimated size 292.8 KiB, free 1919.0 MiB)
14:51:37.491 INFO MemoryStore - Block broadcast_410_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1918.9 MiB)
14:51:37.491 INFO BlockManagerInfo - Added broadcast_410_piece0 in memory on localhost:44923 (size: 107.3 KiB, free: 1919.8 MiB)
14:51:37.491 INFO SparkContext - Created broadcast 410 from broadcast at DAGScheduler.scala:1580
14:51:37.491 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 205 (MapPartitionsRDD[978] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:37.491 INFO TaskSchedulerImpl - Adding task set 205.0 with 1 tasks resource profile 0
14:51:37.492 INFO TaskSetManager - Starting task 0.0 in stage 205.0 (TID 261) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
14:51:37.492 INFO Executor - Running task 0.0 in stage 205.0 (TID 261)
14:51:37.516 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
14:51:37.530 INFO Executor - Finished task 0.0 in stage 205.0 (TID 261). 1148 bytes result sent to driver
14:51:37.530 INFO TaskSetManager - Finished task 0.0 in stage 205.0 (TID 261) in 38 ms on localhost (executor driver) (1/1)
14:51:37.530 INFO TaskSchedulerImpl - Removed TaskSet 205.0, whose tasks have all completed, from pool
14:51:37.531 INFO DAGScheduler - ShuffleMapStage 205 (mapToPair at SparkUtils.java:161) finished in 0.053 s
14:51:37.531 INFO DAGScheduler - looking for newly runnable stages
14:51:37.531 INFO DAGScheduler - running: HashSet()
14:51:37.531 INFO DAGScheduler - waiting: HashSet(ResultStage 206)
14:51:37.531 INFO DAGScheduler - failed: HashSet()
14:51:37.531 INFO DAGScheduler - Submitting ResultStage 206 (MapPartitionsRDD[983] at mapToPair at CramSink.java:89), which has no missing parents
14:51:37.538 INFO MemoryStore - Block broadcast_411 stored as values in memory (estimated size 153.2 KiB, free 1918.8 MiB)
14:51:37.538 INFO MemoryStore - Block broadcast_411_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1918.7 MiB)
14:51:37.539 INFO BlockManagerInfo - Added broadcast_411_piece0 in memory on localhost:44923 (size: 58.1 KiB, free: 1919.7 MiB)
14:51:37.539 INFO SparkContext - Created broadcast 411 from broadcast at DAGScheduler.scala:1580
14:51:37.539 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 206 (MapPartitionsRDD[983] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
14:51:37.539 INFO TaskSchedulerImpl - Adding task set 206.0 with 1 tasks resource profile 0
14:51:37.540 INFO TaskSetManager - Starting task 0.0 in stage 206.0 (TID 262) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:37.540 INFO Executor - Running task 0.0 in stage 206.0 (TID 262)
14:51:37.544 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:37.544 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:37.551 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:37.551 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:37.551 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:37.551 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:37.551 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:37.551 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:37.596 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451373592912689413675690_0983_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest5.someOtherPlace6299929310343890135/_temporary/0/task_202603041451373592912689413675690_0983_r_000000
14:51:37.596 INFO SparkHadoopMapRedUtil - attempt_202603041451373592912689413675690_0983_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:37.597 INFO Executor - Finished task 0.0 in stage 206.0 (TID 262). 1858 bytes result sent to driver
14:51:37.597 INFO TaskSetManager - Finished task 0.0 in stage 206.0 (TID 262) in 57 ms on localhost (executor driver) (1/1)
14:51:37.597 INFO TaskSchedulerImpl - Removed TaskSet 206.0, whose tasks have all completed, from pool
14:51:37.597 INFO DAGScheduler - ResultStage 206 (runJob at SparkHadoopWriter.scala:83) finished in 0.066 s
14:51:37.597 INFO DAGScheduler - Job 153 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:37.597 INFO TaskSchedulerImpl - Killing all running tasks in stage 206: Stage finished
14:51:37.598 INFO DAGScheduler - Job 153 finished: runJob at SparkHadoopWriter.scala:83, took 0.120799 s
14:51:37.598 INFO SparkHadoopWriter - Start to commit write Job job_202603041451373592912689413675690_0983.
14:51:37.605 INFO SparkHadoopWriter - Write Job job_202603041451373592912689413675690_0983 committed. Elapsed time: 6 ms.
14:51:37.620 INFO HadoopFileSystemWrapper - Concatenating 3 parts to /tmp/ReadsSparkSinkUnitTest53337406413115875942.cram
14:51:37.625 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest53337406413115875942.cram done
14:51:37.627 INFO MemoryStore - Block broadcast_412 stored as values in memory (estimated size 504.0 B, free 1918.7 MiB)
14:51:37.627 INFO MemoryStore - Block broadcast_412_piece0 stored as bytes in memory (estimated size 159.0 B, free 1918.7 MiB)
14:51:37.628 INFO BlockManagerInfo - Added broadcast_412_piece0 in memory on localhost:44923 (size: 159.0 B, free: 1919.7 MiB)
14:51:37.628 INFO SparkContext - Created broadcast 412 from broadcast at CramSource.java:114
14:51:37.629 INFO MemoryStore - Block broadcast_413 stored as values in memory (estimated size 297.9 KiB, free 1918.4 MiB)
14:51:37.636 INFO MemoryStore - Block broadcast_413_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.4 MiB)
14:51:37.636 INFO BlockManagerInfo - Added broadcast_413_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:37.636 INFO SparkContext - Created broadcast 413 from newAPIHadoopFile at PathSplitSource.java:96
14:51:37.651 INFO FileInputFormat - Total input files to process : 1
14:51:37.677 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:37.678 INFO DAGScheduler - Got job 154 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:37.678 INFO DAGScheduler - Final stage: ResultStage 207 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:37.678 INFO DAGScheduler - Parents of final stage: List()
14:51:37.678 INFO DAGScheduler - Missing parents: List()
14:51:37.678 INFO DAGScheduler - Submitting ResultStage 207 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:37.691 INFO MemoryStore - Block broadcast_414 stored as values in memory (estimated size 286.8 KiB, free 1918.1 MiB)
14:51:37.692 INFO MemoryStore - Block broadcast_414_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.0 MiB)
14:51:37.692 INFO BlockManagerInfo - Added broadcast_414_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.6 MiB)
14:51:37.692 INFO SparkContext - Created broadcast 414 from broadcast at DAGScheduler.scala:1580
14:51:37.693 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 207 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:37.693 INFO TaskSchedulerImpl - Adding task set 207.0 with 1 tasks resource profile 0
14:51:37.693 INFO TaskSetManager - Starting task 0.0 in stage 207.0 (TID 263) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:37.694 INFO Executor - Running task 0.0 in stage 207.0 (TID 263)
14:51:37.717 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest53337406413115875942.cram:0+43713
14:51:37.731 INFO Executor - Finished task 0.0 in stage 207.0 (TID 263). 154058 bytes result sent to driver
14:51:37.731 INFO TaskSetManager - Finished task 0.0 in stage 207.0 (TID 263) in 38 ms on localhost (executor driver) (1/1)
14:51:37.731 INFO TaskSchedulerImpl - Removed TaskSet 207.0, whose tasks have all completed, from pool
14:51:37.732 INFO DAGScheduler - ResultStage 207 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.053 s
14:51:37.732 INFO DAGScheduler - Job 154 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:37.732 INFO TaskSchedulerImpl - Killing all running tasks in stage 207: Stage finished
14:51:37.732 INFO DAGScheduler - Job 154 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.054578 s
14:51:37.740 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:37.740 INFO DAGScheduler - Got job 155 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:37.740 INFO DAGScheduler - Final stage: ResultStage 208 (count at ReadsSparkSinkUnitTest.java:185)
14:51:37.740 INFO DAGScheduler - Parents of final stage: List()
14:51:37.740 INFO DAGScheduler - Missing parents: List()
14:51:37.740 INFO DAGScheduler - Submitting ResultStage 208 (MapPartitionsRDD[972] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:37.752 INFO MemoryStore - Block broadcast_415 stored as values in memory (estimated size 286.8 KiB, free 1917.7 MiB)
14:51:37.753 INFO MemoryStore - Block broadcast_415_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.6 MiB)
14:51:37.753 INFO BlockManagerInfo - Added broadcast_415_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.5 MiB)
14:51:37.753 INFO SparkContext - Created broadcast 415 from broadcast at DAGScheduler.scala:1580
14:51:37.754 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 208 (MapPartitionsRDD[972] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:37.754 INFO TaskSchedulerImpl - Adding task set 208.0 with 1 tasks resource profile 0
14:51:37.754 INFO TaskSetManager - Starting task 0.0 in stage 208.0 (TID 264) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
14:51:37.754 INFO Executor - Running task 0.0 in stage 208.0 (TID 264)
14:51:37.783 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
14:51:37.791 INFO Executor - Finished task 0.0 in stage 208.0 (TID 264). 989 bytes result sent to driver
14:51:37.791 INFO TaskSetManager - Finished task 0.0 in stage 208.0 (TID 264) in 37 ms on localhost (executor driver) (1/1)
14:51:37.791 INFO TaskSchedulerImpl - Removed TaskSet 208.0, whose tasks have all completed, from pool
14:51:37.792 INFO DAGScheduler - ResultStage 208 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.052 s
14:51:37.792 INFO DAGScheduler - Job 155 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:37.792 INFO TaskSchedulerImpl - Killing all running tasks in stage 208: Stage finished
14:51:37.792 INFO DAGScheduler - Job 155 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.052158 s
14:51:37.796 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:37.797 INFO DAGScheduler - Got job 156 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:37.797 INFO DAGScheduler - Final stage: ResultStage 209 (count at ReadsSparkSinkUnitTest.java:185)
14:51:37.797 INFO DAGScheduler - Parents of final stage: List()
14:51:37.797 INFO DAGScheduler - Missing parents: List()
14:51:37.797 INFO DAGScheduler - Submitting ResultStage 209 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:37.808 INFO MemoryStore - Block broadcast_416 stored as values in memory (estimated size 286.8 KiB, free 1917.3 MiB)
14:51:37.809 INFO MemoryStore - Block broadcast_416_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1917.2 MiB)
14:51:37.810 INFO BlockManagerInfo - Added broadcast_416_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.4 MiB)
14:51:37.810 INFO SparkContext - Created broadcast 416 from broadcast at DAGScheduler.scala:1580
14:51:37.810 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 209 (MapPartitionsRDD[989] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:37.810 INFO TaskSchedulerImpl - Adding task set 209.0 with 1 tasks resource profile 0
14:51:37.811 INFO TaskSetManager - Starting task 0.0 in stage 209.0 (TID 265) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:37.811 INFO Executor - Running task 0.0 in stage 209.0 (TID 265)
14:51:37.834 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest53337406413115875942.cram:0+43713
14:51:37.844 INFO Executor - Finished task 0.0 in stage 209.0 (TID 265). 989 bytes result sent to driver
14:51:37.844 INFO TaskSetManager - Finished task 0.0 in stage 209.0 (TID 265) in 33 ms on localhost (executor driver) (1/1)
14:51:37.844 INFO TaskSchedulerImpl - Removed TaskSet 209.0, whose tasks have all completed, from pool
14:51:37.844 INFO DAGScheduler - ResultStage 209 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.047 s
14:51:37.845 INFO DAGScheduler - Job 156 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:37.845 INFO TaskSchedulerImpl - Killing all running tasks in stage 209: Stage finished
14:51:37.845 INFO DAGScheduler - Job 156 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.048352 s
14:51:37.855 INFO MemoryStore - Block broadcast_417 stored as values in memory (estimated size 297.9 KiB, free 1916.9 MiB)
14:51:37.866 INFO MemoryStore - Block broadcast_417_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.9 MiB)
14:51:37.866 INFO BlockManagerInfo - Added broadcast_417_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:37.866 INFO SparkContext - Created broadcast 417 from newAPIHadoopFile at PathSplitSource.java:96
14:51:37.889 INFO MemoryStore - Block broadcast_418 stored as values in memory (estimated size 297.9 KiB, free 1916.6 MiB)
14:51:37.895 INFO BlockManagerInfo - Removed broadcast_414_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.4 MiB)
14:51:37.895 INFO BlockManagerInfo - Removed broadcast_409_piece0 on localhost:44923 in memory (size: 1473.0 B, free: 1919.4 MiB)
14:51:37.896 INFO BlockManagerInfo - Removed broadcast_407_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:37.896 INFO BlockManagerInfo - Removed broadcast_404_piece0 on localhost:44923 in memory (size: 228.0 B, free: 1919.5 MiB)
14:51:37.897 INFO BlockManagerInfo - Removed broadcast_408_piece0 on localhost:44923 in memory (size: 1473.0 B, free: 1919.5 MiB)
14:51:37.897 INFO BlockManagerInfo - Removed broadcast_405_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:37.898 INFO BlockManagerInfo - Removed broadcast_416_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.6 MiB)
14:51:37.899 INFO BlockManagerInfo - Removed broadcast_411_piece0 on localhost:44923 in memory (size: 58.1 KiB, free: 1919.7 MiB)
14:51:37.899 INFO BlockManagerInfo - Removed broadcast_415_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.8 MiB)
14:51:37.899 INFO BlockManagerInfo - Removed broadcast_412_piece0 on localhost:44923 in memory (size: 159.0 B, free: 1919.8 MiB)
14:51:37.900 INFO BlockManagerInfo - Removed broadcast_410_piece0 on localhost:44923 in memory (size: 107.3 KiB, free: 1919.9 MiB)
14:51:37.900 INFO BlockManagerInfo - Removed broadcast_413_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1920.0 MiB)
14:51:37.900 INFO BlockManagerInfo - Removed broadcast_406_piece0 on localhost:44923 in memory (size: 228.0 B, free: 1920.0 MiB)
14:51:37.901 INFO MemoryStore - Block broadcast_418_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
14:51:37.901 INFO BlockManagerInfo - Added broadcast_418_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:37.902 INFO SparkContext - Created broadcast 418 from newAPIHadoopFile at PathSplitSource.java:96
14:51:37.923 INFO FileInputFormat - Total input files to process : 1
14:51:37.925 INFO MemoryStore - Block broadcast_419 stored as values in memory (estimated size 160.7 KiB, free 1919.2 MiB)
14:51:37.925 INFO MemoryStore - Block broadcast_419_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1919.2 MiB)
14:51:37.926 INFO BlockManagerInfo - Added broadcast_419_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.9 MiB)
14:51:37.926 INFO SparkContext - Created broadcast 419 from broadcast at ReadsSparkSink.java:133
14:51:37.930 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
14:51:37.930 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:37.930 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:37.948 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:37.949 INFO DAGScheduler - Registering RDD 1003 (mapToPair at SparkUtils.java:161) as input to shuffle 42
14:51:37.949 INFO DAGScheduler - Got job 157 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:37.949 INFO DAGScheduler - Final stage: ResultStage 211 (runJob at SparkHadoopWriter.scala:83)
14:51:37.949 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 210)
14:51:37.949 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 210)
14:51:37.949 INFO DAGScheduler - Submitting ShuffleMapStage 210 (MapPartitionsRDD[1003] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:37.967 INFO MemoryStore - Block broadcast_420 stored as values in memory (estimated size 520.4 KiB, free 1918.6 MiB)
14:51:37.969 INFO MemoryStore - Block broadcast_420_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.5 MiB)
14:51:37.969 INFO BlockManagerInfo - Added broadcast_420_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.7 MiB)
14:51:37.969 INFO SparkContext - Created broadcast 420 from broadcast at DAGScheduler.scala:1580
14:51:37.970 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 210 (MapPartitionsRDD[1003] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:37.970 INFO TaskSchedulerImpl - Adding task set 210.0 with 1 tasks resource profile 0
14:51:37.970 INFO TaskSetManager - Starting task 0.0 in stage 210.0 (TID 266) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:37.970 INFO Executor - Running task 0.0 in stage 210.0 (TID 266)
14:51:38.006 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:38.022 INFO Executor - Finished task 0.0 in stage 210.0 (TID 266). 1148 bytes result sent to driver
14:51:38.022 INFO TaskSetManager - Finished task 0.0 in stage 210.0 (TID 266) in 52 ms on localhost (executor driver) (1/1)
14:51:38.022 INFO TaskSchedulerImpl - Removed TaskSet 210.0, whose tasks have all completed, from pool
14:51:38.023 INFO DAGScheduler - ShuffleMapStage 210 (mapToPair at SparkUtils.java:161) finished in 0.073 s
14:51:38.023 INFO DAGScheduler - looking for newly runnable stages
14:51:38.023 INFO DAGScheduler - running: HashSet()
14:51:38.023 INFO DAGScheduler - waiting: HashSet(ResultStage 211)
14:51:38.023 INFO DAGScheduler - failed: HashSet()
14:51:38.023 INFO DAGScheduler - Submitting ResultStage 211 (MapPartitionsRDD[1009] at saveAsTextFile at SamSink.java:65), which has no missing parents
14:51:38.030 INFO MemoryStore - Block broadcast_421 stored as values in memory (estimated size 241.1 KiB, free 1918.2 MiB)
14:51:38.030 INFO MemoryStore - Block broadcast_421_piece0 stored as bytes in memory (estimated size 66.9 KiB, free 1918.2 MiB)
14:51:38.031 INFO BlockManagerInfo - Added broadcast_421_piece0 in memory on localhost:44923 (size: 66.9 KiB, free: 1919.7 MiB)
14:51:38.031 INFO SparkContext - Created broadcast 421 from broadcast at DAGScheduler.scala:1580
14:51:38.031 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 211 (MapPartitionsRDD[1009] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
14:51:38.031 INFO TaskSchedulerImpl - Adding task set 211.0 with 1 tasks resource profile 0
14:51:38.032 INFO TaskSetManager - Starting task 0.0 in stage 211.0 (TID 267) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:38.032 INFO Executor - Running task 0.0 in stage 211.0 (TID 267)
14:51:38.037 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:38.037 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:38.048 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
14:51:38.049 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:38.049 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:38.069 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451376787281048757120047_1009_m_000000_0' to file:/tmp/ReadsSparkSinkUnitTest6.someOtherPlace479535333609267481/_temporary/0/task_202603041451376787281048757120047_1009_m_000000
14:51:38.069 INFO SparkHadoopMapRedUtil - attempt_202603041451376787281048757120047_1009_m_000000_0: Committed. Elapsed time: 0 ms.
14:51:38.070 INFO Executor - Finished task 0.0 in stage 211.0 (TID 267). 1858 bytes result sent to driver
14:51:38.070 INFO TaskSetManager - Finished task 0.0 in stage 211.0 (TID 267) in 39 ms on localhost (executor driver) (1/1)
14:51:38.070 INFO TaskSchedulerImpl - Removed TaskSet 211.0, whose tasks have all completed, from pool
14:51:38.070 INFO DAGScheduler - ResultStage 211 (runJob at SparkHadoopWriter.scala:83) finished in 0.047 s
14:51:38.071 INFO DAGScheduler - Job 157 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:38.071 INFO TaskSchedulerImpl - Killing all running tasks in stage 211: Stage finished
14:51:38.071 INFO DAGScheduler - Job 157 finished: runJob at SparkHadoopWriter.scala:83, took 0.122145 s
14:51:38.071 INFO SparkHadoopWriter - Start to commit write Job job_202603041451376787281048757120047_1009.
14:51:38.076 INFO SparkHadoopWriter - Write Job job_202603041451376787281048757120047_1009 committed. Elapsed time: 5 ms.
14:51:38.087 INFO HadoopFileSystemWrapper - Concatenating 2 parts to /tmp/ReadsSparkSinkUnitTest62704346229121452570.sam
14:51:38.092 INFO HadoopFileSystemWrapper - Concatenating to /tmp/ReadsSparkSinkUnitTest62704346229121452570.sam done
WARNING 2026-03-04 14:51:38 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2026-03-04 14:51:38 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
14:51:38.096 INFO MemoryStore - Block broadcast_422 stored as values in memory (estimated size 160.7 KiB, free 1918.0 MiB)
14:51:38.096 INFO MemoryStore - Block broadcast_422_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1918.0 MiB)
14:51:38.097 INFO BlockManagerInfo - Added broadcast_422_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.7 MiB)
14:51:38.097 INFO SparkContext - Created broadcast 422 from broadcast at SamSource.java:78
14:51:38.098 INFO MemoryStore - Block broadcast_423 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
14:51:38.104 INFO MemoryStore - Block broadcast_423_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
14:51:38.104 INFO BlockManagerInfo - Added broadcast_423_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:38.105 INFO SparkContext - Created broadcast 423 from newAPIHadoopFile at SamSource.java:108
14:51:38.107 INFO FileInputFormat - Total input files to process : 1
14:51:38.111 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:38.112 INFO DAGScheduler - Got job 158 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:38.112 INFO DAGScheduler - Final stage: ResultStage 212 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:38.112 INFO DAGScheduler - Parents of final stage: List()
14:51:38.112 INFO DAGScheduler - Missing parents: List()
14:51:38.112 INFO DAGScheduler - Submitting ResultStage 212 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:38.112 INFO MemoryStore - Block broadcast_424 stored as values in memory (estimated size 7.5 KiB, free 1917.7 MiB)
14:51:38.113 INFO MemoryStore - Block broadcast_424_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1917.7 MiB)
14:51:38.113 INFO BlockManagerInfo - Added broadcast_424_piece0 in memory on localhost:44923 (size: 3.8 KiB, free: 1919.6 MiB)
14:51:38.113 INFO SparkContext - Created broadcast 424 from broadcast at DAGScheduler.scala:1580
14:51:38.113 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 212 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:38.113 INFO TaskSchedulerImpl - Adding task set 212.0 with 1 tasks resource profile 0
14:51:38.114 INFO TaskSetManager - Starting task 0.0 in stage 212.0 (TID 268) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:38.114 INFO Executor - Running task 0.0 in stage 212.0 (TID 268)
14:51:38.115 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest62704346229121452570.sam:0+847558
14:51:38.135 INFO Executor - Finished task 0.0 in stage 212.0 (TID 268). 651526 bytes result sent to driver
14:51:38.138 INFO TaskSetManager - Finished task 0.0 in stage 212.0 (TID 268) in 24 ms on localhost (executor driver) (1/1)
14:51:38.138 INFO TaskSchedulerImpl - Removed TaskSet 212.0, whose tasks have all completed, from pool
14:51:38.138 INFO DAGScheduler - ResultStage 212 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.026 s
14:51:38.138 INFO DAGScheduler - Job 158 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:38.138 INFO TaskSchedulerImpl - Killing all running tasks in stage 212: Stage finished
14:51:38.138 INFO DAGScheduler - Job 158 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.027104 s
14:51:38.153 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:38.154 INFO DAGScheduler - Got job 159 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:38.154 INFO DAGScheduler - Final stage: ResultStage 213 (count at ReadsSparkSinkUnitTest.java:185)
14:51:38.154 INFO DAGScheduler - Parents of final stage: List()
14:51:38.154 INFO DAGScheduler - Missing parents: List()
14:51:38.154 INFO DAGScheduler - Submitting ResultStage 213 (MapPartitionsRDD[996] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:38.171 INFO MemoryStore - Block broadcast_425 stored as values in memory (estimated size 426.1 KiB, free 1917.2 MiB)
14:51:38.173 INFO MemoryStore - Block broadcast_425_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.1 MiB)
14:51:38.173 INFO BlockManagerInfo - Added broadcast_425_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:38.173 INFO SparkContext - Created broadcast 425 from broadcast at DAGScheduler.scala:1580
14:51:38.173 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 213 (MapPartitionsRDD[996] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:38.173 INFO TaskSchedulerImpl - Adding task set 213.0 with 1 tasks resource profile 0
14:51:38.174 INFO TaskSetManager - Starting task 0.0 in stage 213.0 (TID 269) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:38.174 INFO Executor - Running task 0.0 in stage 213.0 (TID 269)
14:51:38.215 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:38.225 INFO Executor - Finished task 0.0 in stage 213.0 (TID 269). 989 bytes result sent to driver
14:51:38.226 INFO TaskSetManager - Finished task 0.0 in stage 213.0 (TID 269) in 52 ms on localhost (executor driver) (1/1)
14:51:38.226 INFO TaskSchedulerImpl - Removed TaskSet 213.0, whose tasks have all completed, from pool
14:51:38.226 INFO DAGScheduler - ResultStage 213 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.072 s
14:51:38.226 INFO DAGScheduler - Job 159 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:38.226 INFO TaskSchedulerImpl - Killing all running tasks in stage 213: Stage finished
14:51:38.226 INFO DAGScheduler - Job 159 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.072821 s
14:51:38.230 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:38.230 INFO DAGScheduler - Got job 160 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:38.230 INFO DAGScheduler - Final stage: ResultStage 214 (count at ReadsSparkSinkUnitTest.java:185)
14:51:38.230 INFO DAGScheduler - Parents of final stage: List()
14:51:38.230 INFO DAGScheduler - Missing parents: List()
14:51:38.230 INFO DAGScheduler - Submitting ResultStage 214 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:38.231 INFO MemoryStore - Block broadcast_426 stored as values in memory (estimated size 7.4 KiB, free 1917.1 MiB)
14:51:38.232 INFO MemoryStore - Block broadcast_426_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1917.1 MiB)
14:51:38.232 INFO BlockManagerInfo - Added broadcast_426_piece0 in memory on localhost:44923 (size: 3.8 KiB, free: 1919.4 MiB)
14:51:38.232 INFO SparkContext - Created broadcast 426 from broadcast at DAGScheduler.scala:1580
14:51:38.232 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 214 (MapPartitionsRDD[1014] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:38.232 INFO TaskSchedulerImpl - Adding task set 214.0 with 1 tasks resource profile 0
14:51:38.233 INFO TaskSetManager - Starting task 0.0 in stage 214.0 (TID 270) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7809 bytes)
14:51:38.233 INFO Executor - Running task 0.0 in stage 214.0 (TID 270)
14:51:38.234 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest62704346229121452570.sam:0+847558
14:51:38.241 INFO Executor - Finished task 0.0 in stage 214.0 (TID 270). 989 bytes result sent to driver
14:51:38.242 INFO TaskSetManager - Finished task 0.0 in stage 214.0 (TID 270) in 10 ms on localhost (executor driver) (1/1)
14:51:38.242 INFO TaskSchedulerImpl - Removed TaskSet 214.0, whose tasks have all completed, from pool
14:51:38.242 INFO DAGScheduler - ResultStage 214 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.011 s
14:51:38.242 INFO DAGScheduler - Job 160 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:38.242 INFO TaskSchedulerImpl - Killing all running tasks in stage 214: Stage finished
14:51:38.242 INFO DAGScheduler - Job 160 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.012064 s
14:51:38.261 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:38.262 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:38.263 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:38.264 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:38.268 INFO MemoryStore - Block broadcast_427 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
14:51:38.279 INFO MemoryStore - Block broadcast_427_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
14:51:38.279 INFO BlockManagerInfo - Added broadcast_427_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:38.279 INFO SparkContext - Created broadcast 427 from newAPIHadoopFile at PathSplitSource.java:96
14:51:38.301 INFO MemoryStore - Block broadcast_428 stored as values in memory (estimated size 297.9 KiB, free 1916.5 MiB)
14:51:38.308 INFO MemoryStore - Block broadcast_428_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:38.308 INFO BlockManagerInfo - Added broadcast_428_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:38.308 INFO SparkContext - Created broadcast 428 from newAPIHadoopFile at PathSplitSource.java:96
14:51:38.330 INFO FileInputFormat - Total input files to process : 1
14:51:38.332 INFO MemoryStore - Block broadcast_429 stored as values in memory (estimated size 160.7 KiB, free 1916.3 MiB)
14:51:38.332 INFO MemoryStore - Block broadcast_429_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
14:51:38.332 INFO BlockManagerInfo - Added broadcast_429_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:38.333 INFO SparkContext - Created broadcast 429 from broadcast at ReadsSparkSink.java:133
14:51:38.334 INFO MemoryStore - Block broadcast_430 stored as values in memory (estimated size 163.2 KiB, free 1916.1 MiB)
14:51:38.339 INFO BlockManagerInfo - Removed broadcast_425_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.5 MiB)
14:51:38.339 INFO MemoryStore - Block broadcast_430_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
14:51:38.339 INFO BlockManagerInfo - Added broadcast_430_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.5 MiB)
14:51:38.339 INFO BlockManagerInfo - Removed broadcast_424_piece0 on localhost:44923 in memory (size: 3.8 KiB, free: 1919.5 MiB)
14:51:38.339 INFO SparkContext - Created broadcast 430 from broadcast at BamSink.java:76
14:51:38.340 INFO BlockManagerInfo - Removed broadcast_422_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:38.340 INFO BlockManagerInfo - Removed broadcast_417_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:38.341 INFO BlockManagerInfo - Removed broadcast_423_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:38.341 INFO BlockManagerInfo - Removed broadcast_426_piece0 on localhost:44923 in memory (size: 3.8 KiB, free: 1919.6 MiB)
14:51:38.341 INFO BlockManagerInfo - Removed broadcast_428_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:38.342 INFO BlockManagerInfo - Removed broadcast_421_piece0 on localhost:44923 in memory (size: 66.9 KiB, free: 1919.7 MiB)
14:51:38.342 INFO BlockManagerInfo - Removed broadcast_418_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:38.343 INFO BlockManagerInfo - Removed broadcast_420_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.9 MiB)
14:51:38.343 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts dst=null perm=null proto=rpc
14:51:38.343 INFO BlockManagerInfo - Removed broadcast_419_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.9 MiB)
14:51:38.344 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:38.344 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:38.344 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:38.345 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:38.352 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:38.352 INFO DAGScheduler - Registering RDD 1028 (mapToPair at SparkUtils.java:161) as input to shuffle 43
14:51:38.353 INFO DAGScheduler - Got job 161 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:38.353 INFO DAGScheduler - Final stage: ResultStage 216 (runJob at SparkHadoopWriter.scala:83)
14:51:38.353 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 215)
14:51:38.353 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 215)
14:51:38.353 INFO DAGScheduler - Submitting ShuffleMapStage 215 (MapPartitionsRDD[1028] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:38.377 INFO MemoryStore - Block broadcast_431 stored as values in memory (estimated size 520.4 KiB, free 1918.8 MiB)
14:51:38.378 INFO MemoryStore - Block broadcast_431_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1918.7 MiB)
14:51:38.378 INFO BlockManagerInfo - Added broadcast_431_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.8 MiB)
14:51:38.379 INFO SparkContext - Created broadcast 431 from broadcast at DAGScheduler.scala:1580
14:51:38.379 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 215 (MapPartitionsRDD[1028] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:38.379 INFO TaskSchedulerImpl - Adding task set 215.0 with 1 tasks resource profile 0
14:51:38.380 INFO TaskSetManager - Starting task 0.0 in stage 215.0 (TID 271) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:38.380 INFO Executor - Running task 0.0 in stage 215.0 (TID 271)
14:51:38.417 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:38.433 INFO Executor - Finished task 0.0 in stage 215.0 (TID 271). 1148 bytes result sent to driver
14:51:38.434 INFO TaskSetManager - Finished task 0.0 in stage 215.0 (TID 271) in 55 ms on localhost (executor driver) (1/1)
14:51:38.434 INFO TaskSchedulerImpl - Removed TaskSet 215.0, whose tasks have all completed, from pool
14:51:38.434 INFO DAGScheduler - ShuffleMapStage 215 (mapToPair at SparkUtils.java:161) finished in 0.081 s
14:51:38.434 INFO DAGScheduler - looking for newly runnable stages
14:51:38.434 INFO DAGScheduler - running: HashSet()
14:51:38.434 INFO DAGScheduler - waiting: HashSet(ResultStage 216)
14:51:38.434 INFO DAGScheduler - failed: HashSet()
14:51:38.434 INFO DAGScheduler - Submitting ResultStage 216 (MapPartitionsRDD[1033] at mapToPair at BamSink.java:91), which has no missing parents
14:51:38.441 INFO MemoryStore - Block broadcast_432 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
14:51:38.442 INFO MemoryStore - Block broadcast_432_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
14:51:38.442 INFO BlockManagerInfo - Added broadcast_432_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.7 MiB)
14:51:38.442 INFO SparkContext - Created broadcast 432 from broadcast at DAGScheduler.scala:1580
14:51:38.443 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 216 (MapPartitionsRDD[1033] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:38.443 INFO TaskSchedulerImpl - Adding task set 216.0 with 1 tasks resource profile 0
14:51:38.443 INFO TaskSetManager - Starting task 0.0 in stage 216.0 (TID 272) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:38.444 INFO Executor - Running task 0.0 in stage 216.0 (TID 272)
14:51:38.449 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:38.449 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:38.461 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:38.461 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:38.461 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:38.461 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:38.461 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:38.461 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:38.462 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:38.463 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:38.465 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:38.467 INFO StateChange - BLOCK* allocate blk_1073741871_1047, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/part-r-00000
14:51:38.469 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741871_1047 src: /127.0.0.1:38994 dest: /127.0.0.1:34059
14:51:38.472 INFO clienttrace - src: /127.0.0.1:38994, dest: /127.0.0.1:34059, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741871_1047, duration(ns): 1430356
14:51:38.472 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741871_1047, type=LAST_IN_PIPELINE terminating
14:51:38.473 INFO FSNamesystem - BLOCK* blk_1073741871_1047 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/part-r-00000
14:51:38.874 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:38.875 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:38.876 INFO StateChange - BLOCK* allocate blk_1073741872_1048, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/.part-r-00000.sbi
14:51:38.877 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741872_1048 src: /127.0.0.1:39000 dest: /127.0.0.1:34059
14:51:38.879 INFO clienttrace - src: /127.0.0.1:39000, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741872_1048, duration(ns): 646833
14:51:38.879 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741872_1048, type=LAST_IN_PIPELINE terminating
14:51:38.879 INFO FSNamesystem - BLOCK* blk_1073741872_1048 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/.part-r-00000.sbi
14:51:39.280 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:39.282 INFO StateChange - BLOCK* allocate blk_1073741873_1049, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/.part-r-00000.bai
14:51:39.283 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741873_1049 src: /127.0.0.1:39016 dest: /127.0.0.1:34059
14:51:39.285 INFO clienttrace - src: /127.0.0.1:39016, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741873_1049, duration(ns): 520674
14:51:39.285 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741873_1049, type=LAST_IN_PIPELINE terminating
14:51:39.286 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:39.287 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0 dst=null perm=null proto=rpc
14:51:39.288 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0 dst=null perm=null proto=rpc
14:51:39.288 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/task_202603041451388810365870178127583_1033_r_000000 dst=null perm=null proto=rpc
14:51:39.289 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/_temporary/attempt_202603041451388810365870178127583_1033_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/task_202603041451388810365870178127583_1033_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:39.289 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451388810365870178127583_1033_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/task_202603041451388810365870178127583_1033_r_000000
14:51:39.289 INFO SparkHadoopMapRedUtil - attempt_202603041451388810365870178127583_1033_r_000000_0: Committed. Elapsed time: 2 ms.
14:51:39.290 INFO Executor - Finished task 0.0 in stage 216.0 (TID 272). 1858 bytes result sent to driver
14:51:39.290 INFO TaskSetManager - Finished task 0.0 in stage 216.0 (TID 272) in 847 ms on localhost (executor driver) (1/1)
14:51:39.290 INFO TaskSchedulerImpl - Removed TaskSet 216.0, whose tasks have all completed, from pool
14:51:39.290 INFO DAGScheduler - ResultStage 216 (runJob at SparkHadoopWriter.scala:83) finished in 0.855 s
14:51:39.291 INFO DAGScheduler - Job 161 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:39.291 INFO TaskSchedulerImpl - Killing all running tasks in stage 216: Stage finished
14:51:39.291 INFO DAGScheduler - Job 161 finished: runJob at SparkHadoopWriter.scala:83, took 0.938824 s
14:51:39.291 INFO SparkHadoopWriter - Start to commit write Job job_202603041451388810365870178127583_1033.
14:51:39.292 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:39.292 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts dst=null perm=null proto=rpc
14:51:39.293 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/task_202603041451388810365870178127583_1033_r_000000 dst=null perm=null proto=rpc
14:51:39.293 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:39.294 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/task_202603041451388810365870178127583_1033_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.294 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:39.295 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/task_202603041451388810365870178127583_1033_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.295 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:39.296 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary/0/task_202603041451388810365870178127583_1033_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.296 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:39.297 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.298 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:39.298 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.spark-staging-1033 dst=null perm=null proto=rpc
14:51:39.299 INFO SparkHadoopWriter - Write Job job_202603041451388810365870178127583_1033 committed. Elapsed time: 7 ms.
14:51:39.299 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.301 INFO StateChange - BLOCK* allocate blk_1073741874_1050, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/header
14:51:39.302 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741874_1050 src: /127.0.0.1:39032 dest: /127.0.0.1:34059
14:51:39.303 INFO clienttrace - src: /127.0.0.1:39032, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741874_1050, duration(ns): 737853
14:51:39.304 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741874_1050, type=LAST_IN_PIPELINE terminating
14:51:39.304 INFO FSNamesystem - BLOCK* blk_1073741874_1050 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/header
14:51:39.706 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:39.707 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.708 INFO StateChange - BLOCK* allocate blk_1073741875_1051, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/terminator
14:51:39.709 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741875_1051 src: /127.0.0.1:39046 dest: /127.0.0.1:34059
14:51:39.710 INFO clienttrace - src: /127.0.0.1:39046, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741875_1051, duration(ns): 498322
14:51:39.710 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741875_1051, type=LAST_IN_PIPELINE terminating
14:51:39.711 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:39.711 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts dst=null perm=null proto=rpc
14:51:39.712 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.713 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:39.713 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam
14:51:39.714 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.714 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.715 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.715 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.715 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam done
14:51:39.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.716 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.sbi
14:51:39.716 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts dst=null perm=null proto=rpc
14:51:39.717 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.718 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:39.718 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:39.720 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:39.720 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:39.721 INFO StateChange - BLOCK* allocate blk_1073741876_1052, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.sbi
14:51:39.722 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741876_1052 src: /127.0.0.1:39050 dest: /127.0.0.1:34059
14:51:39.724 INFO clienttrace - src: /127.0.0.1:39050, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741876_1052, duration(ns): 465411
14:51:39.724 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741876_1052, type=LAST_IN_PIPELINE terminating
14:51:39.726 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:39.726 INFO IndexFileMerger - Done merging .sbi files
14:51:39.726 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai
14:51:39.727 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts dst=null perm=null proto=rpc
14:51:39.727 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.728 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:39.730 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:39.731 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:39.731 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:39.733 INFO StateChange - BLOCK* allocate blk_1073741877_1053, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai
14:51:39.734 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741877_1053 src: /127.0.0.1:39058 dest: /127.0.0.1:34059
14:51:39.735 INFO clienttrace - src: /127.0.0.1:39058, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741877_1053, duration(ns): 557060
14:51:39.735 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741877_1053, type=LAST_IN_PIPELINE terminating
14:51:39.736 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:39.736 INFO IndexFileMerger - Done merging .bai files
14:51:39.737 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.parts dst=null perm=null proto=rpc
14:51:39.746 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=null proto=rpc
14:51:39.756 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.sbi dst=null perm=null proto=rpc
14:51:39.757 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.sbi dst=null perm=null proto=rpc
14:51:39.757 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.sbi dst=null perm=null proto=rpc
14:51:39.758 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:39.759 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.759 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.760 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.760 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.761 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=null proto=rpc
14:51:39.761 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=null proto=rpc
14:51:39.762 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=null proto=rpc
14:51:39.763 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:39.765 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:39.766 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:39.766 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:39.766 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.sbi dst=null perm=null proto=rpc
14:51:39.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.sbi dst=null perm=null proto=rpc
14:51:39.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.sbi dst=null perm=null proto=rpc
14:51:39.768 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:39.768 INFO MemoryStore - Block broadcast_433 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
14:51:39.769 INFO MemoryStore - Block broadcast_433_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
14:51:39.769 INFO BlockManagerInfo - Added broadcast_433_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.7 MiB)
14:51:39.769 INFO SparkContext - Created broadcast 433 from broadcast at BamSource.java:104
14:51:39.770 INFO MemoryStore - Block broadcast_434 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:39.776 INFO MemoryStore - Block broadcast_434_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:39.776 INFO BlockManagerInfo - Added broadcast_434_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:39.777 INFO SparkContext - Created broadcast 434 from newAPIHadoopFile at PathSplitSource.java:96
14:51:39.786 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.786 INFO FileInputFormat - Total input files to process : 1
14:51:39.787 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.801 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:39.802 INFO DAGScheduler - Got job 162 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:39.802 INFO DAGScheduler - Final stage: ResultStage 217 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:39.802 INFO DAGScheduler - Parents of final stage: List()
14:51:39.802 INFO DAGScheduler - Missing parents: List()
14:51:39.802 INFO DAGScheduler - Submitting ResultStage 217 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:39.808 INFO MemoryStore - Block broadcast_435 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
14:51:39.809 INFO MemoryStore - Block broadcast_435_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
14:51:39.809 INFO BlockManagerInfo - Added broadcast_435_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.6 MiB)
14:51:39.809 INFO SparkContext - Created broadcast 435 from broadcast at DAGScheduler.scala:1580
14:51:39.810 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 217 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:39.810 INFO TaskSchedulerImpl - Adding task set 217.0 with 1 tasks resource profile 0
14:51:39.810 INFO TaskSetManager - Starting task 0.0 in stage 217.0 (TID 273) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:39.810 INFO Executor - Running task 0.0 in stage 217.0 (TID 273)
14:51:39.823 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam:0+237038
14:51:39.824 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.824 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.825 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=null proto=rpc
14:51:39.825 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=null proto=rpc
14:51:39.826 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=null proto=rpc
14:51:39.827 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:39.829 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:39.830 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:39.831 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:39.832 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:39.834 INFO Executor - Finished task 0.0 in stage 217.0 (TID 273). 651483 bytes result sent to driver
14:51:39.836 INFO TaskSetManager - Finished task 0.0 in stage 217.0 (TID 273) in 26 ms on localhost (executor driver) (1/1)
14:51:39.836 INFO TaskSchedulerImpl - Removed TaskSet 217.0, whose tasks have all completed, from pool
14:51:39.836 INFO DAGScheduler - ResultStage 217 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.034 s
14:51:39.837 INFO DAGScheduler - Job 162 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:39.837 INFO TaskSchedulerImpl - Killing all running tasks in stage 217: Stage finished
14:51:39.837 INFO DAGScheduler - Job 162 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.035200 s
14:51:39.846 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:39.846 INFO DAGScheduler - Got job 163 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:39.846 INFO DAGScheduler - Final stage: ResultStage 218 (count at ReadsSparkSinkUnitTest.java:185)
14:51:39.846 INFO DAGScheduler - Parents of final stage: List()
14:51:39.846 INFO DAGScheduler - Missing parents: List()
14:51:39.847 INFO DAGScheduler - Submitting ResultStage 218 (MapPartitionsRDD[1021] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:39.863 INFO MemoryStore - Block broadcast_436 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
14:51:39.865 INFO MemoryStore - Block broadcast_436_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
14:51:39.865 INFO BlockManagerInfo - Added broadcast_436_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:39.865 INFO SparkContext - Created broadcast 436 from broadcast at DAGScheduler.scala:1580
14:51:39.865 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 218 (MapPartitionsRDD[1021] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:39.865 INFO TaskSchedulerImpl - Adding task set 218.0 with 1 tasks resource profile 0
14:51:39.866 INFO TaskSetManager - Starting task 0.0 in stage 218.0 (TID 274) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:39.866 INFO Executor - Running task 0.0 in stage 218.0 (TID 274)
14:51:39.898 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:39.908 INFO Executor - Finished task 0.0 in stage 218.0 (TID 274). 989 bytes result sent to driver
14:51:39.908 INFO TaskSetManager - Finished task 0.0 in stage 218.0 (TID 274) in 42 ms on localhost (executor driver) (1/1)
14:51:39.908 INFO TaskSchedulerImpl - Removed TaskSet 218.0, whose tasks have all completed, from pool
14:51:39.908 INFO DAGScheduler - ResultStage 218 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.061 s
14:51:39.908 INFO DAGScheduler - Job 163 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:39.909 INFO TaskSchedulerImpl - Killing all running tasks in stage 218: Stage finished
14:51:39.909 INFO DAGScheduler - Job 163 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.062516 s
14:51:39.912 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:39.912 INFO DAGScheduler - Got job 164 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:39.912 INFO DAGScheduler - Final stage: ResultStage 219 (count at ReadsSparkSinkUnitTest.java:185)
14:51:39.912 INFO DAGScheduler - Parents of final stage: List()
14:51:39.912 INFO DAGScheduler - Missing parents: List()
14:51:39.912 INFO DAGScheduler - Submitting ResultStage 219 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:39.918 INFO MemoryStore - Block broadcast_437 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
14:51:39.919 INFO MemoryStore - Block broadcast_437_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.1 MiB)
14:51:39.919 INFO BlockManagerInfo - Added broadcast_437_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.4 MiB)
14:51:39.920 INFO SparkContext - Created broadcast 437 from broadcast at DAGScheduler.scala:1580
14:51:39.920 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 219 (MapPartitionsRDD[1039] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:39.920 INFO TaskSchedulerImpl - Adding task set 219.0 with 1 tasks resource profile 0
14:51:39.920 INFO TaskSetManager - Starting task 0.0 in stage 219.0 (TID 275) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:39.921 INFO Executor - Running task 0.0 in stage 219.0 (TID 275)
14:51:39.933 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam:0+237038
14:51:39.933 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.934 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam dst=null perm=null proto=rpc
14:51:39.935 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=null proto=rpc
14:51:39.935 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=null proto=rpc
14:51:39.936 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_73f9ac03-5eb2-442c-ba2d-6fd040f35d59.bam.bai dst=null perm=null proto=rpc
14:51:39.937 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:39.939 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:39.941 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:39.941 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:39.942 INFO Executor - Finished task 0.0 in stage 219.0 (TID 275). 989 bytes result sent to driver
14:51:39.943 INFO TaskSetManager - Finished task 0.0 in stage 219.0 (TID 275) in 23 ms on localhost (executor driver) (1/1)
14:51:39.943 INFO TaskSchedulerImpl - Removed TaskSet 219.0, whose tasks have all completed, from pool
14:51:39.943 INFO DAGScheduler - ResultStage 219 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.030 s
14:51:39.943 INFO DAGScheduler - Job 164 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:39.943 INFO TaskSchedulerImpl - Killing all running tasks in stage 219: Stage finished
14:51:39.943 INFO DAGScheduler - Job 164 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.031122 s
14:51:39.953 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:39.954 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:39.955 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:39.955 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:39.958 INFO MemoryStore - Block broadcast_438 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
14:51:39.964 INFO MemoryStore - Block broadcast_438_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
14:51:39.964 INFO BlockManagerInfo - Added broadcast_438_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:39.964 INFO SparkContext - Created broadcast 438 from newAPIHadoopFile at PathSplitSource.java:96
14:51:39.986 INFO MemoryStore - Block broadcast_439 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
14:51:39.992 INFO MemoryStore - Block broadcast_439_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:39.992 INFO BlockManagerInfo - Added broadcast_439_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:39.993 INFO SparkContext - Created broadcast 439 from newAPIHadoopFile at PathSplitSource.java:96
14:51:40.013 INFO FileInputFormat - Total input files to process : 1
14:51:40.015 INFO MemoryStore - Block broadcast_440 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
14:51:40.016 INFO MemoryStore - Block broadcast_440_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
14:51:40.016 INFO BlockManagerInfo - Added broadcast_440_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:40.017 INFO SparkContext - Created broadcast 440 from broadcast at ReadsSparkSink.java:133
14:51:40.018 INFO MemoryStore - Block broadcast_441 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
14:51:40.019 INFO MemoryStore - Block broadcast_441_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
14:51:40.019 INFO BlockManagerInfo - Added broadcast_441_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:40.019 INFO SparkContext - Created broadcast 441 from broadcast at BamSink.java:76
14:51:40.021 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts dst=null perm=null proto=rpc
14:51:40.021 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:40.021 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:40.021 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:40.022 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:40.029 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:40.029 INFO DAGScheduler - Registering RDD 1053 (mapToPair at SparkUtils.java:161) as input to shuffle 44
14:51:40.029 INFO DAGScheduler - Got job 165 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:40.029 INFO DAGScheduler - Final stage: ResultStage 221 (runJob at SparkHadoopWriter.scala:83)
14:51:40.029 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 220)
14:51:40.029 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 220)
14:51:40.029 INFO DAGScheduler - Submitting ShuffleMapStage 220 (MapPartitionsRDD[1053] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:40.047 INFO MemoryStore - Block broadcast_442 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
14:51:40.052 INFO BlockManagerInfo - Removed broadcast_436_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.4 MiB)
14:51:40.052 INFO MemoryStore - Block broadcast_442_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.9 MiB)
14:51:40.053 INFO BlockManagerInfo - Added broadcast_442_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.3 MiB)
14:51:40.053 INFO BlockManagerInfo - Removed broadcast_433_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1919.3 MiB)
14:51:40.053 INFO SparkContext - Created broadcast 442 from broadcast at DAGScheduler.scala:1580
14:51:40.053 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 220 (MapPartitionsRDD[1053] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:40.053 INFO TaskSchedulerImpl - Adding task set 220.0 with 1 tasks resource profile 0
14:51:40.053 INFO BlockManagerInfo - Removed broadcast_430_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.3 MiB)
14:51:40.054 INFO BlockManagerInfo - Removed broadcast_439_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:40.054 INFO TaskSetManager - Starting task 0.0 in stage 220.0 (TID 276) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:40.054 INFO BlockManagerInfo - Removed broadcast_429_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.3 MiB)
14:51:40.054 INFO Executor - Running task 0.0 in stage 220.0 (TID 276)
14:51:40.055 INFO BlockManagerInfo - Removed broadcast_427_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:40.055 INFO BlockManagerInfo - Removed broadcast_434_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:40.056 INFO BlockManagerInfo - Removed broadcast_431_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.6 MiB)
14:51:40.057 INFO BlockManagerInfo - Removed broadcast_435_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.7 MiB)
14:51:40.057 INFO BlockManagerInfo - Removed broadcast_432_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.7 MiB)
14:51:40.058 INFO BlockManagerInfo - Removed broadcast_437_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.8 MiB)
14:51:40.091 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:40.108 INFO Executor - Finished task 0.0 in stage 220.0 (TID 276). 1148 bytes result sent to driver
14:51:40.108 INFO TaskSetManager - Finished task 0.0 in stage 220.0 (TID 276) in 54 ms on localhost (executor driver) (1/1)
14:51:40.108 INFO TaskSchedulerImpl - Removed TaskSet 220.0, whose tasks have all completed, from pool
14:51:40.108 INFO DAGScheduler - ShuffleMapStage 220 (mapToPair at SparkUtils.java:161) finished in 0.078 s
14:51:40.108 INFO DAGScheduler - looking for newly runnable stages
14:51:40.108 INFO DAGScheduler - running: HashSet()
14:51:40.108 INFO DAGScheduler - waiting: HashSet(ResultStage 221)
14:51:40.108 INFO DAGScheduler - failed: HashSet()
14:51:40.109 INFO DAGScheduler - Submitting ResultStage 221 (MapPartitionsRDD[1058] at mapToPair at BamSink.java:91), which has no missing parents
14:51:40.118 INFO MemoryStore - Block broadcast_443 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
14:51:40.119 INFO MemoryStore - Block broadcast_443_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
14:51:40.119 INFO BlockManagerInfo - Added broadcast_443_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.7 MiB)
14:51:40.119 INFO SparkContext - Created broadcast 443 from broadcast at DAGScheduler.scala:1580
14:51:40.119 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 221 (MapPartitionsRDD[1058] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:40.119 INFO TaskSchedulerImpl - Adding task set 221.0 with 1 tasks resource profile 0
14:51:40.120 INFO TaskSetManager - Starting task 0.0 in stage 221.0 (TID 277) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:40.120 INFO Executor - Running task 0.0 in stage 221.0 (TID 277)
14:51:40.125 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:40.125 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:40.137 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:40.137 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:40.137 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:40.137 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:40.137 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:40.137 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:40.138 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:40.139 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:40.140 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:40.143 INFO StateChange - BLOCK* allocate blk_1073741878_1054, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/part-r-00000
14:51:40.144 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741878_1054 src: /127.0.0.1:39086 dest: /127.0.0.1:34059
14:51:40.147 INFO clienttrace - src: /127.0.0.1:39086, dest: /127.0.0.1:34059, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741878_1054, duration(ns): 967120
14:51:40.147 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741878_1054, type=LAST_IN_PIPELINE terminating
14:51:40.148 INFO FSNamesystem - BLOCK* blk_1073741878_1054 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/part-r-00000
14:51:40.549 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:40.550 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:40.551 INFO StateChange - BLOCK* allocate blk_1073741879_1055, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/.part-r-00000.sbi
14:51:40.552 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741879_1055 src: /127.0.0.1:39102 dest: /127.0.0.1:34059
14:51:40.553 INFO clienttrace - src: /127.0.0.1:39102, dest: /127.0.0.1:34059, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741879_1055, duration(ns): 574903
14:51:40.553 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741879_1055, type=LAST_IN_PIPELINE terminating
14:51:40.554 INFO FSNamesystem - BLOCK* blk_1073741879_1055 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/.part-r-00000.sbi
14:51:40.955 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:40.957 INFO StateChange - BLOCK* allocate blk_1073741880_1056, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/.part-r-00000.bai
14:51:40.958 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741880_1056 src: /127.0.0.1:39108 dest: /127.0.0.1:34059
14:51:40.960 INFO clienttrace - src: /127.0.0.1:39108, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741880_1056, duration(ns): 601545
14:51:40.960 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741880_1056, type=LAST_IN_PIPELINE terminating
14:51:40.961 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:40.962 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0 dst=null perm=null proto=rpc
14:51:40.963 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0 dst=null perm=null proto=rpc
14:51:40.963 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/task_202603041451401519229783475399826_1058_r_000000 dst=null perm=null proto=rpc
14:51:40.964 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/_temporary/attempt_202603041451401519229783475399826_1058_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/task_202603041451401519229783475399826_1058_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:40.964 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451401519229783475399826_1058_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/task_202603041451401519229783475399826_1058_r_000000
14:51:40.964 INFO SparkHadoopMapRedUtil - attempt_202603041451401519229783475399826_1058_r_000000_0: Committed. Elapsed time: 1 ms.
14:51:40.965 INFO Executor - Finished task 0.0 in stage 221.0 (TID 277). 1858 bytes result sent to driver
14:51:40.965 INFO TaskSetManager - Finished task 0.0 in stage 221.0 (TID 277) in 845 ms on localhost (executor driver) (1/1)
14:51:40.965 INFO TaskSchedulerImpl - Removed TaskSet 221.0, whose tasks have all completed, from pool
14:51:40.965 INFO DAGScheduler - ResultStage 221 (runJob at SparkHadoopWriter.scala:83) finished in 0.856 s
14:51:40.966 INFO DAGScheduler - Job 165 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:40.966 INFO TaskSchedulerImpl - Killing all running tasks in stage 221: Stage finished
14:51:40.966 INFO DAGScheduler - Job 165 finished: runJob at SparkHadoopWriter.scala:83, took 0.937153 s
14:51:40.966 INFO SparkHadoopWriter - Start to commit write Job job_202603041451401519229783475399826_1058.
14:51:40.967 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:40.967 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts dst=null perm=null proto=rpc
14:51:40.968 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/task_202603041451401519229783475399826_1058_r_000000 dst=null perm=null proto=rpc
14:51:40.968 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:40.969 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/task_202603041451401519229783475399826_1058_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:40.970 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:40.970 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/task_202603041451401519229783475399826_1058_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:40.971 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:40.971 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary/0/task_202603041451401519229783475399826_1058_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:40.972 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:40.973 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:40.974 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:40.975 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.spark-staging-1058 dst=null perm=null proto=rpc
14:51:40.975 INFO SparkHadoopWriter - Write Job job_202603041451401519229783475399826_1058 committed. Elapsed time: 8 ms.
14:51:40.975 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:40.977 INFO StateChange - BLOCK* allocate blk_1073741881_1057, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/header
14:51:40.978 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741881_1057 src: /127.0.0.1:39120 dest: /127.0.0.1:34059
14:51:40.979 INFO clienttrace - src: /127.0.0.1:39120, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741881_1057, duration(ns): 504243
14:51:40.979 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741881_1057, type=LAST_IN_PIPELINE terminating
14:51:40.980 INFO FSNamesystem - BLOCK* blk_1073741881_1057 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/header
14:51:41.381 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:41.382 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:41.383 INFO StateChange - BLOCK* allocate blk_1073741882_1058, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/terminator
14:51:41.384 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741882_1058 src: /127.0.0.1:39126 dest: /127.0.0.1:34059
14:51:41.385 INFO clienttrace - src: /127.0.0.1:39126, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741882_1058, duration(ns): 506688
14:51:41.385 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741882_1058, type=LAST_IN_PIPELINE terminating
14:51:41.386 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:41.387 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts dst=null perm=null proto=rpc
14:51:41.388 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:41.389 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:41.389 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam
14:51:41.390 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:41.390 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.391 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.391 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:41.392 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam done
14:51:41.392 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.392 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.sbi
14:51:41.392 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts dst=null perm=null proto=rpc
14:51:41.393 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:41.394 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:41.395 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:41.396 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
14:51:41.397 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:41.398 INFO StateChange - BLOCK* allocate blk_1073741883_1059, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.sbi
14:51:41.399 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741883_1059 src: /127.0.0.1:39142 dest: /127.0.0.1:34059
14:51:41.400 INFO clienttrace - src: /127.0.0.1:39142, dest: /127.0.0.1:34059, bytes: 13492, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741883_1059, duration(ns): 618199
14:51:41.400 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741883_1059, type=LAST_IN_PIPELINE terminating
14:51:41.401 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:41.401 INFO IndexFileMerger - Done merging .sbi files
14:51:41.401 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai
14:51:41.402 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts dst=null perm=null proto=rpc
14:51:41.403 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:41.404 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:41.404 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:41.405 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:41.406 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:41.407 INFO StateChange - BLOCK* allocate blk_1073741884_1060, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai
14:51:41.408 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741884_1060 src: /127.0.0.1:39150 dest: /127.0.0.1:34059
14:51:41.409 INFO clienttrace - src: /127.0.0.1:39150, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741884_1060, duration(ns): 550380
14:51:41.410 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741884_1060, type=LAST_IN_PIPELINE terminating
14:51:41.411 INFO FSNamesystem - BLOCK* blk_1073741884_1060 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai
14:51:41.812 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:41.812 INFO IndexFileMerger - Done merging .bai files
14:51:41.813 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.parts dst=null perm=null proto=rpc
14:51:41.822 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=null proto=rpc
14:51:41.835 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.sbi dst=null perm=null proto=rpc
14:51:41.836 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.sbi dst=null perm=null proto=rpc
14:51:41.837 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.sbi dst=null perm=null proto=rpc
14:51:41.838 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
14:51:41.838 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.839 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.839 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.840 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.841 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=null proto=rpc
14:51:41.841 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=null proto=rpc
14:51:41.841 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=null proto=rpc
14:51:41.843 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:41.845 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:41.845 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:41.846 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.sbi dst=null perm=null proto=rpc
14:51:41.846 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.sbi dst=null perm=null proto=rpc
14:51:41.847 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.sbi dst=null perm=null proto=rpc
14:51:41.848 WARN DFSUtil - Unexpected value for data transfer bytes=13600 duration=0
14:51:41.848 INFO MemoryStore - Block broadcast_444 stored as values in memory (estimated size 13.3 KiB, free 1918.3 MiB)
14:51:41.849 INFO MemoryStore - Block broadcast_444_piece0 stored as bytes in memory (estimated size 8.3 KiB, free 1918.3 MiB)
14:51:41.849 INFO BlockManagerInfo - Added broadcast_444_piece0 in memory on localhost:44923 (size: 8.3 KiB, free: 1919.7 MiB)
14:51:41.849 INFO SparkContext - Created broadcast 444 from broadcast at BamSource.java:104
14:51:41.850 INFO MemoryStore - Block broadcast_445 stored as values in memory (estimated size 297.9 KiB, free 1918.0 MiB)
14:51:41.861 INFO MemoryStore - Block broadcast_445_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:41.861 INFO BlockManagerInfo - Added broadcast_445_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:41.861 INFO SparkContext - Created broadcast 445 from newAPIHadoopFile at PathSplitSource.java:96
14:51:41.876 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.877 INFO FileInputFormat - Total input files to process : 1
14:51:41.877 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.892 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:41.893 INFO DAGScheduler - Got job 166 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:41.893 INFO DAGScheduler - Final stage: ResultStage 222 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:41.893 INFO DAGScheduler - Parents of final stage: List()
14:51:41.893 INFO DAGScheduler - Missing parents: List()
14:51:41.893 INFO DAGScheduler - Submitting ResultStage 222 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:41.902 INFO MemoryStore - Block broadcast_446 stored as values in memory (estimated size 148.2 KiB, free 1917.8 MiB)
14:51:41.903 INFO MemoryStore - Block broadcast_446_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
14:51:41.903 INFO BlockManagerInfo - Added broadcast_446_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.6 MiB)
14:51:41.904 INFO SparkContext - Created broadcast 446 from broadcast at DAGScheduler.scala:1580
14:51:41.904 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 222 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:41.904 INFO TaskSchedulerImpl - Adding task set 222.0 with 1 tasks resource profile 0
14:51:41.904 INFO TaskSetManager - Starting task 0.0 in stage 222.0 (TID 278) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:41.905 INFO Executor - Running task 0.0 in stage 222.0 (TID 278)
14:51:41.917 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam:0+237038
14:51:41.918 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.918 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:41.919 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=null proto=rpc
14:51:41.920 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=null proto=rpc
14:51:41.920 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=null proto=rpc
14:51:41.922 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:41.924 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:41.925 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:41.926 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:41.926 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:41.928 INFO Executor - Finished task 0.0 in stage 222.0 (TID 278). 651483 bytes result sent to driver
14:51:41.930 INFO TaskSetManager - Finished task 0.0 in stage 222.0 (TID 278) in 26 ms on localhost (executor driver) (1/1)
14:51:41.930 INFO TaskSchedulerImpl - Removed TaskSet 222.0, whose tasks have all completed, from pool
14:51:41.930 INFO DAGScheduler - ResultStage 222 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.037 s
14:51:41.930 INFO DAGScheduler - Job 166 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:41.930 INFO TaskSchedulerImpl - Killing all running tasks in stage 222: Stage finished
14:51:41.930 INFO DAGScheduler - Job 166 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.037797 s
14:51:41.940 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:41.940 INFO DAGScheduler - Got job 167 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:41.940 INFO DAGScheduler - Final stage: ResultStage 223 (count at ReadsSparkSinkUnitTest.java:185)
14:51:41.940 INFO DAGScheduler - Parents of final stage: List()
14:51:41.940 INFO DAGScheduler - Missing parents: List()
14:51:41.940 INFO DAGScheduler - Submitting ResultStage 223 (MapPartitionsRDD[1046] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:41.957 INFO MemoryStore - Block broadcast_447 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
14:51:41.958 INFO MemoryStore - Block broadcast_447_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
14:51:41.958 INFO BlockManagerInfo - Added broadcast_447_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.4 MiB)
14:51:41.959 INFO SparkContext - Created broadcast 447 from broadcast at DAGScheduler.scala:1580
14:51:41.959 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 223 (MapPartitionsRDD[1046] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:41.959 INFO TaskSchedulerImpl - Adding task set 223.0 with 1 tasks resource profile 0
14:51:41.959 INFO TaskSetManager - Starting task 0.0 in stage 223.0 (TID 279) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:41.960 INFO Executor - Running task 0.0 in stage 223.0 (TID 279)
14:51:41.992 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:42.002 INFO Executor - Finished task 0.0 in stage 223.0 (TID 279). 989 bytes result sent to driver
14:51:42.003 INFO TaskSetManager - Finished task 0.0 in stage 223.0 (TID 279) in 44 ms on localhost (executor driver) (1/1)
14:51:42.003 INFO TaskSchedulerImpl - Removed TaskSet 223.0, whose tasks have all completed, from pool
14:51:42.003 INFO DAGScheduler - ResultStage 223 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.063 s
14:51:42.003 INFO DAGScheduler - Job 167 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:42.003 INFO TaskSchedulerImpl - Killing all running tasks in stage 223: Stage finished
14:51:42.003 INFO DAGScheduler - Job 167 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.063576 s
14:51:42.007 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:42.007 INFO DAGScheduler - Got job 168 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:42.007 INFO DAGScheduler - Final stage: ResultStage 224 (count at ReadsSparkSinkUnitTest.java:185)
14:51:42.007 INFO DAGScheduler - Parents of final stage: List()
14:51:42.007 INFO DAGScheduler - Missing parents: List()
14:51:42.007 INFO DAGScheduler - Submitting ResultStage 224 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:42.013 INFO MemoryStore - Block broadcast_448 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
14:51:42.014 INFO MemoryStore - Block broadcast_448_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.0 MiB)
14:51:42.014 INFO BlockManagerInfo - Added broadcast_448_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.4 MiB)
14:51:42.014 INFO SparkContext - Created broadcast 448 from broadcast at DAGScheduler.scala:1580
14:51:42.015 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 224 (MapPartitionsRDD[1064] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:42.015 INFO TaskSchedulerImpl - Adding task set 224.0 with 1 tasks resource profile 0
14:51:42.015 INFO TaskSetManager - Starting task 0.0 in stage 224.0 (TID 280) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:42.016 INFO Executor - Running task 0.0 in stage 224.0 (TID 280)
14:51:42.028 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam:0+237038
14:51:42.029 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:42.030 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam dst=null perm=null proto=rpc
14:51:42.031 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=null proto=rpc
14:51:42.031 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=null proto=rpc
14:51:42.032 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_c9b4dff3-13b5-4caa-96ce-dbe48e2bb056.bam.bai dst=null perm=null proto=rpc
14:51:42.035 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:42.036 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:42.037 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:42.037 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:42.039 INFO Executor - Finished task 0.0 in stage 224.0 (TID 280). 989 bytes result sent to driver
14:51:42.039 INFO TaskSetManager - Finished task 0.0 in stage 224.0 (TID 280) in 24 ms on localhost (executor driver) (1/1)
14:51:42.039 INFO TaskSchedulerImpl - Removed TaskSet 224.0, whose tasks have all completed, from pool
14:51:42.040 INFO DAGScheduler - ResultStage 224 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.033 s
14:51:42.040 INFO DAGScheduler - Job 168 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:42.040 INFO TaskSchedulerImpl - Killing all running tasks in stage 224: Stage finished
14:51:42.040 INFO DAGScheduler - Job 168 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.033256 s
14:51:42.054 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:42.055 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:42.056 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:42.056 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:42.059 INFO MemoryStore - Block broadcast_449 stored as values in memory (estimated size 297.9 KiB, free 1916.7 MiB)
14:51:42.065 INFO MemoryStore - Block broadcast_449_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
14:51:42.066 INFO BlockManagerInfo - Added broadcast_449_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:42.066 INFO SparkContext - Created broadcast 449 from newAPIHadoopFile at PathSplitSource.java:96
14:51:42.089 INFO MemoryStore - Block broadcast_450 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
14:51:42.095 INFO MemoryStore - Block broadcast_450_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.3 MiB)
14:51:42.095 INFO BlockManagerInfo - Added broadcast_450_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:42.095 INFO SparkContext - Created broadcast 450 from newAPIHadoopFile at PathSplitSource.java:96
14:51:42.116 INFO FileInputFormat - Total input files to process : 1
14:51:42.118 INFO MemoryStore - Block broadcast_451 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
14:51:42.119 INFO MemoryStore - Block broadcast_451_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
14:51:42.119 INFO BlockManagerInfo - Added broadcast_451_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:42.119 INFO SparkContext - Created broadcast 451 from broadcast at ReadsSparkSink.java:133
14:51:42.120 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:42.121 INFO MemoryStore - Block broadcast_452 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
14:51:42.122 INFO MemoryStore - Block broadcast_452_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
14:51:42.122 INFO BlockManagerInfo - Added broadcast_452_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:42.122 INFO SparkContext - Created broadcast 452 from broadcast at BamSink.java:76
14:51:42.125 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts dst=null perm=null proto=rpc
14:51:42.125 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:42.125 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:42.125 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:42.126 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:42.132 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:42.133 INFO DAGScheduler - Registering RDD 1078 (mapToPair at SparkUtils.java:161) as input to shuffle 45
14:51:42.133 INFO DAGScheduler - Got job 169 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:42.133 INFO DAGScheduler - Final stage: ResultStage 226 (runJob at SparkHadoopWriter.scala:83)
14:51:42.133 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 225)
14:51:42.133 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 225)
14:51:42.134 INFO DAGScheduler - Submitting ShuffleMapStage 225 (MapPartitionsRDD[1078] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:42.158 INFO MemoryStore - Block broadcast_453 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
14:51:42.159 INFO MemoryStore - Block broadcast_453_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.3 MiB)
14:51:42.159 INFO BlockManagerInfo - Added broadcast_453_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.1 MiB)
14:51:42.160 INFO SparkContext - Created broadcast 453 from broadcast at DAGScheduler.scala:1580
14:51:42.160 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 225 (MapPartitionsRDD[1078] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:42.160 INFO TaskSchedulerImpl - Adding task set 225.0 with 1 tasks resource profile 0
14:51:42.160 INFO TaskSetManager - Starting task 0.0 in stage 225.0 (TID 281) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:42.161 INFO Executor - Running task 0.0 in stage 225.0 (TID 281)
14:51:42.195 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:42.210 INFO Executor - Finished task 0.0 in stage 225.0 (TID 281). 1148 bytes result sent to driver
14:51:42.210 INFO TaskSetManager - Finished task 0.0 in stage 225.0 (TID 281) in 50 ms on localhost (executor driver) (1/1)
14:51:42.210 INFO TaskSchedulerImpl - Removed TaskSet 225.0, whose tasks have all completed, from pool
14:51:42.211 INFO DAGScheduler - ShuffleMapStage 225 (mapToPair at SparkUtils.java:161) finished in 0.076 s
14:51:42.211 INFO DAGScheduler - looking for newly runnable stages
14:51:42.211 INFO DAGScheduler - running: HashSet()
14:51:42.211 INFO DAGScheduler - waiting: HashSet(ResultStage 226)
14:51:42.211 INFO DAGScheduler - failed: HashSet()
14:51:42.211 INFO DAGScheduler - Submitting ResultStage 226 (MapPartitionsRDD[1083] at mapToPair at BamSink.java:91), which has no missing parents
14:51:42.222 INFO MemoryStore - Block broadcast_454 stored as values in memory (estimated size 241.5 KiB, free 1915.1 MiB)
14:51:42.226 INFO BlockManagerInfo - Removed broadcast_448_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.2 MiB)
14:51:42.227 INFO MemoryStore - Block broadcast_454_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.2 MiB)
14:51:42.227 INFO BlockManagerInfo - Added broadcast_454_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.1 MiB)
14:51:42.227 INFO BlockManagerInfo - Removed broadcast_442_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.3 MiB)
14:51:42.227 INFO SparkContext - Created broadcast 454 from broadcast at DAGScheduler.scala:1580
14:51:42.227 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 226 (MapPartitionsRDD[1083] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:42.227 INFO TaskSchedulerImpl - Adding task set 226.0 with 1 tasks resource profile 0
14:51:42.227 INFO BlockManagerInfo - Removed broadcast_447_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.4 MiB)
14:51:42.228 INFO BlockManagerInfo - Removed broadcast_450_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:42.228 INFO TaskSetManager - Starting task 0.0 in stage 226.0 (TID 282) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:42.228 INFO Executor - Running task 0.0 in stage 226.0 (TID 282)
14:51:42.229 INFO BlockManagerInfo - Removed broadcast_445_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:42.230 INFO BlockManagerInfo - Removed broadcast_443_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.6 MiB)
14:51:42.231 INFO BlockManagerInfo - Removed broadcast_438_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:42.231 INFO BlockManagerInfo - Removed broadcast_440_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.6 MiB)
14:51:42.232 INFO BlockManagerInfo - Removed broadcast_446_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.7 MiB)
14:51:42.232 INFO BlockManagerInfo - Removed broadcast_444_piece0 on localhost:44923 in memory (size: 8.3 KiB, free: 1919.7 MiB)
14:51:42.233 INFO BlockManagerInfo - Removed broadcast_441_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.7 MiB)
14:51:42.234 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:42.234 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:42.252 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:42.252 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:42.252 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:42.252 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:42.252 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:42.252 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:42.253 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:42.254 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:42.257 INFO StateChange - BLOCK* allocate blk_1073741885_1061, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0/part-r-00000
14:51:42.258 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741885_1061 src: /127.0.0.1:39160 dest: /127.0.0.1:34059
14:51:42.260 INFO clienttrace - src: /127.0.0.1:39160, dest: /127.0.0.1:34059, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741885_1061, duration(ns): 1090752
14:51:42.260 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741885_1061, type=LAST_IN_PIPELINE terminating
14:51:42.260 INFO FSNamesystem - BLOCK* blk_1073741885_1061 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0/part-r-00000
14:51:42.661 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:42.662 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:42.664 INFO StateChange - BLOCK* allocate blk_1073741886_1062, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0/.part-r-00000.bai
14:51:42.665 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741886_1062 src: /127.0.0.1:39168 dest: /127.0.0.1:34059
14:51:42.666 INFO clienttrace - src: /127.0.0.1:39168, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741886_1062, duration(ns): 533323
14:51:42.666 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741886_1062, type=LAST_IN_PIPELINE terminating
14:51:42.667 INFO FSNamesystem - BLOCK* blk_1073741886_1062 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0/.part-r-00000.bai
14:51:43.031 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741872_1048 replica FinalizedReplica, blk_1073741872_1048, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741872 for deletion
14:51:43.032 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741873_1049 replica FinalizedReplica, blk_1073741873_1049, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741873 for deletion
14:51:43.032 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741879_1055 replica FinalizedReplica, blk_1073741879_1055, FINALIZED
  getNumBytes()     = 13492
  getBytesOnDisk()  = 13492
  getVisibleLength()= 13492
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741879 for deletion
14:51:43.032 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741880_1056 replica FinalizedReplica, blk_1073741880_1056, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741880 for deletion
14:51:43.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741872_1048 URI file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741872
14:51:43.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741873_1049 URI file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741873
14:51:43.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741880_1056 URI file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741880
14:51:43.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741879_1055 URI file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741879
14:51:43.067 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:43.069 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0 dst=null perm=null proto=rpc
14:51:43.069 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0 dst=null perm=null proto=rpc
14:51:43.070 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/task_202603041451427712173661158852466_1083_r_000000 dst=null perm=null proto=rpc
14:51:43.070 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/_temporary/attempt_202603041451427712173661158852466_1083_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/task_202603041451427712173661158852466_1083_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:43.071 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451427712173661158852466_1083_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/task_202603041451427712173661158852466_1083_r_000000
14:51:43.071 INFO SparkHadoopMapRedUtil - attempt_202603041451427712173661158852466_1083_r_000000_0: Committed. Elapsed time: 1 ms.
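[editor's note: the rename sequence above (attempt directory promoted to a task directory, then task files moved into the destination at job commit) is the Hadoop FileOutputCommitter v1 protocol. A minimal sketch of the same rename choreography, simulated with hypothetical local paths rather than the HDFS paths in the log:]

```python
import shutil, tempfile
from pathlib import Path

# Simulated FileOutputCommitter v1 layout (hypothetical local stand-in for HDFS).
dest = Path(tempfile.mkdtemp()) / "out.parts"
attempt = dest / "_temporary" / "0" / "_temporary" / "attempt_0_r_000000_0"
task = dest / "_temporary" / "0" / "task_0_r_000000"

# The task writes its output into its private attempt directory.
attempt.mkdir(parents=True)
(attempt / "part-r-00000").write_bytes(b"records")

# commitTask: rename the attempt directory to the task directory.
attempt.rename(task)

# commitJob: promote task outputs into the destination, remove _temporary,
# and write the _SUCCESS marker, as in the audit entries above.
for f in task.iterdir():
    f.rename(dest / f.name)
shutil.rmtree(dest / "_temporary")
(dest / "_SUCCESS").touch()

print(sorted(p.name for p in dest.iterdir()))  # ['_SUCCESS', 'part-r-00000']
```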
14:51:43.071 INFO Executor - Finished task 0.0 in stage 226.0 (TID 282). 1858 bytes result sent to driver
14:51:43.072 INFO TaskSetManager - Finished task 0.0 in stage 226.0 (TID 282) in 843 ms on localhost (executor driver) (1/1)
14:51:43.072 INFO TaskSchedulerImpl - Removed TaskSet 226.0, whose tasks have all completed, from pool
14:51:43.072 INFO DAGScheduler - ResultStage 226 (runJob at SparkHadoopWriter.scala:83) finished in 0.861 s
14:51:43.072 INFO DAGScheduler - Job 169 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:43.072 INFO TaskSchedulerImpl - Killing all running tasks in stage 226: Stage finished
14:51:43.072 INFO DAGScheduler - Job 169 finished: runJob at SparkHadoopWriter.scala:83, took 0.939787 s
14:51:43.072 INFO SparkHadoopWriter - Start to commit write Job job_202603041451427712173661158852466_1083.
14:51:43.073 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:43.073 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts dst=null perm=null proto=rpc
14:51:43.074 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/task_202603041451427712173661158852466_1083_r_000000 dst=null perm=null proto=rpc
14:51:43.074 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:43.075 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/task_202603041451427712173661158852466_1083_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:43.075 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:43.076 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary/0/task_202603041451427712173661158852466_1083_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:43.076 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:43.077 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:43.078 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:43.078 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/.spark-staging-1083 dst=null perm=null proto=rpc
14:51:43.078 INFO SparkHadoopWriter - Write Job job_202603041451427712173661158852466_1083 committed. Elapsed time: 6 ms.
14:51:43.079 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:43.080 INFO StateChange - BLOCK* allocate blk_1073741887_1063, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/header
14:51:43.081 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741887_1063 src: /127.0.0.1:39178 dest: /127.0.0.1:34059
14:51:43.083 INFO clienttrace - src: /127.0.0.1:39178, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741887_1063, duration(ns): 535402
14:51:43.083 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741887_1063, type=LAST_IN_PIPELINE terminating
14:51:43.083 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:43.084 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:43.085 INFO StateChange - BLOCK* allocate blk_1073741888_1064, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/terminator
14:51:43.086 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741888_1064 src: /127.0.0.1:39188 dest: /127.0.0.1:34059
14:51:43.087 INFO clienttrace - src: /127.0.0.1:39188, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741888_1064, duration(ns): 457043
14:51:43.088 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741888_1064, type=LAST_IN_PIPELINE terminating
14:51:43.088 INFO FSNamesystem - BLOCK* blk_1073741888_1064 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/terminator
14:51:43.489 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:43.490 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts dst=null perm=null proto=rpc
14:51:43.491 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:43.492 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:43.492 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam
14:51:43.492 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:43.493 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:43.493 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:43.494 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:43.494 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam done
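[editor's note: the `concat` audit entry shows how the final BAM is assembled: a `header` file, the reducer part file(s), and a `terminator` are joined into `output`, which is then renamed over the destination `.bam`. HDFS `FileSystem.concat` does this server-side without copying bytes; the local-filesystem sketch below (hypothetical names and placeholder contents) only mirrors the orchestration:]

```python
import tempfile
from pathlib import Path

parts_dir = Path(tempfile.mkdtemp())  # stand-in for the .bam.parts directory

# Placeholder contents; in the log these are real BGZF blocks
# (the terminator there is a 28-byte block, matching the clienttrace entry).
(parts_dir / "header").write_bytes(b"HEADER")
(parts_dir / "part-r-00000").write_bytes(b"RECORDS")
(parts_dir / "terminator").write_bytes(b"EOF")

# Concatenate header + parts + terminator into 'output', then rename it
# over the final destination, mirroring concat followed by rename above.
output = parts_dir / "output"
with output.open("wb") as out:
    for name in ["header", "part-r-00000", "terminator"]:
        out.write((parts_dir / name).read_bytes())

final = parts_dir.parent / "result.bam"
output.rename(final)
print(final.read_bytes())  # b'HEADERRECORDSEOF'
```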
14:51:43.494 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:43.494 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai
14:51:43.495 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts dst=null perm=null proto=rpc
14:51:43.496 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:43.496 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:43.497 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:43.498 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:43.498 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:43.499 INFO StateChange - BLOCK* allocate blk_1073741889_1065, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai
14:51:43.500 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741889_1065 src: /127.0.0.1:39190 dest: /127.0.0.1:34059
14:51:43.501 INFO clienttrace - src: /127.0.0.1:39190, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741889_1065, duration(ns): 447102
14:51:43.501 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741889_1065, type=LAST_IN_PIPELINE terminating
14:51:43.502 INFO FSNamesystem - BLOCK* blk_1073741889_1065 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai
14:51:43.903 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:43.903 INFO IndexFileMerger - Done merging .bai files
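[editor's note: IndexFileMerger's steps are visible in the audit trail: list the `.parts` directory, open each hidden `.part-*.bai` shard, stream it into the destination `.bam.bai`, and delete the shard. The sketch below shows only that file-level orchestration with hypothetical shard names; a real `.bai` merge also rewrites the binary index contents, which is not attempted here:]

```python
import tempfile
from pathlib import Path

parts_dir = Path(tempfile.mkdtemp())
# Hypothetical per-task index shards, named like the hidden .part-r-00000.bai above.
(parts_dir / ".part-r-00000.bai").write_bytes(b"IDX0")
(parts_dir / ".part-r-00001.bai").write_bytes(b"IDX1")

# Stream each shard into the merged index in task order, deleting shards
# as they are consumed (the log shows a delete right after each open).
merged = parts_dir.parent / "merged.bam.bai"
with merged.open("wb") as out:
    for shard in sorted(parts_dir.glob(".part-*.bai")):
        out.write(shard.read_bytes())
        shard.unlink()

print(merged.read_bytes())  # b'IDX0IDX1'
```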
14:51:43.904 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.parts dst=null perm=null proto=rpc
14:51:43.914 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:43.915 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:43.915 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:43.916 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:43.916 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:43.917 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:43.917 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:43.918 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:43.919 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:43.921 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:43.921 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:43.922 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:43.922 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.sbi dst=null perm=null proto=rpc
14:51:43.923 INFO MemoryStore - Block broadcast_455 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:43.930 INFO MemoryStore - Block broadcast_455_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:43.930 INFO BlockManagerInfo - Added broadcast_455_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:43.930 INFO SparkContext - Created broadcast 455 from newAPIHadoopFile at PathSplitSource.java:96
14:51:43.952 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:43.952 INFO FileInputFormat - Total input files to process : 1
14:51:43.952 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:43.994 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:43.994 INFO DAGScheduler - Got job 170 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:43.994 INFO DAGScheduler - Final stage: ResultStage 227 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:43.994 INFO DAGScheduler - Parents of final stage: List()
14:51:43.994 INFO DAGScheduler - Missing parents: List()
14:51:43.995 INFO DAGScheduler - Submitting ResultStage 227 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:44.012 INFO MemoryStore - Block broadcast_456 stored as values in memory (estimated size 426.2 KiB, free 1917.6 MiB)
14:51:44.013 INFO MemoryStore - Block broadcast_456_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.4 MiB)
14:51:44.013 INFO BlockManagerInfo - Added broadcast_456_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:44.013 INFO SparkContext - Created broadcast 456 from broadcast at DAGScheduler.scala:1580
14:51:44.014 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 227 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:44.014 INFO TaskSchedulerImpl - Adding task set 227.0 with 1 tasks resource profile 0
14:51:44.014 INFO TaskSetManager - Starting task 0.0 in stage 227.0 (TID 283) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:44.014 INFO Executor - Running task 0.0 in stage 227.0 (TID 283)
14:51:44.046 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam:0+237038
14:51:44.046 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.047 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.048 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:44.049 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.049 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.050 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.050 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.051 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.053 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:44.055 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:44.056 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:44.056 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.056 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.057 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.063 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.064 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.066 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.067 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.068 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.069 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.070 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.070 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.071 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.072 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.073 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.074 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.074 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.075 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.075 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.076 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.077 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.077 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.078 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.079 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.079 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.080 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.081 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.081 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.082 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.083 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.084 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.084 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.085 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.086 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.086 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.087 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.088 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.089 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.090 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.090 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.091 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.092 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.093 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.094 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.095 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.095 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.096 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.097 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.097 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.098 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.099 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.099 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.100 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.101 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.102 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.103 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.104 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.106 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.107 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.109 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.110 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.111 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.112 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.113 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.113 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.113 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.114 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.115 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.115 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.115 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.117 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:44.119 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:44.119 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:44.120 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.121 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:44.123 INFO Executor - Finished task 0.0 in stage 227.0 (TID 283). 651483 bytes result sent to driver
14:51:44.124 INFO TaskSetManager - Finished task 0.0 in stage 227.0 (TID 283) in 110 ms on localhost (executor driver) (1/1)
14:51:44.125 INFO TaskSchedulerImpl - Removed TaskSet 227.0, whose tasks have all completed, from pool
14:51:44.125 INFO DAGScheduler - ResultStage 227 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.130 s
14:51:44.125 INFO DAGScheduler - Job 170 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:44.125 INFO TaskSchedulerImpl - Killing all running tasks in stage 227: Stage finished
14:51:44.125 INFO DAGScheduler - Job 170 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.131198 s
14:51:44.135 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:44.135 INFO DAGScheduler - Got job 171 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:44.135 INFO DAGScheduler - Final stage: ResultStage 228 (count at ReadsSparkSinkUnitTest.java:185)
14:51:44.135 INFO DAGScheduler - Parents of final stage: List()
14:51:44.135 INFO DAGScheduler - Missing parents: List()
14:51:44.135 INFO DAGScheduler - Submitting ResultStage 228 (MapPartitionsRDD[1071] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:44.152 INFO MemoryStore - Block broadcast_457 stored as values in memory (estimated size 426.1 KiB, free 1917.0 MiB)
14:51:44.154 INFO MemoryStore - Block broadcast_457_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.9 MiB)
14:51:44.154 INFO BlockManagerInfo - Added broadcast_457_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.4 MiB)
14:51:44.154 INFO SparkContext - Created broadcast 457 from broadcast at DAGScheduler.scala:1580
14:51:44.154 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 228 (MapPartitionsRDD[1071] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:44.154 INFO TaskSchedulerImpl - Adding task set 228.0 with 1 tasks resource profile 0
14:51:44.155 INFO TaskSetManager - Starting task 0.0 in stage 228.0 (TID 284) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:44.155 INFO Executor - Running task 0.0 in stage 228.0 (TID 284)
14:51:44.188 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:44.198 INFO Executor - Finished task 0.0 in stage 228.0 (TID 284). 989 bytes result sent to driver
14:51:44.198 INFO TaskSetManager - Finished task 0.0 in stage 228.0 (TID 284) in 43 ms on localhost (executor driver) (1/1)
14:51:44.198 INFO TaskSchedulerImpl - Removed TaskSet 228.0, whose tasks have all completed, from pool
14:51:44.199 INFO DAGScheduler - ResultStage 228 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.063 s
14:51:44.199 INFO DAGScheduler - Job 171 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:44.199 INFO TaskSchedulerImpl - Killing all running tasks in stage 228: Stage finished
14:51:44.199 INFO DAGScheduler - Job 171 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.064095 s
14:51:44.202 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:44.203 INFO DAGScheduler - Got job 172 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:44.203 INFO DAGScheduler - Final stage: ResultStage 229 (count at ReadsSparkSinkUnitTest.java:185)
14:51:44.203 INFO DAGScheduler - Parents of final stage: List()
14:51:44.203 INFO DAGScheduler - Missing parents: List()
14:51:44.203 INFO DAGScheduler - Submitting ResultStage 229 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:44.220 INFO MemoryStore - Block broadcast_458 stored as values in memory (estimated size 426.1 KiB, free 1916.5 MiB)
14:51:44.221 INFO MemoryStore - Block broadcast_458_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1916.3 MiB)
14:51:44.221 INFO BlockManagerInfo - Added broadcast_458_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.2 MiB)
14:51:44.222 INFO SparkContext - Created broadcast 458 from broadcast at DAGScheduler.scala:1580
14:51:44.222 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 229 (MapPartitionsRDD[1090] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:44.222 INFO TaskSchedulerImpl - Adding task set 229.0 with 1 tasks resource profile 0
14:51:44.222 INFO TaskSetManager - Starting task 0.0 in stage 229.0 (TID 285) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:44.223 INFO Executor - Running task 0.0 in stage 229.0 (TID 285)
14:51:44.261 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam:0+237038
14:51:44.261 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.262 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.264 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.264 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.265 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.266 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.266 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.269 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:44.270 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:44.270 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.271 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.271 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.272 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:44.276 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.277 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.278 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.279 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.280 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.281 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.282 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.282 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.283 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.284 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.284 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.285 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.285 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.286 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.287 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.288 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.289 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.291 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.291 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.292 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.293 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.293 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.294 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.294 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.295 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.296 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.296 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.297 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.298 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.299 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.300 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.301 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.301 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.302 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.303 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.304 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.305 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.305 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.306 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.307 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.308 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.309 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.309 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.310 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.311 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.312 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.313 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.314 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.315 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.317 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.318 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.319 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.320 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.321 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.322 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.323 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:44.324 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.324 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.325 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam dst=null perm=null proto=rpc
14:51:44.326 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.326 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.327 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_e344e4a0-eb33-495d-8c06-c5291625c41b.bam.bai dst=null perm=null proto=rpc
14:51:44.329 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:44.332 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:44.334 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:44.334 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:44.336 INFO Executor - Finished task 0.0 in stage 229.0 (TID 285). 989 bytes result sent to driver
14:51:44.337 INFO TaskSetManager - Finished task 0.0 in stage 229.0 (TID 285) in 115 ms on localhost (executor driver) (1/1)
14:51:44.337 INFO TaskSchedulerImpl - Removed TaskSet 229.0, whose tasks have all completed, from pool
14:51:44.337 INFO DAGScheduler - ResultStage 229 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.134 s
14:51:44.337 INFO DAGScheduler - Job 172 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:44.337 INFO TaskSchedulerImpl - Killing all running tasks in stage 229: Stage finished
14:51:44.337 INFO DAGScheduler - Job 172 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.134862 s
14:51:44.348 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:44.349 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:44.350 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:44.351 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:44.354 INFO MemoryStore - Block broadcast_459 stored as values in memory (estimated size 297.9 KiB, free 1916.0 MiB)
14:51:44.361 INFO MemoryStore - Block broadcast_459_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.0 MiB)
14:51:44.361 INFO BlockManagerInfo - Added broadcast_459_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.2 MiB)
14:51:44.362 INFO SparkContext - Created broadcast 459 from newAPIHadoopFile at PathSplitSource.java:96
14:51:44.383 INFO MemoryStore - Block broadcast_460 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
14:51:44.389 INFO MemoryStore - Block broadcast_460_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1915.6 MiB)
14:51:44.390 INFO BlockManagerInfo - Added broadcast_460_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.1 MiB)
14:51:44.390 INFO SparkContext - Created broadcast 460 from newAPIHadoopFile at PathSplitSource.java:96
14:51:44.411 INFO FileInputFormat - Total input files to process : 1
14:51:44.413 INFO MemoryStore - Block broadcast_461 stored as values in memory (estimated size 160.7 KiB, free 1915.5 MiB)
14:51:44.414 INFO MemoryStore - Block broadcast_461_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.5 MiB)
14:51:44.414 INFO BlockManagerInfo - Added broadcast_461_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.1 MiB)
14:51:44.414 INFO SparkContext - Created broadcast 461 from broadcast at ReadsSparkSink.java:133
14:51:44.414 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:44.415 INFO MemoryStore - Block broadcast_462 stored as values in memory (estimated size 163.2 KiB, free 1915.3 MiB)
14:51:44.416 INFO MemoryStore - Block broadcast_462_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1915.3 MiB)
14:51:44.416 INFO BlockManagerInfo - Added broadcast_462_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.1 MiB)
14:51:44.416 INFO SparkContext - Created broadcast 462 from broadcast at BamSink.java:76
14:51:44.418 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts dst=null perm=null proto=rpc
14:51:44.419 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:44.419 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:44.419 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:44.420 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:44.426 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:44.426 INFO DAGScheduler - Registering RDD 1104 (mapToPair at SparkUtils.java:161) as input to shuffle 46
14:51:44.426 INFO DAGScheduler - Got job 173 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:44.426 INFO DAGScheduler - Final stage: ResultStage 231 (runJob at SparkHadoopWriter.scala:83)
14:51:44.426 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 230)
14:51:44.426 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 230)
14:51:44.427 INFO DAGScheduler - Submitting ShuffleMapStage 230 (MapPartitionsRDD[1104] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:44.444 INFO MemoryStore - Block broadcast_463 stored as values in memory (estimated size 520.4 KiB, free 1914.8 MiB)
14:51:44.445 INFO MemoryStore - Block broadcast_463_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1914.6 MiB)
14:51:44.445 INFO BlockManagerInfo - Added broadcast_463_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1918.9 MiB)
14:51:44.446 INFO SparkContext - Created broadcast 463 from broadcast at DAGScheduler.scala:1580
14:51:44.446 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 230 (MapPartitionsRDD[1104] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:44.446 INFO TaskSchedulerImpl - Adding task set 230.0 with 1 tasks resource profile 0
14:51:44.446 INFO TaskSetManager - Starting task 0.0 in stage 230.0 (TID 286) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:44.447 INFO Executor - Running task 0.0 in stage 230.0 (TID 286)
14:51:44.482 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:44.488 INFO BlockManagerInfo - Removed broadcast_458_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.1 MiB)
14:51:44.488 INFO BlockManagerInfo - Removed broadcast_454_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.1 MiB)
14:51:44.489 INFO BlockManagerInfo - Removed broadcast_451_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:44.489 INFO BlockManagerInfo - Removed broadcast_452_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.2 MiB)
14:51:44.489 INFO BlockManagerInfo - Removed broadcast_457_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.3 MiB)
14:51:44.490 INFO BlockManagerInfo - Removed broadcast_455_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:44.491 INFO BlockManagerInfo - Removed broadcast_456_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.5 MiB)
14:51:44.491 INFO BlockManagerInfo - Removed broadcast_460_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:44.492 INFO BlockManagerInfo - Removed broadcast_453_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.7 MiB)
14:51:44.492 INFO BlockManagerInfo - Removed broadcast_449_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:44.504 INFO Executor - Finished task 0.0 in stage 230.0 (TID 286). 1191 bytes result sent to driver
14:51:44.504 INFO TaskSetManager - Finished task 0.0 in stage 230.0 (TID 286) in 58 ms on localhost (executor driver) (1/1)
14:51:44.504 INFO TaskSchedulerImpl - Removed TaskSet 230.0, whose tasks have all completed, from pool
14:51:44.504 INFO DAGScheduler - ShuffleMapStage 230 (mapToPair at SparkUtils.java:161) finished in 0.077 s
14:51:44.505 INFO DAGScheduler - looking for newly runnable stages
14:51:44.505 INFO DAGScheduler - running: HashSet()
14:51:44.505 INFO DAGScheduler - waiting: HashSet(ResultStage 231)
14:51:44.505 INFO DAGScheduler - failed: HashSet()
14:51:44.505 INFO DAGScheduler - Submitting ResultStage 231 (MapPartitionsRDD[1109] at mapToPair at BamSink.java:91), which has no missing parents
14:51:44.511 INFO MemoryStore - Block broadcast_464 stored as values in memory (estimated size 241.5 KiB, free 1918.4 MiB)
14:51:44.512 INFO MemoryStore - Block broadcast_464_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1918.4 MiB)
14:51:44.512 INFO BlockManagerInfo - Added broadcast_464_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.7 MiB)
14:51:44.512 INFO SparkContext - Created broadcast 464 from broadcast at DAGScheduler.scala:1580
14:51:44.513 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 231 (MapPartitionsRDD[1109] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:44.513 INFO TaskSchedulerImpl - Adding task set 231.0 with 1 tasks resource profile 0
14:51:44.513 INFO TaskSetManager - Starting task 0.0 in stage 231.0 (TID 287) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:44.514 INFO Executor - Running task 0.0 in stage 231.0 (TID 287)
14:51:44.520 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:44.520 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:44.532 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:44.532 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:44.532 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:44.532 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:44.532 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:44.532 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:44.533 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:44.534 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:44.537 INFO StateChange - BLOCK* allocate blk_1073741890_1066, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0/part-r-00000
14:51:44.538 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741890_1066 src: /127.0.0.1:39898 dest: /127.0.0.1:34059
14:51:44.540 INFO clienttrace - src: /127.0.0.1:39898, dest: /127.0.0.1:34059, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741890_1066, duration(ns): 1412016
14:51:44.540 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741890_1066, type=LAST_IN_PIPELINE terminating
14:51:44.541 INFO FSNamesystem - BLOCK* blk_1073741890_1066 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0/part-r-00000
14:51:44.942 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:44.942 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:44.943 INFO StateChange - BLOCK* allocate blk_1073741891_1067, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0/.part-r-00000.sbi
14:51:44.944 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741891_1067 src: /127.0.0.1:39912 dest: /127.0.0.1:34059
14:51:44.946 INFO clienttrace - src: /127.0.0.1:39912, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741891_1067, duration(ns): 449327
14:51:44.946 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741891_1067, type=LAST_IN_PIPELINE terminating
14:51:44.946 INFO FSNamesystem - BLOCK* blk_1073741891_1067 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0/.part-r-00000.sbi
14:51:45.347 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:45.348 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0 dst=null perm=null proto=rpc
14:51:45.349 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0 dst=null perm=null proto=rpc
14:51:45.349 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/task_202603041451441439897909275816844_1109_r_000000 dst=null perm=null proto=rpc
14:51:45.350 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/_temporary/attempt_202603041451441439897909275816844_1109_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/task_202603041451441439897909275816844_1109_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:45.350 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451441439897909275816844_1109_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/task_202603041451441439897909275816844_1109_r_000000
14:51:45.350 INFO SparkHadoopMapRedUtil - attempt_202603041451441439897909275816844_1109_r_000000_0: Committed. Elapsed time: 1 ms.
14:51:45.350 INFO Executor - Finished task 0.0 in stage 231.0 (TID 287). 1858 bytes result sent to driver
14:51:45.351 INFO TaskSetManager - Finished task 0.0 in stage 231.0 (TID 287) in 838 ms on localhost (executor driver) (1/1)
14:51:45.351 INFO TaskSchedulerImpl - Removed TaskSet 231.0, whose tasks have all completed, from pool
14:51:45.351 INFO DAGScheduler - ResultStage 231 (runJob at SparkHadoopWriter.scala:83) finished in 0.846 s
14:51:45.351 INFO DAGScheduler - Job 173 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:45.351 INFO TaskSchedulerImpl - Killing all running tasks in stage 231: Stage finished
14:51:45.351 INFO DAGScheduler - Job 173 finished: runJob at SparkHadoopWriter.scala:83, took 0.925337 s
14:51:45.351 INFO SparkHadoopWriter - Start to commit write Job job_202603041451441439897909275816844_1109.
14:51:45.352 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:45.352 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts dst=null perm=null proto=rpc
14:51:45.353 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/task_202603041451441439897909275816844_1109_r_000000 dst=null perm=null proto=rpc
14:51:45.353 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:45.354 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/task_202603041451441439897909275816844_1109_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:45.354 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:45.355 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary/0/task_202603041451441439897909275816844_1109_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:45.355 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:45.356 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:45.357 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:45.357 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/.spark-staging-1109 dst=null perm=null proto=rpc
14:51:45.358 INFO SparkHadoopWriter - Write Job job_202603041451441439897909275816844_1109 committed. Elapsed time: 6 ms.
14:51:45.358 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:45.360 INFO StateChange - BLOCK* allocate blk_1073741892_1068, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/header
14:51:45.361 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741892_1068 src: /127.0.0.1:53912 dest: /127.0.0.1:34059
14:51:45.362 INFO clienttrace - src: /127.0.0.1:53912, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741892_1068, duration(ns): 540951
14:51:45.362 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741892_1068, type=LAST_IN_PIPELINE terminating
14:51:45.363 INFO FSNamesystem - BLOCK* blk_1073741892_1068 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/header
14:51:45.764 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:45.765 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:45.766 INFO StateChange - BLOCK* allocate blk_1073741893_1069, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/terminator
14:51:45.767 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741893_1069 src: /127.0.0.1:53922 dest: /127.0.0.1:34059
14:51:45.768 INFO clienttrace - src: /127.0.0.1:53922, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741893_1069, duration(ns): 486207
14:51:45.768 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741893_1069, type=LAST_IN_PIPELINE terminating
14:51:45.769 INFO FSNamesystem - BLOCK* blk_1073741893_1069 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/terminator
14:51:46.032 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741886_1062 replica FinalizedReplica, blk_1073741886_1062, FINALIZED
getNumBytes() = 5472
getBytesOnDisk() = 5472
getVisibleLength()= 5472
getVolume() = /tmp/minicluster_storage16268522075870465194/data/data2
getBlockURI() = file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741886 for deletion
14:51:46.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741886_1062 URI file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741886
14:51:46.170 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:46.170 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts dst=null perm=null proto=rpc
14:51:46.171 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:46.172 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:46.172 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam
14:51:46.172 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:46.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.173 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.174 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:46.174 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam done
14:51:46.174 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.174 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi
14:51:46.175 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts dst=null perm=null proto=rpc
14:51:46.176 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:46.177 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:46.177 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:46.178 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:46.179 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:46.180 INFO StateChange - BLOCK* allocate blk_1073741894_1070, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi
14:51:46.181 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741894_1070 src: /127.0.0.1:53930 dest: /127.0.0.1:34059
14:51:46.182 INFO clienttrace - src: /127.0.0.1:53930, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741894_1070, duration(ns): 504630
14:51:46.182 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741894_1070, type=LAST_IN_PIPELINE terminating
14:51:46.183 INFO FSNamesystem - BLOCK* blk_1073741894_1070 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi
14:51:46.584 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:46.584 INFO IndexFileMerger - Done merging .sbi files
14:51:46.585 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.parts dst=null perm=null proto=rpc
14:51:46.594 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi dst=null perm=null proto=rpc
14:51:46.595 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi dst=null perm=null proto=rpc
14:51:46.595 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi dst=null perm=null proto=rpc
14:51:46.596 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:46.597 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.597 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.598 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.598 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.599 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.bai dst=null perm=null proto=rpc
14:51:46.599 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bai dst=null perm=null proto=rpc
14:51:46.600 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:46.602 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:46.602 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi dst=null perm=null proto=rpc
14:51:46.602 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi dst=null perm=null proto=rpc
14:51:46.603 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.sbi dst=null perm=null proto=rpc
14:51:46.604 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:46.604 INFO MemoryStore - Block broadcast_465 stored as values in memory (estimated size 320.0 B, free 1918.4 MiB)
14:51:46.605 INFO MemoryStore - Block broadcast_465_piece0 stored as bytes in memory (estimated size 233.0 B, free 1918.4 MiB)
14:51:46.605 INFO BlockManagerInfo - Added broadcast_465_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.7 MiB)
14:51:46.605 INFO SparkContext - Created broadcast 465 from broadcast at BamSource.java:104
14:51:46.607 INFO MemoryStore - Block broadcast_466 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:46.613 INFO MemoryStore - Block broadcast_466_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:46.613 INFO BlockManagerInfo - Added broadcast_466_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:46.614 INFO SparkContext - Created broadcast 466 from newAPIHadoopFile at PathSplitSource.java:96
14:51:46.623 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.623 INFO FileInputFormat - Total input files to process : 1
14:51:46.624 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.639 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:46.639 INFO DAGScheduler - Got job 174 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:46.639 INFO DAGScheduler - Final stage: ResultStage 232 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:46.639 INFO DAGScheduler - Parents of final stage: List()
14:51:46.639 INFO DAGScheduler - Missing parents: List()
14:51:46.640 INFO DAGScheduler - Submitting ResultStage 232 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:46.646 INFO MemoryStore - Block broadcast_467 stored as values in memory (estimated size 148.2 KiB, free 1917.9 MiB)
14:51:46.646 INFO MemoryStore - Block broadcast_467_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.8 MiB)
14:51:46.646 INFO BlockManagerInfo - Added broadcast_467_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.6 MiB)
14:51:46.647 INFO SparkContext - Created broadcast 467 from broadcast at DAGScheduler.scala:1580
14:51:46.647 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 232 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:46.647 INFO TaskSchedulerImpl - Adding task set 232.0 with 1 tasks resource profile 0
14:51:46.647 INFO TaskSetManager - Starting task 0.0 in stage 232.0 (TID 288) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:46.648 INFO Executor - Running task 0.0 in stage 232.0 (TID 288)
14:51:46.659 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam:0+237038
14:51:46.660 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.661 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.661 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.bai dst=null perm=null proto=rpc
14:51:46.662 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bai dst=null perm=null proto=rpc
14:51:46.663 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:46.665 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:46.665 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:46.667 INFO Executor - Finished task 0.0 in stage 232.0 (TID 288). 651483 bytes result sent to driver
14:51:46.669 INFO TaskSetManager - Finished task 0.0 in stage 232.0 (TID 288) in 22 ms on localhost (executor driver) (1/1)
14:51:46.669 INFO TaskSchedulerImpl - Removed TaskSet 232.0, whose tasks have all completed, from pool
14:51:46.669 INFO DAGScheduler - ResultStage 232 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.029 s
14:51:46.669 INFO DAGScheduler - Job 174 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:46.669 INFO TaskSchedulerImpl - Killing all running tasks in stage 232: Stage finished
14:51:46.669 INFO DAGScheduler - Job 174 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.029934 s
14:51:46.681 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:46.681 INFO DAGScheduler - Got job 175 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:46.681 INFO DAGScheduler - Final stage: ResultStage 233 (count at ReadsSparkSinkUnitTest.java:185)
14:51:46.681 INFO DAGScheduler - Parents of final stage: List()
14:51:46.681 INFO DAGScheduler - Missing parents: List()
14:51:46.681 INFO DAGScheduler - Submitting ResultStage 233 (MapPartitionsRDD[1097] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:46.698 INFO MemoryStore - Block broadcast_468 stored as values in memory (estimated size 426.1 KiB, free 1917.4 MiB)
14:51:46.699 INFO MemoryStore - Block broadcast_468_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.2 MiB)
14:51:46.700 INFO BlockManagerInfo - Added broadcast_468_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:46.700 INFO SparkContext - Created broadcast 468 from broadcast at DAGScheduler.scala:1580
14:51:46.700 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 233 (MapPartitionsRDD[1097] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:46.700 INFO TaskSchedulerImpl - Adding task set 233.0 with 1 tasks resource profile 0
14:51:46.701 INFO TaskSetManager - Starting task 0.0 in stage 233.0 (TID 289) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:46.701 INFO Executor - Running task 0.0 in stage 233.0 (TID 289)
14:51:46.731 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:46.741 INFO Executor - Finished task 0.0 in stage 233.0 (TID 289). 989 bytes result sent to driver
14:51:46.741 INFO TaskSetManager - Finished task 0.0 in stage 233.0 (TID 289) in 40 ms on localhost (executor driver) (1/1)
14:51:46.741 INFO TaskSchedulerImpl - Removed TaskSet 233.0, whose tasks have all completed, from pool
14:51:46.741 INFO DAGScheduler - ResultStage 233 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.060 s
14:51:46.741 INFO DAGScheduler - Job 175 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:46.741 INFO TaskSchedulerImpl - Killing all running tasks in stage 233: Stage finished
14:51:46.742 INFO DAGScheduler - Job 175 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.060816 s
14:51:46.745 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:46.745 INFO DAGScheduler - Got job 176 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:46.745 INFO DAGScheduler - Final stage: ResultStage 234 (count at ReadsSparkSinkUnitTest.java:185)
14:51:46.745 INFO DAGScheduler - Parents of final stage: List()
14:51:46.745 INFO DAGScheduler - Missing parents: List()
14:51:46.745 INFO DAGScheduler - Submitting ResultStage 234 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:46.751 INFO MemoryStore - Block broadcast_469 stored as values in memory (estimated size 148.1 KiB, free 1917.1 MiB)
14:51:46.752 INFO MemoryStore - Block broadcast_469_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.1 MiB)
14:51:46.752 INFO BlockManagerInfo - Added broadcast_469_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.4 MiB)
14:51:46.753 INFO SparkContext - Created broadcast 469 from broadcast at DAGScheduler.scala:1580
14:51:46.753 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 234 (MapPartitionsRDD[1115] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:46.753 INFO TaskSchedulerImpl - Adding task set 234.0 with 1 tasks resource profile 0
14:51:46.753 INFO TaskSetManager - Starting task 0.0 in stage 234.0 (TID 290) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:46.753 INFO Executor - Running task 0.0 in stage 234.0 (TID 290)
14:51:46.765 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam:0+237038
14:51:46.766 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.766 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam dst=null perm=null proto=rpc
14:51:46.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bam.bai dst=null perm=null proto=rpc
14:51:46.767 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_1d6afac5-d81f-444d-a07b-2b9ec3def158.bai dst=null perm=null proto=rpc
14:51:46.768 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:46.770 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:46.771 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:46.772 INFO Executor - Finished task 0.0 in stage 234.0 (TID 290). 989 bytes result sent to driver
14:51:46.772 INFO TaskSetManager - Finished task 0.0 in stage 234.0 (TID 290) in 19 ms on localhost (executor driver) (1/1)
14:51:46.773 INFO TaskSchedulerImpl - Removed TaskSet 234.0, whose tasks have all completed, from pool
14:51:46.773 INFO DAGScheduler - ResultStage 234 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.027 s
14:51:46.773 INFO DAGScheduler - Job 176 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:46.773 INFO TaskSchedulerImpl - Killing all running tasks in stage 234: Stage finished
14:51:46.773 INFO DAGScheduler - Job 176 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.027806 s
14:51:46.783 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:46.784 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:46.785 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:46.785 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:46.788 INFO MemoryStore - Block broadcast_470 stored as values in memory (estimated size 297.9 KiB, free 1916.8 MiB)
14:51:46.796 INFO MemoryStore - Block broadcast_470_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.7 MiB)
14:51:46.796 INFO BlockManagerInfo - Added broadcast_470_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:46.796 INFO SparkContext - Created broadcast 470 from newAPIHadoopFile at PathSplitSource.java:96
14:51:46.817 INFO MemoryStore - Block broadcast_471 stored as values in memory (estimated size 297.9 KiB, free 1916.4 MiB)
14:51:46.823 INFO MemoryStore - Block broadcast_471_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.4 MiB)
14:51:46.823 INFO BlockManagerInfo - Added broadcast_471_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:46.824 INFO SparkContext - Created broadcast 471 from newAPIHadoopFile at PathSplitSource.java:96
14:51:46.843 INFO FileInputFormat - Total input files to process : 1
14:51:46.845 INFO MemoryStore - Block broadcast_472 stored as values in memory (estimated size 160.7 KiB, free 1916.2 MiB)
14:51:46.846 INFO MemoryStore - Block broadcast_472_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.2 MiB)
14:51:46.846 INFO BlockManagerInfo - Added broadcast_472_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:46.846 INFO SparkContext - Created broadcast 472 from broadcast at ReadsSparkSink.java:133
14:51:46.846 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:46.846 WARN HtsjdkReadsRddStorage - Unrecognized write option: DISABLE
14:51:46.847 INFO MemoryStore - Block broadcast_473 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
14:51:46.848 INFO MemoryStore - Block broadcast_473_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
14:51:46.848 INFO BlockManagerInfo - Added broadcast_473_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:46.848 INFO SparkContext - Created broadcast 473 from broadcast at BamSink.java:76
14:51:46.850 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts dst=null perm=null proto=rpc
14:51:46.850 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:46.850 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:46.850 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:46.851 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:46.857 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:46.858 INFO DAGScheduler - Registering RDD 1129 (mapToPair at SparkUtils.java:161) as input to shuffle 47
14:51:46.858 INFO DAGScheduler - Got job 177 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:46.858 INFO DAGScheduler - Final stage: ResultStage 236 (runJob at SparkHadoopWriter.scala:83)
14:51:46.858 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 235)
14:51:46.858 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 235)
14:51:46.858 INFO DAGScheduler - Submitting ShuffleMapStage 235 (MapPartitionsRDD[1129] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:46.875 INFO MemoryStore - Block broadcast_474 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
14:51:46.877 INFO MemoryStore - Block broadcast_474_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.4 MiB)
14:51:46.877 INFO BlockManagerInfo - Added broadcast_474_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.1 MiB)
14:51:46.877 INFO SparkContext - Created broadcast 474 from broadcast at DAGScheduler.scala:1580
14:51:46.877 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 235 (MapPartitionsRDD[1129] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:46.878 INFO TaskSchedulerImpl - Adding task set 235.0 with 1 tasks resource profile 0
14:51:46.878 INFO TaskSetManager - Starting task 0.0 in stage 235.0 (TID 291) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:46.878 INFO Executor - Running task 0.0 in stage 235.0 (TID 291)
14:51:46.911 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:46.926 INFO Executor - Finished task 0.0 in stage 235.0 (TID 291). 1148 bytes result sent to driver
14:51:46.926 INFO TaskSetManager - Finished task 0.0 in stage 235.0 (TID 291) in 48 ms on localhost (executor driver) (1/1)
14:51:46.926 INFO TaskSchedulerImpl - Removed TaskSet 235.0, whose tasks have all completed, from pool
14:51:46.926 INFO DAGScheduler - ShuffleMapStage 235 (mapToPair at SparkUtils.java:161) finished in 0.067 s
14:51:46.926 INFO DAGScheduler - looking for newly runnable stages
14:51:46.926 INFO DAGScheduler - running: HashSet()
14:51:46.927 INFO DAGScheduler - waiting: HashSet(ResultStage 236)
14:51:46.927 INFO DAGScheduler - failed: HashSet()
14:51:46.927 INFO DAGScheduler - Submitting ResultStage 236 (MapPartitionsRDD[1134] at mapToPair at BamSink.java:91), which has no missing parents
14:51:46.935 INFO MemoryStore - Block broadcast_475 stored as values in memory (estimated size 241.5 KiB, free 1915.1 MiB)
14:51:46.935 INFO MemoryStore - Block broadcast_475_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.1 MiB)
14:51:46.936 INFO BlockManagerInfo - Added broadcast_475_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.1 MiB)
14:51:46.936 INFO SparkContext - Created broadcast 475 from broadcast at DAGScheduler.scala:1580
14:51:46.936 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 236 (MapPartitionsRDD[1134] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:46.936 INFO TaskSchedulerImpl - Adding task set 236.0 with 1 tasks resource profile 0
14:51:46.937 INFO TaskSetManager - Starting task 0.0 in stage 236.0 (TID 292) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:46.937 INFO Executor - Running task 0.0 in stage 236.0 (TID 292)
14:51:46.942 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:46.942 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:46.954 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:46.954 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:46.954 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:46.954 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:46.954 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:46.954 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:46.955 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/_temporary/attempt_202603041451465507610149061195936_1134_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:46.958 INFO StateChange - BLOCK* allocate blk_1073741895_1071, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/_temporary/attempt_202603041451465507610149061195936_1134_r_000000_0/part-r-00000
14:51:46.960 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741895_1071 src: /127.0.0.1:53932 dest: /127.0.0.1:34059
14:51:46.962 INFO clienttrace - src: /127.0.0.1:53932, dest: /127.0.0.1:34059, bytes: 231298, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741895_1071, duration(ns): 1114812
14:51:46.962 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741895_1071, type=LAST_IN_PIPELINE terminating
14:51:46.963 INFO FSNamesystem - BLOCK* blk_1073741895_1071 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/_temporary/attempt_202603041451465507610149061195936_1134_r_000000_0/part-r-00000
14:51:47.364 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/_temporary/attempt_202603041451465507610149061195936_1134_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:47.364 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/_temporary/attempt_202603041451465507610149061195936_1134_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:47.365 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/_temporary/attempt_202603041451465507610149061195936_1134_r_000000_0 dst=null perm=null proto=rpc
14:51:47.366 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/_temporary/attempt_202603041451465507610149061195936_1134_r_000000_0 dst=null perm=null proto=rpc
14:51:47.366 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/task_202603041451465507610149061195936_1134_r_000000 dst=null perm=null proto=rpc
14:51:47.367 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/_temporary/attempt_202603041451465507610149061195936_1134_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/task_202603041451465507610149061195936_1134_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:47.367 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451465507610149061195936_1134_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/task_202603041451465507610149061195936_1134_r_000000
14:51:47.367 INFO SparkHadoopMapRedUtil - attempt_202603041451465507610149061195936_1134_r_000000_0: Committed. Elapsed time: 1 ms.
14:51:47.371 INFO Executor - Finished task 0.0 in stage 236.0 (TID 292). 1944 bytes result sent to driver
14:51:47.371 INFO BlockManagerInfo - Removed broadcast_474_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.2 MiB)
14:51:47.372 INFO TaskSetManager - Finished task 0.0 in stage 236.0 (TID 292) in 434 ms on localhost (executor driver) (1/1)
14:51:47.372 INFO TaskSchedulerImpl - Removed TaskSet 236.0, whose tasks have all completed, from pool
14:51:47.372 INFO BlockManagerInfo - Removed broadcast_463_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.4 MiB)
14:51:47.372 INFO DAGScheduler - ResultStage 236 (runJob at SparkHadoopWriter.scala:83) finished in 0.445 s
14:51:47.372 INFO DAGScheduler - Job 177 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:47.372 INFO TaskSchedulerImpl - Killing all running tasks in stage 236: Stage finished
14:51:47.372 INFO DAGScheduler - Job 177 finished: runJob at SparkHadoopWriter.scala:83, took 0.514536 s
14:51:47.372 INFO BlockManagerInfo - Removed broadcast_459_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:47.372 INFO SparkHadoopWriter - Start to commit write Job job_202603041451465507610149061195936_1134.
14:51:47.373 INFO BlockManagerInfo - Removed broadcast_471_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:47.373 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:47.374 INFO BlockManagerInfo - Removed broadcast_464_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.5 MiB)
14:51:47.374 INFO BlockManagerInfo - Removed broadcast_466_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:47.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts dst=null perm=null proto=rpc
14:51:47.375 INFO BlockManagerInfo - Removed broadcast_465_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1919.6 MiB)
14:51:47.375 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/task_202603041451465507610149061195936_1134_r_000000 dst=null perm=null proto=rpc
14:51:47.376 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:47.376 INFO BlockManagerInfo - Removed broadcast_468_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:47.377 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary/0/task_202603041451465507610149061195936_1134_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:47.377 INFO BlockManagerInfo - Removed broadcast_462_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:47.377 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:47.377 INFO BlockManagerInfo - Removed broadcast_461_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:47.378 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:47.379 INFO BlockManagerInfo - Removed broadcast_469_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.8 MiB)
14:51:47.379 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:47.379 INFO BlockManagerInfo - Removed broadcast_467_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.9 MiB)
14:51:47.380 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/.spark-staging-1134 dst=null perm=null proto=rpc
14:51:47.380 INFO SparkHadoopWriter - Write Job job_202603041451465507610149061195936_1134 committed. Elapsed time: 7 ms.
14:51:47.381 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:47.382 INFO StateChange - BLOCK* allocate blk_1073741896_1072, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/header
14:51:47.383 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741896_1072 src: /127.0.0.1:53940 dest: /127.0.0.1:34059
14:51:47.385 INFO clienttrace - src: /127.0.0.1:53940, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741896_1072, duration(ns): 478846
14:51:47.385 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741896_1072, type=LAST_IN_PIPELINE terminating
14:51:47.385 INFO FSNamesystem - BLOCK* blk_1073741896_1072 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/header
14:51:47.786 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:47.787 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:47.788 INFO StateChange - BLOCK* allocate blk_1073741897_1073, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/terminator
14:51:47.789 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741897_1073 src: /127.0.0.1:53954 dest: /127.0.0.1:34059
14:51:47.791 INFO clienttrace - src: /127.0.0.1:53954, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741897_1073, duration(ns): 520648
14:51:47.791 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741897_1073, type=LAST_IN_PIPELINE terminating
14:51:47.791 INFO FSNamesystem - BLOCK* blk_1073741897_1073 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/terminator
14:51:48.192 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:48.193 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts dst=null perm=null proto=rpc
14:51:48.194 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:48.194 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:48.195 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam
14:51:48.195 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:48.195 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.196 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:48.197 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam done
14:51:48.197 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.198 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.parts dst=null perm=null proto=rpc
14:51:48.198 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.199 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.199 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.200 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.200 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.bai dst=null perm=null proto=rpc
14:51:48.201 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bai dst=null perm=null proto=rpc
14:51:48.202 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:48.204 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.204 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.sbi dst=null perm=null proto=rpc
14:51:48.206 INFO MemoryStore - Block broadcast_476 stored as values in memory (estimated size 297.9 KiB, free 1918.7 MiB)
14:51:48.214 INFO MemoryStore - Block broadcast_476_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
14:51:48.214 INFO BlockManagerInfo - Added broadcast_476_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.8 MiB)
14:51:48.215 INFO SparkContext - Created broadcast 476 from newAPIHadoopFile at PathSplitSource.java:96
14:51:48.235 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.236 INFO FileInputFormat - Total input files to process : 1
14:51:48.236 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.272 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:48.273 INFO DAGScheduler - Got job 178 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:48.273 INFO DAGScheduler - Final stage: ResultStage 237 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:48.273 INFO DAGScheduler - Parents of final stage: List()
14:51:48.273 INFO DAGScheduler - Missing parents: List()
14:51:48.273 INFO DAGScheduler - Submitting ResultStage 237 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:48.290 INFO MemoryStore - Block broadcast_477 stored as values in memory (estimated size 426.2 KiB, free 1918.3 MiB)
14:51:48.291 INFO MemoryStore - Block broadcast_477_piece0 stored as bytes in memory (estimated size 153.7 KiB, free 1918.1 MiB)
14:51:48.291 INFO BlockManagerInfo - Added broadcast_477_piece0 in memory on localhost:44923 (size: 153.7 KiB, free: 1919.7 MiB)
14:51:48.291 INFO SparkContext - Created broadcast 477 from broadcast at DAGScheduler.scala:1580
14:51:48.292 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 237 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:48.292 INFO TaskSchedulerImpl - Adding task set 237.0 with 1 tasks resource profile 0
14:51:48.292 INFO TaskSetManager - Starting task 0.0 in stage 237.0 (TID 293) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:48.293 INFO Executor - Running task 0.0 in stage 237.0 (TID 293)
14:51:48.324 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam:0+237038
14:51:48.325 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.326 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.327 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:48.327 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.328 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.328 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.bai dst=null perm=null proto=rpc
14:51:48.328 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bai dst=null perm=null proto=rpc
14:51:48.330 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:48.331 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.332 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.333 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:48.337 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.338 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.338 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.339 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.341 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.342 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.343 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.344 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.345 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.346 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.346 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.347 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.348 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.348 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.349 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.350 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.351 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.351 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.352 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.353 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.354 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.355 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.355 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.356 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.357 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.358 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.359 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.359 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.360 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.361 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.362 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.363 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.363 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.364 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.365 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.365 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.366 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.366 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.367 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.368 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.368 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.369 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.370 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.371 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.372 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.373 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.375 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.376 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.377 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.378 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.378 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.379 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.380 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.380 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.381 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.381 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.382 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.383 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.383 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.384 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.385 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.385 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.385 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.385 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.386 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.bai dst=null perm=null proto=rpc
14:51:48.387 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bai dst=null perm=null proto=rpc
14:51:48.388 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:48.391 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:48.394 INFO Executor - Finished task 0.0 in stage 237.0 (TID 293). 651526 bytes result sent to driver
14:51:48.396 INFO TaskSetManager - Finished task 0.0 in stage 237.0 (TID 293) in 104 ms on localhost (executor driver) (1/1)
14:51:48.396 INFO TaskSchedulerImpl - Removed TaskSet 237.0, whose tasks have all completed, from pool
14:51:48.397 INFO DAGScheduler - ResultStage 237 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.123 s
14:51:48.397 INFO DAGScheduler - Job 178 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:48.397 INFO TaskSchedulerImpl - Killing all running tasks in stage 237: Stage finished
14:51:48.397 INFO DAGScheduler - Job 178 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.124453 s
14:51:48.406 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:48.407 INFO DAGScheduler - Got job 179 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:48.407 INFO DAGScheduler - Final stage: ResultStage 238 (count at ReadsSparkSinkUnitTest.java:185)
14:51:48.407 INFO DAGScheduler - Parents of final stage: List()
14:51:48.407 INFO DAGScheduler - Missing parents: List()
14:51:48.407 INFO DAGScheduler - Submitting ResultStage 238 (MapPartitionsRDD[1122] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:48.423 INFO MemoryStore - Block broadcast_478 stored as values in memory (estimated size 426.1 KiB, free 1917.7 MiB)
14:51:48.425 INFO MemoryStore - Block broadcast_478_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.6 MiB)
14:51:48.425 INFO BlockManagerInfo - Added broadcast_478_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.5 MiB)
14:51:48.425 INFO SparkContext - Created broadcast 478 from broadcast at DAGScheduler.scala:1580
14:51:48.425 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 238 (MapPartitionsRDD[1122] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:48.425 INFO TaskSchedulerImpl - Adding task set 238.0 with 1 tasks resource profile 0
14:51:48.426 INFO TaskSetManager - Starting task 0.0 in stage 238.0 (TID 294) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:48.426 INFO Executor - Running task 0.0 in stage 238.0 (TID 294)
14:51:48.459 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:48.469 INFO Executor - Finished task 0.0 in stage 238.0 (TID 294). 989 bytes result sent to driver
14:51:48.470 INFO TaskSetManager - Finished task 0.0 in stage 238.0 (TID 294) in 44 ms on localhost (executor driver) (1/1)
14:51:48.470 INFO TaskSchedulerImpl - Removed TaskSet 238.0, whose tasks have all completed, from pool
14:51:48.470 INFO DAGScheduler - ResultStage 238 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.063 s
14:51:48.470 INFO DAGScheduler - Job 179 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:48.470 INFO TaskSchedulerImpl - Killing all running tasks in stage 238: Stage finished
14:51:48.470 INFO DAGScheduler - Job 179 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.063833 s
14:51:48.473 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:48.474 INFO DAGScheduler - Got job 180 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:48.474 INFO DAGScheduler - Final stage: ResultStage 239 (count at ReadsSparkSinkUnitTest.java:185)
14:51:48.474 INFO DAGScheduler - Parents of final stage: List()
14:51:48.474 INFO DAGScheduler - Missing parents: List()
14:51:48.474 INFO DAGScheduler - Submitting ResultStage 239 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:48.491 INFO MemoryStore - Block broadcast_479 stored as values in memory (estimated size 426.1 KiB, free 1917.1 MiB)
14:51:48.492 INFO MemoryStore - Block broadcast_479_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.0 MiB)
14:51:48.492 INFO BlockManagerInfo - Added broadcast_479_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.4 MiB)
14:51:48.493 INFO SparkContext - Created broadcast 479 from broadcast at DAGScheduler.scala:1580
14:51:48.493 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 239 (MapPartitionsRDD[1141] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:48.493 INFO TaskSchedulerImpl - Adding task set 239.0 with 1 tasks resource profile 0
14:51:48.493 INFO TaskSetManager - Starting task 0.0 in stage 239.0 (TID 295) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:48.494 INFO Executor - Running task 0.0 in stage 239.0 (TID 295)
14:51:48.525 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam:0+237038
14:51:48.525 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.526 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.527 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:48.527 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.528 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.529 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.bai dst=null perm=null proto=rpc
14:51:48.529 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bai dst=null perm=null proto=rpc
14:51:48.531 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:48.532 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.532 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.533 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.534 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:48.539 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.540 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.541 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.542 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.543 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.543 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.544 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.544 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.545 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.546 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.547 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.547 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.548 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.549 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.549 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.550 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.551 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.552 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.553 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.553 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.554 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.555 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.556 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.557 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.557 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.558 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.559 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.560 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.560 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.561 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.562 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.562 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.563 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.564 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.564 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.565 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.566 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.567 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.568 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.569 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.570 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.572 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.573 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.573 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.574 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.575 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.576 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.577 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.578 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.578 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.579 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.579 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.580 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.582 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.583 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.584 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.585 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.586 WARN DFSUtil - Unexpected value for data transfer bytes=1632 duration=0
14:51:48.587 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.587 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.587 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.588 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam dst=null perm=null proto=rpc
14:51:48.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bam.bai dst=null perm=null proto=rpc
14:51:48.589 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest1_a5e78663-0f7c-46ba-87fa-41197b91b030.bai dst=null perm=null proto=rpc
14:51:48.591 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:48.593 WARN DFSUtil - Unexpected value for data transfer bytes=233106 duration=0
14:51:48.593 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:48.594 INFO Executor - Finished task 0.0 in stage 239.0 (TID 295). 989 bytes result sent to driver
14:51:48.595 INFO TaskSetManager - Finished task 0.0 in stage 239.0 (TID 295) in 102 ms on localhost (executor driver) (1/1)
14:51:48.595 INFO TaskSchedulerImpl - Removed TaskSet 239.0, whose tasks have all completed, from pool
14:51:48.595 INFO DAGScheduler - ResultStage 239 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.121 s
14:51:48.595 INFO DAGScheduler - Job 180 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:48.595 INFO TaskSchedulerImpl - Killing all running tasks in stage 239: Stage finished
14:51:48.595 INFO DAGScheduler - Job 180 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.121748 s
14:51:48.606 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:48.607 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:48.608 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:48.608 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:48.612 INFO MemoryStore - Block broadcast_480 stored as values in memory (estimated size 298.0 KiB, free 1916.7 MiB)
14:51:48.618 INFO MemoryStore - Block broadcast_480_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.6 MiB)
14:51:48.618 INFO BlockManagerInfo - Added broadcast_480_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.3 MiB)
14:51:48.618 INFO SparkContext - Created broadcast 480 from newAPIHadoopFile at PathSplitSource.java:96
14:51:48.640 INFO MemoryStore - Block broadcast_481 stored as values in memory (estimated size 298.0 KiB, free 1916.4 MiB)
14:51:48.646 INFO MemoryStore - Block broadcast_481_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1916.3 MiB)
14:51:48.646 INFO BlockManagerInfo - Added broadcast_481_piece0 in memory on localhost:44923 (size: 50.3 KiB, free: 1919.3 MiB)
14:51:48.646 INFO SparkContext - Created broadcast 481 from newAPIHadoopFile at PathSplitSource.java:96
14:51:48.666 INFO FileInputFormat - Total input files to process : 1
14:51:48.669 INFO MemoryStore - Block broadcast_482 stored as values in memory (estimated size 160.7 KiB, free 1916.1 MiB)
14:51:48.670 INFO MemoryStore - Block broadcast_482_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.1 MiB)
14:51:48.670 INFO BlockManagerInfo - Added broadcast_482_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:48.670 INFO SparkContext - Created broadcast 482 from broadcast at ReadsSparkSink.java:133
14:51:48.672 INFO MemoryStore - Block broadcast_483 stored as values in memory (estimated size 163.2 KiB, free 1916.0 MiB)
14:51:48.673 INFO MemoryStore - Block broadcast_483_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
14:51:48.673 INFO BlockManagerInfo - Added broadcast_483_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.3 MiB)
14:51:48.673 INFO SparkContext - Created broadcast 483 from broadcast at BamSink.java:76
14:51:48.676 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts dst=null perm=null proto=rpc
14:51:48.676 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:48.676 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:48.676 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:48.677 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:48.685 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:48.686 INFO DAGScheduler - Registering RDD 1155 (mapToPair at SparkUtils.java:161) as input to shuffle 48
14:51:48.686 INFO DAGScheduler - Got job 181 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:48.686 INFO DAGScheduler - Final stage: ResultStage 241 (runJob at SparkHadoopWriter.scala:83)
14:51:48.686 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 240)
14:51:48.686 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 240)
14:51:48.686 INFO DAGScheduler - Submitting ShuffleMapStage 240 (MapPartitionsRDD[1155] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:48.703 INFO MemoryStore - Block broadcast_484 stored as values in memory (estimated size 520.4 KiB, free 1915.5 MiB)
14:51:48.705 INFO MemoryStore - Block broadcast_484_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1915.3 MiB)
14:51:48.705 INFO BlockManagerInfo - Added broadcast_484_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.1 MiB)
14:51:48.705 INFO SparkContext - Created broadcast 484 from broadcast at DAGScheduler.scala:1580
14:51:48.705 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 240 (MapPartitionsRDD[1155] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:48.705 INFO TaskSchedulerImpl - Adding task set 240.0 with 1 tasks resource profile 0
14:51:48.706 INFO TaskSetManager - Starting task 0.0 in stage 240.0 (TID 296) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7901 bytes)
14:51:48.706 INFO Executor - Running task 0.0 in stage 240.0 (TID 296)
14:51:48.739 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
14:51:48.755 INFO Executor - Finished task 0.0 in stage 240.0 (TID 296). 1148 bytes result sent to driver
14:51:48.756 INFO TaskSetManager - Finished task 0.0 in stage 240.0 (TID 296) in 50 ms on localhost (executor driver) (1/1)
14:51:48.756 INFO TaskSchedulerImpl - Removed TaskSet 240.0, whose tasks have all completed, from pool
14:51:48.756 INFO DAGScheduler - ShuffleMapStage 240 (mapToPair at SparkUtils.java:161) finished in 0.070 s
14:51:48.756 INFO DAGScheduler - looking for newly runnable stages
14:51:48.756 INFO DAGScheduler - running: HashSet()
14:51:48.756 INFO DAGScheduler - waiting: HashSet(ResultStage 241)
14:51:48.756 INFO DAGScheduler - failed: HashSet()
14:51:48.756 INFO DAGScheduler - Submitting ResultStage 241 (MapPartitionsRDD[1160] at mapToPair at BamSink.java:91), which has no missing parents
14:51:48.763 INFO MemoryStore - Block broadcast_485 stored as values in memory (estimated size 241.5 KiB, free 1915.1 MiB)
14:51:48.764 INFO MemoryStore - Block broadcast_485_piece0 stored as bytes in memory (estimated size 67.1 KiB, free 1915.0 MiB)
14:51:48.764 INFO BlockManagerInfo - Added broadcast_485_piece0 in memory on localhost:44923 (size: 67.1 KiB, free: 1919.0 MiB)
14:51:48.764 INFO SparkContext - Created broadcast 485 from broadcast at DAGScheduler.scala:1580
14:51:48.765 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 241 (MapPartitionsRDD[1160] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:48.765 INFO TaskSchedulerImpl - Adding task set 241.0 with 1 tasks resource profile 0
14:51:48.765 INFO TaskSetManager - Starting task 0.0 in stage 241.0 (TID 297) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:48.765 INFO Executor - Running task 0.0 in stage 241.0 (TID 297)
14:51:48.770 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:48.770 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:48.781 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:48.781 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:48.781 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:48.781 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:48.781 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:48.781 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:48.782 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:48.784 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:48.785 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:48.787 INFO StateChange - BLOCK* allocate blk_1073741898_1074, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/part-r-00000
14:51:48.788 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741898_1074 src: /127.0.0.1:54666 dest: /127.0.0.1:34059
14:51:48.791 INFO clienttrace - src: /127.0.0.1:54666, dest: /127.0.0.1:34059, bytes: 229774, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741898_1074, duration(ns): 1436695
14:51:48.791 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741898_1074, type=LAST_IN_PIPELINE terminating
14:51:48.791 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:48.792 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:48.793 INFO StateChange - BLOCK* allocate blk_1073741899_1075, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/.part-r-00000.sbi
14:51:48.793 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741899_1075 src: /127.0.0.1:54676 dest: /127.0.0.1:34059
14:51:48.795 INFO clienttrace - src: /127.0.0.1:54676, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741899_1075, duration(ns): 368081
14:51:48.795 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741899_1075, type=LAST_IN_PIPELINE terminating
14:51:48.795 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:48.801 INFO BlockManagerInfo - Removed broadcast_476_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.1 MiB)
14:51:48.802 INFO StateChange - BLOCK* allocate blk_1073741900_1076, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/.part-r-00000.bai
14:51:48.802 INFO BlockManagerInfo - Removed broadcast_477_piece0 on localhost:44923 in memory (size: 153.7 KiB, free: 1919.2 MiB)
14:51:48.803 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741900_1076 src: /127.0.0.1:54692 dest: /127.0.0.1:34059
14:51:48.803 INFO BlockManagerInfo - Removed broadcast_479_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.4 MiB)
14:51:48.804 INFO BlockManagerInfo - Removed broadcast_473_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.4 MiB)
14:51:48.804 INFO clienttrace - src: /127.0.0.1:54692, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741900_1076, duration(ns): 940526
14:51:48.804 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741900_1076, type=LAST_IN_PIPELINE terminating
14:51:48.805 INFO BlockManagerInfo - Removed broadcast_475_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.4 MiB)
14:51:48.806 INFO FSNamesystem - BLOCK* blk_1073741900_1076 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/.part-r-00000.bai
14:51:48.806 INFO BlockManagerInfo - Removed broadcast_472_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:48.806 INFO BlockManagerInfo - Removed broadcast_481_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.5 MiB)
14:51:48.807 INFO BlockManagerInfo - Removed broadcast_470_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:48.807 INFO BlockManagerInfo - Removed broadcast_478_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.7 MiB)
14:51:48.807 INFO BlockManagerInfo - Removed broadcast_484_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.9 MiB)
14:51:49.032 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741891_1067 replica FinalizedReplica, blk_1073741891_1067, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741891 for deletion
14:51:49.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741891_1067 URI file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741891
14:51:49.207 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:49.207 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0 dst=null perm=null proto=rpc
14:51:49.208 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0 dst=null perm=null proto=rpc
14:51:49.208 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/task_202603041451482289489742977075876_1160_r_000000 dst=null perm=null proto=rpc
14:51:49.209 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/_temporary/attempt_202603041451482289489742977075876_1160_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/task_202603041451482289489742977075876_1160_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:49.209 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451482289489742977075876_1160_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/task_202603041451482289489742977075876_1160_r_000000
14:51:49.209 INFO SparkHadoopMapRedUtil - attempt_202603041451482289489742977075876_1160_r_000000_0: Committed. Elapsed time: 1 ms.
14:51:49.210 INFO Executor - Finished task 0.0 in stage 241.0 (TID 297). 1901 bytes result sent to driver
14:51:49.210 INFO TaskSetManager - Finished task 0.0 in stage 241.0 (TID 297) in 445 ms on localhost (executor driver) (1/1)
14:51:49.210 INFO TaskSchedulerImpl - Removed TaskSet 241.0, whose tasks have all completed, from pool
14:51:49.210 INFO DAGScheduler - ResultStage 241 (runJob at SparkHadoopWriter.scala:83) finished in 0.453 s
14:51:49.210 INFO DAGScheduler - Job 181 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:49.210 INFO TaskSchedulerImpl - Killing all running tasks in stage 241: Stage finished
14:51:49.210 INFO DAGScheduler - Job 181 finished: runJob at SparkHadoopWriter.scala:83, took 0.525144 s
14:51:49.211 INFO SparkHadoopWriter - Start to commit write Job job_202603041451482289489742977075876_1160.
14:51:49.211 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:49.212 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts dst=null perm=null proto=rpc
14:51:49.212 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/task_202603041451482289489742977075876_1160_r_000000 dst=null perm=null proto=rpc
14:51:49.212 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:49.213 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/task_202603041451482289489742977075876_1160_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:49.213 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:49.214 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/task_202603041451482289489742977075876_1160_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:49.214 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:49.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary/0/task_202603041451482289489742977075876_1160_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:49.215 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:49.216 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:49.216 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:49.217 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.spark-staging-1160 dst=null perm=null proto=rpc
14:51:49.217 INFO SparkHadoopWriter - Write Job job_202603041451482289489742977075876_1160 committed. Elapsed time: 6 ms.
14:51:49.218 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:49.219 INFO StateChange - BLOCK* allocate blk_1073741901_1077, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/header
14:51:49.220 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741901_1077 src: /127.0.0.1:54702 dest: /127.0.0.1:34059
14:51:49.221 INFO clienttrace - src: /127.0.0.1:54702, dest: /127.0.0.1:34059, bytes: 5712, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741901_1077, duration(ns): 513456
14:51:49.221 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741901_1077, type=LAST_IN_PIPELINE terminating
14:51:49.222 INFO FSNamesystem - BLOCK* blk_1073741901_1077 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/header
14:51:49.622 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:49.623 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:49.624 INFO StateChange - BLOCK* allocate blk_1073741902_1078, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/terminator
14:51:49.625 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741902_1078 src: /127.0.0.1:54714 dest: /127.0.0.1:34059
14:51:49.626 INFO clienttrace - src: /127.0.0.1:54714, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741902_1078, duration(ns): 422488
14:51:49.626 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741902_1078, type=LAST_IN_PIPELINE terminating
14:51:49.627 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:49.627 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts dst=null perm=null proto=rpc
14:51:49.628 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:49.629 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:49.629 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam
14:51:49.630 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:49.630 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:49.631 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:49.631 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:49.631 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam done
14:51:49.632 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:49.632 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi
14:51:49.632 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts dst=null perm=null proto=rpc
14:51:49.633 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:49.634 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:49.634 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:49.635 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:49.636 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:49.636 INFO StateChange - BLOCK* allocate blk_1073741903_1079, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi
14:51:49.637 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741903_1079 src: /127.0.0.1:54728 dest: /127.0.0.1:34059
14:51:49.639 INFO clienttrace - src: /127.0.0.1:54728, dest: /127.0.0.1:34059, bytes: 212, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741903_1079, duration(ns): 482793
14:51:49.639 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741903_1079, type=LAST_IN_PIPELINE terminating
14:51:49.639 INFO FSNamesystem - BLOCK* blk_1073741903_1079 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi
14:51:50.040 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:50.041 INFO IndexFileMerger - Done merging .sbi files
14:51:50.041 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai
14:51:50.041 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts dst=null perm=null proto=rpc
14:51:50.042 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:50.043 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:50.043 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:50.044 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:50.045 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:50.046 INFO StateChange - BLOCK* allocate blk_1073741904_1080, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai
14:51:50.046 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741904_1080 src: /127.0.0.1:54734 dest: /127.0.0.1:34059
14:51:50.048 INFO clienttrace - src: /127.0.0.1:54734, dest: /127.0.0.1:34059, bytes: 5472, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741904_1080, duration(ns): 444879
14:51:50.048 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741904_1080, type=LAST_IN_PIPELINE terminating
14:51:50.048 INFO FSNamesystem - BLOCK* blk_1073741904_1080 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai
14:51:50.449 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:50.449 INFO IndexFileMerger - Done merging .bai files
14:51:50.450 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.parts dst=null perm=null proto=rpc
14:51:50.459 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=null proto=rpc
14:51:50.466 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi dst=null perm=null proto=rpc
14:51:50.467 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi dst=null perm=null proto=rpc
14:51:50.467 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi dst=null perm=null proto=rpc
14:51:50.468 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:50.468 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:50.469 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:50.469 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:50.469 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:50.470 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=null proto=rpc
14:51:50.470 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=null proto=rpc
14:51:50.471 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=null proto=rpc
14:51:50.472 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:50.473 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:50.474 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:50.474 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
14:51:50.474 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi dst=null perm=null proto=rpc
14:51:50.475 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi dst=null perm=null proto=rpc
14:51:50.475 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.sbi dst=null perm=null proto=rpc
14:51:50.476 WARN DFSUtil - Unexpected value for data transfer bytes=216 duration=0
14:51:50.476 INFO MemoryStore - Block broadcast_486 stored as values in memory (estimated size 320.0 B, free 1919.0 MiB)
14:51:50.476 INFO MemoryStore - Block broadcast_486_piece0 stored as bytes in memory (estimated size 233.0 B, free 1919.0 MiB)
14:51:50.477 INFO BlockManagerInfo - Added broadcast_486_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.9 MiB)
14:51:50.477 INFO SparkContext - Created broadcast 486 from broadcast at BamSource.java:104
14:51:50.478 INFO MemoryStore - Block broadcast_487 stored as values in memory (estimated size 297.9 KiB, free 1918.7 MiB)
14:51:50.484 INFO MemoryStore - Block broadcast_487_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.7 MiB)
14:51:50.484 INFO BlockManagerInfo - Added broadcast_487_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.8 MiB)
14:51:50.484 INFO SparkContext - Created broadcast 487 from newAPIHadoopFile at PathSplitSource.java:96
14:51:50.493 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:50.493 INFO FileInputFormat - Total input files to process : 1
14:51:50.494 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:50.508 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:50.508 INFO DAGScheduler - Got job 182 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:50.508 INFO DAGScheduler - Final stage: ResultStage 242 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:50.508 INFO DAGScheduler - Parents of final stage: List()
14:51:50.508 INFO DAGScheduler - Missing parents: List()
14:51:50.509 INFO DAGScheduler - Submitting ResultStage 242 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:50.515 INFO MemoryStore - Block broadcast_488 stored as values in memory (estimated size 148.2 KiB, free 1918.5 MiB)
14:51:50.516 INFO MemoryStore - Block broadcast_488_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.5 MiB)
14:51:50.516 INFO BlockManagerInfo - Added broadcast_488_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.8 MiB)
14:51:50.516 INFO SparkContext - Created broadcast 488 from broadcast at DAGScheduler.scala:1580
14:51:50.516 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 242 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:50.516 INFO TaskSchedulerImpl - Adding task set 242.0 with 1 tasks resource profile 0
14:51:50.517 INFO TaskSetManager - Starting task 0.0 in stage 242.0 (TID 298) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:50.517 INFO Executor - Running task 0.0 in stage 242.0 (TID 298)
14:51:50.529 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam:0+235514
14:51:50.529 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:50.530 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:50.531 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=null proto=rpc
14:51:50.531 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=null proto=rpc
14:51:50.531 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=null proto=rpc
14:51:50.533 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:50.535 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:50.535 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:50.537 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:50.539 INFO Executor - Finished task 0.0 in stage 242.0 (TID 298). 650141 bytes result sent to driver
14:51:50.541 INFO TaskSetManager - Finished task 0.0 in stage 242.0 (TID 298) in 24 ms on localhost (executor driver) (1/1)
14:51:50.541 INFO TaskSchedulerImpl - Removed TaskSet 242.0, whose tasks have all completed, from pool
14:51:50.541 INFO DAGScheduler - ResultStage 242 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.032 s
14:51:50.541 INFO DAGScheduler - Job 182 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:50.541 INFO TaskSchedulerImpl - Killing all running tasks in stage 242: Stage finished
14:51:50.541 INFO DAGScheduler - Job 182 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.032919 s
14:51:50.550 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:50.551 INFO DAGScheduler - Got job 183 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:50.551 INFO DAGScheduler - Final stage: ResultStage 243 (count at ReadsSparkSinkUnitTest.java:185)
14:51:50.551 INFO DAGScheduler - Parents of final stage: List()
14:51:50.551 INFO DAGScheduler - Missing parents: List()
14:51:50.551 INFO DAGScheduler - Submitting ResultStage 243 (MapPartitionsRDD[1148] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:50.568 INFO MemoryStore - Block broadcast_489 stored as values in memory (estimated size 426.1 KiB, free 1918.1 MiB)
14:51:50.569 INFO MemoryStore - Block broadcast_489_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1917.9 MiB)
14:51:50.569 INFO BlockManagerInfo - Added broadcast_489_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.6 MiB)
14:51:50.570 INFO SparkContext - Created broadcast 489 from broadcast at DAGScheduler.scala:1580
14:51:50.570 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 243 (MapPartitionsRDD[1148] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:50.570 INFO TaskSchedulerImpl - Adding task set 243.0 with 1 tasks resource profile 0
14:51:50.570 INFO TaskSetManager - Starting task 0.0 in stage 243.0 (TID 299) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7912 bytes)
14:51:50.571 INFO Executor - Running task 0.0 in stage 243.0 (TID 299)
14:51:50.602 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/expected.HiSeq.1mb.1RG.2k_lines.alternate.recalibrated.DIQ.bam:0+216896
14:51:50.614 INFO Executor - Finished task 0.0 in stage 243.0 (TID 299). 989 bytes result sent to driver
14:51:50.614 INFO TaskSetManager - Finished task 0.0 in stage 243.0 (TID 299) in 44 ms on localhost (executor driver) (1/1)
14:51:50.614 INFO TaskSchedulerImpl - Removed TaskSet 243.0, whose tasks have all completed, from pool
14:51:50.614 INFO DAGScheduler - ResultStage 243 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.063 s
14:51:50.614 INFO DAGScheduler - Job 183 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:50.614 INFO TaskSchedulerImpl - Killing all running tasks in stage 243: Stage finished
14:51:50.614 INFO DAGScheduler - Job 183 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.063820 s
14:51:50.618 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:50.618 INFO DAGScheduler - Got job 184 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:50.618 INFO DAGScheduler - Final stage: ResultStage 244 (count at ReadsSparkSinkUnitTest.java:185)
14:51:50.618 INFO DAGScheduler - Parents of final stage: List()
14:51:50.618 INFO DAGScheduler - Missing parents: List()
14:51:50.618 INFO DAGScheduler - Submitting ResultStage 244 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:50.624 INFO MemoryStore - Block broadcast_490 stored as values in memory (estimated size 148.1 KiB, free 1917.8 MiB)
14:51:50.625 INFO MemoryStore - Block broadcast_490_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1917.7 MiB)
14:51:50.625 INFO BlockManagerInfo - Added broadcast_490_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.6 MiB)
14:51:50.625 INFO SparkContext - Created broadcast 490 from broadcast at DAGScheduler.scala:1580
14:51:50.626 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 244 (MapPartitionsRDD[1166] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:50.626 INFO TaskSchedulerImpl - Adding task set 244.0 with 1 tasks resource profile 0
14:51:50.626 INFO TaskSetManager - Starting task 0.0 in stage 244.0 (TID 300) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:50.626 INFO Executor - Running task 0.0 in stage 244.0 (TID 300)
14:51:50.638 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam:0+235514
14:51:50.639 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:50.640 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam dst=null perm=null proto=rpc
14:51:50.640 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=null proto=rpc
14:51:50.641 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=null proto=rpc
14:51:50.641 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest2_825df899-4a57-4a10-8230-b0f5031b4021.bam.bai dst=null perm=null proto=rpc
14:51:50.643 WARN DFSUtil - Unexpected value for data transfer bytes=5760 duration=0
14:51:50.644 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:50.644 WARN DFSUtil - Unexpected value for data transfer bytes=5516 duration=0
14:51:50.646 WARN DFSUtil - Unexpected value for data transfer bytes=231570 duration=0
14:51:50.646 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:50.647 INFO Executor - Finished task 0.0 in stage 244.0 (TID 300). 989 bytes result sent to driver
14:51:50.648 INFO TaskSetManager - Finished task 0.0 in stage 244.0 (TID 300) in 22 ms on localhost (executor driver) (1/1)
14:51:50.648 INFO TaskSchedulerImpl - Removed TaskSet 244.0, whose tasks have all completed, from pool
14:51:50.648 INFO DAGScheduler - ResultStage 244 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.030 s
14:51:50.648 INFO DAGScheduler - Job 184 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:50.648 INFO TaskSchedulerImpl - Killing all running tasks in stage 244: Stage finished
14:51:50.648 INFO DAGScheduler - Job 184 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.030189 s
14:51:50.657 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:50.658 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:50.659 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:50.660 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:50.662 INFO MemoryStore - Block broadcast_491 stored as values in memory (estimated size 298.0 KiB, free 1917.4 MiB)
14:51:50.668 INFO MemoryStore - Block broadcast_491_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.4 MiB)
14:51:50.668 INFO BlockManagerInfo - Added broadcast_491_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:50.668 INFO SparkContext - Created broadcast 491 from newAPIHadoopFile at PathSplitSource.java:96
14:51:50.690 INFO MemoryStore - Block broadcast_492 stored as values in memory (estimated size 298.0 KiB, free 1917.1 MiB)
14:51:50.696 INFO MemoryStore - Block broadcast_492_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.0 MiB)
14:51:50.696 INFO BlockManagerInfo - Added broadcast_492_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:50.696 INFO SparkContext - Created broadcast 492 from newAPIHadoopFile at PathSplitSource.java:96
14:51:50.716 INFO FileInputFormat - Total input files to process : 1
14:51:50.718 INFO MemoryStore - Block broadcast_493 stored as values in memory (estimated size 19.6 KiB, free 1917.0 MiB)
14:51:50.718 INFO MemoryStore - Block broadcast_493_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.0 MiB)
14:51:50.718 INFO BlockManagerInfo - Added broadcast_493_piece0 in memory on localhost:44923 (size: 1890.0 B, free: 1919.5 MiB)
14:51:50.718 INFO SparkContext - Created broadcast 493 from broadcast at ReadsSparkSink.java:133
14:51:50.719 INFO MemoryStore - Block broadcast_494 stored as values in memory (estimated size 20.0 KiB, free 1917.0 MiB)
14:51:50.719 INFO MemoryStore - Block broadcast_494_piece0 stored as bytes in memory (estimated size 1890.0 B, free 1917.0 MiB)
14:51:50.719 INFO BlockManagerInfo - Added broadcast_494_piece0 in memory on localhost:44923 (size: 1890.0 B, free: 1919.5 MiB)
14:51:50.720 INFO SparkContext - Created broadcast 494 from broadcast at BamSink.java:76
14:51:50.722 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts dst=null perm=null proto=rpc
14:51:50.722 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:50.722 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:50.722 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:50.723 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:50.729 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:50.729 INFO DAGScheduler - Registering RDD 1180 (mapToPair at SparkUtils.java:161) as input to shuffle 49
14:51:50.729 INFO DAGScheduler - Got job 185 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:50.729 INFO DAGScheduler - Final stage: ResultStage 246 (runJob at SparkHadoopWriter.scala:83)
14:51:50.729 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 245)
14:51:50.729 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 245)
14:51:50.730 INFO DAGScheduler - Submitting ShuffleMapStage 245 (MapPartitionsRDD[1180] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:50.747 INFO MemoryStore - Block broadcast_495 stored as values in memory (estimated size 434.3 KiB, free 1916.6 MiB)
14:51:50.748 INFO MemoryStore - Block broadcast_495_piece0 stored as bytes in memory (estimated size 157.6 KiB, free 1916.4 MiB)
14:51:50.748 INFO BlockManagerInfo - Added broadcast_495_piece0 in memory on localhost:44923 (size: 157.6 KiB, free: 1919.3 MiB)
14:51:50.748 INFO SparkContext - Created broadcast 495 from broadcast at DAGScheduler.scala:1580
14:51:50.749 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 245 (MapPartitionsRDD[1180] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:50.749 INFO TaskSchedulerImpl - Adding task set 245.0 with 1 tasks resource profile 0
14:51:50.749 INFO TaskSetManager - Starting task 0.0 in stage 245.0 (TID 301) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7882 bytes)
14:51:50.749 INFO Executor - Running task 0.0 in stage 245.0 (TID 301)
14:51:50.782 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
14:51:50.795 INFO Executor - Finished task 0.0 in stage 245.0 (TID 301). 1148 bytes result sent to driver
14:51:50.795 INFO TaskSetManager - Finished task 0.0 in stage 245.0 (TID 301) in 46 ms on localhost (executor driver) (1/1)
14:51:50.795 INFO TaskSchedulerImpl - Removed TaskSet 245.0, whose tasks have all completed, from pool
14:51:50.796 INFO DAGScheduler - ShuffleMapStage 245 (mapToPair at SparkUtils.java:161) finished in 0.066 s
14:51:50.796 INFO DAGScheduler - looking for newly runnable stages
14:51:50.796 INFO DAGScheduler - running: HashSet()
14:51:50.796 INFO DAGScheduler - waiting: HashSet(ResultStage 246)
14:51:50.796 INFO DAGScheduler - failed: HashSet()
14:51:50.796 INFO DAGScheduler - Submitting ResultStage 246 (MapPartitionsRDD[1185] at mapToPair at BamSink.java:91), which has no missing parents
14:51:50.802 INFO MemoryStore - Block broadcast_496 stored as values in memory (estimated size 155.4 KiB, free 1916.3 MiB)
14:51:50.803 INFO MemoryStore - Block broadcast_496_piece0 stored as bytes in memory (estimated size 58.6 KiB, free 1916.2 MiB)
14:51:50.803 INFO BlockManagerInfo - Added broadcast_496_piece0 in memory on localhost:44923 (size: 58.6 KiB, free: 1919.2 MiB)
14:51:50.803 INFO SparkContext - Created broadcast 496 from broadcast at DAGScheduler.scala:1580
14:51:50.804 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 246 (MapPartitionsRDD[1185] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:50.804 INFO TaskSchedulerImpl - Adding task set 246.0 with 1 tasks resource profile 0
14:51:50.804 INFO TaskSetManager - Starting task 0.0 in stage 246.0 (TID 302) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:50.804 INFO Executor - Running task 0.0 in stage 246.0 (TID 302)
14:51:50.808 INFO ShuffleBlockFetcherIterator - Getting 1 (312.6 KiB) non-empty blocks including 1 (312.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:50.808 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:50.820 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:50.820 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:50.820 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:50.820 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:50.820 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:50.820 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:50.821 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:50.822 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/.part-r-00000.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:50.823 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/.part-r-00000.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:50.825 INFO StateChange - BLOCK* allocate blk_1073741905_1081, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/part-r-00000
14:51:50.826 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741905_1081 src: /127.0.0.1:54752 dest: /127.0.0.1:34059
14:51:50.828 INFO clienttrace - src: /127.0.0.1:54752, dest: /127.0.0.1:34059, bytes: 235299, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741905_1081, duration(ns): 1148913
14:51:50.828 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741905_1081, type=LAST_IN_PIPELINE terminating
14:51:50.828 INFO FSNamesystem - BLOCK* blk_1073741905_1081 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/part-r-00000
14:51:51.229 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:51.230 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/part-r-00000 dst=null perm=null proto=rpc
14:51:51.231 INFO StateChange - BLOCK* allocate blk_1073741906_1082, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/.part-r-00000.sbi
14:51:51.232 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741906_1082 src: /127.0.0.1:54764 dest: /127.0.0.1:34059
14:51:51.233 INFO clienttrace - src: /127.0.0.1:54764, dest: /127.0.0.1:34059, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741906_1082, duration(ns): 473210
14:51:51.233 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741906_1082, type=LAST_IN_PIPELINE terminating
14:51:51.234 INFO FSNamesystem - BLOCK* blk_1073741906_1082 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/.part-r-00000.sbi
14:51:51.635 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/.part-r-00000.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:51.636 INFO StateChange - BLOCK* allocate blk_1073741907_1083, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/.part-r-00000.bai
14:51:51.637 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741907_1083 src: /127.0.0.1:54766 dest: /127.0.0.1:34059
14:51:51.638 INFO clienttrace - src: /127.0.0.1:54766, dest: /127.0.0.1:34059, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741907_1083, duration(ns): 472764
14:51:51.638 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741907_1083, type=LAST_IN_PIPELINE terminating
14:51:51.638 INFO FSNamesystem - BLOCK* blk_1073741907_1083 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/.part-r-00000.bai
14:51:52.031 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741899_1075 replica FinalizedReplica, blk_1073741899_1075, FINALIZED
  getNumBytes()     = 212
  getBytesOnDisk()  = 212
  getVisibleLength()= 212
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741899 for deletion
14:51:52.031 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741900_1076 replica FinalizedReplica, blk_1073741900_1076, FINALIZED
  getNumBytes()     = 5472
  getBytesOnDisk()  = 5472
  getVisibleLength()= 5472
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741900 for deletion
14:51:52.031 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741899_1075 URI file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741899
14:51:52.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741900_1076 URI file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741900
14:51:52.039 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0/.part-r-00000.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:52.040 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0 dst=null perm=null proto=rpc
14:51:52.041 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0 dst=null perm=null proto=rpc
14:51:52.041 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/task_202603041451502624650439358907579_1185_r_000000 dst=null perm=null proto=rpc
14:51:52.042 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/_temporary/attempt_202603041451502624650439358907579_1185_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/task_202603041451502624650439358907579_1185_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:52.042 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451502624650439358907579_1185_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/task_202603041451502624650439358907579_1185_r_000000
14:51:52.042 INFO SparkHadoopMapRedUtil - attempt_202603041451502624650439358907579_1185_r_000000_0: Committed. Elapsed time: 1 ms.
14:51:52.042 INFO Executor - Finished task 0.0 in stage 246.0 (TID 302). 1858 bytes result sent to driver
14:51:52.043 INFO TaskSetManager - Finished task 0.0 in stage 246.0 (TID 302) in 1239 ms on localhost (executor driver) (1/1)
14:51:52.043 INFO TaskSchedulerImpl - Removed TaskSet 246.0, whose tasks have all completed, from pool
14:51:52.043 INFO DAGScheduler - ResultStage 246 (runJob at SparkHadoopWriter.scala:83) finished in 1.247 s
14:51:52.043 INFO DAGScheduler - Job 185 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:52.043 INFO TaskSchedulerImpl - Killing all running tasks in stage 246: Stage finished
14:51:52.043 INFO DAGScheduler - Job 185 finished: runJob at SparkHadoopWriter.scala:83, took 1.314399 s
14:51:52.044 INFO SparkHadoopWriter - Start to commit write Job job_202603041451502624650439358907579_1185.
14:51:52.044 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:52.045 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts dst=null perm=null proto=rpc
14:51:52.045 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/task_202603041451502624650439358907579_1185_r_000000 dst=null perm=null proto=rpc
14:51:52.045 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:52.046 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/task_202603041451502624650439358907579_1185_r_000000/.part-r-00000.bai dst=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.part-r-00000.bai perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:52.046 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:52.047 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/task_202603041451502624650439358907579_1185_r_000000/.part-r-00000.sbi dst=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.part-r-00000.sbi perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:52.047 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:52.047 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary/0/task_202603041451502624650439358907579_1185_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:52.048 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_temporary dst=null perm=null proto=rpc
14:51:52.049 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:52.050 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:52.050 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.spark-staging-1185 dst=null perm=null proto=rpc
14:51:52.050 INFO SparkHadoopWriter - Write Job job_202603041451502624650439358907579_1185 committed. Elapsed time: 6 ms.
14:51:52.051 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:52.052 INFO StateChange - BLOCK* allocate blk_1073741908_1084, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/header
14:51:52.053 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741908_1084 src: /127.0.0.1:54778 dest: /127.0.0.1:34059
14:51:52.055 INFO clienttrace - src: /127.0.0.1:54778, dest: /127.0.0.1:34059, bytes: 1190, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741908_1084, duration(ns): 495788
14:51:52.055 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741908_1084, type=LAST_IN_PIPELINE terminating
14:51:52.055 INFO FSNamesystem - BLOCK* blk_1073741908_1084 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/header
14:51:52.456 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:52.457 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:52.458 INFO StateChange - BLOCK* allocate blk_1073741909_1085, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/terminator
14:51:52.459 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741909_1085 src: /127.0.0.1:54794 dest: /127.0.0.1:34059
14:51:52.460 INFO clienttrace - src: /127.0.0.1:54794, dest: /127.0.0.1:34059, bytes: 28, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741909_1085, duration(ns): 502076
14:51:52.460 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741909_1085, type=LAST_IN_PIPELINE terminating
14:51:52.461 INFO FSNamesystem - BLOCK* blk_1073741909_1085 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/terminator
14:51:52.862 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:52.863 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts dst=null perm=null proto=rpc
14:51:52.864 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:52.864 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:52.864 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam
14:51:52.865 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/header, /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:52.866 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:52.866 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:52.867 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:52.867 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam done
14:51:52.867 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:52.867 INFO IndexFileMerger - Merging .sbi files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi
14:51:52.868 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts dst=null perm=null proto=rpc
14:51:52.869 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:52.870 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:52.870 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:52.871 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
14:51:52.871 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.part-r-00000.sbi dst=null perm=null proto=rpc
14:51:52.872 INFO StateChange - BLOCK* allocate blk_1073741910_1086, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi
14:51:52.873 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741910_1086 src: /127.0.0.1:54806 dest: /127.0.0.1:34059
14:51:52.875 INFO clienttrace - src: /127.0.0.1:54806, dest: /127.0.0.1:34059, bytes: 204, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741910_1086, duration(ns): 526869
14:51:52.875 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741910_1086, type=LAST_IN_PIPELINE terminating
14:51:52.875 INFO FSNamesystem - BLOCK* blk_1073741910_1086 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi
14:51:53.276 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:53.277 INFO IndexFileMerger - Done merging .sbi files
14:51:53.277 INFO IndexFileMerger - Merging .bai files in temp directory hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/ to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai
14:51:53.277 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts dst=null perm=null proto=rpc
14:51:53.278 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:53.279 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:53.279 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:53.280 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:53.280 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts/.part-r-00000.bai dst=null perm=null proto=rpc
14:51:53.281 INFO StateChange - BLOCK* allocate blk_1073741911_1087, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai
14:51:53.282 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741911_1087 src: /127.0.0.1:54810 dest: /127.0.0.1:34059
14:51:53.283 INFO clienttrace - src: /127.0.0.1:54810, dest: /127.0.0.1:34059, bytes: 592, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741911_1087, duration(ns): 438547
14:51:53.283 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741911_1087, type=LAST_IN_PIPELINE terminating
14:51:53.284 INFO FSNamesystem - BLOCK* blk_1073741911_1087 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai
14:51:53.684 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:53.685 INFO IndexFileMerger - Done merging .bai files
14:51:53.685 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.parts dst=null perm=null proto=rpc
14:51:53.695 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=null proto=rpc
14:51:53.704 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi dst=null perm=null proto=rpc
14:51:53.704 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi dst=null perm=null proto=rpc
14:51:53.705 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi dst=null perm=null proto=rpc
14:51:53.705 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
14:51:53.706 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:53.706 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:53.707 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:53.707 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:53.708 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=null proto=rpc
14:51:53.708 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=null proto=rpc
14:51:53.708 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=null proto=rpc
14:51:53.710 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
14:51:53.711 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:53.711 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:53.711 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
14:51:53.712 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi dst=null perm=null proto=rpc
14:51:53.712 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi dst=null perm=null proto=rpc
14:51:53.712 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.sbi dst=null perm=null proto=rpc
14:51:53.713 WARN DFSUtil - Unexpected value for data transfer bytes=208 duration=0
14:51:53.713 INFO MemoryStore - Block broadcast_497 stored as values in memory (estimated size 312.0 B, free 1916.2 MiB)
14:51:53.714 INFO MemoryStore - Block broadcast_497_piece0 stored as bytes in memory (estimated size 231.0 B, free 1916.2 MiB)
14:51:53.714 INFO BlockManagerInfo - Added broadcast_497_piece0 in memory on localhost:44923 (size: 231.0 B, free: 1919.2 MiB)
14:51:53.714 INFO SparkContext - Created broadcast 497 from broadcast at BamSource.java:104
14:51:53.715 INFO MemoryStore - Block broadcast_498 stored as values in memory (estimated size 297.9 KiB, free 1915.9 MiB)
14:51:53.721 INFO BlockManagerInfo - Removed broadcast_483_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.3 MiB)
14:51:53.721 INFO BlockManagerInfo - Removed broadcast_496_piece0 on localhost:44923 in memory (size: 58.6 KiB, free: 1919.3 MiB)
14:51:53.722 INFO BlockManagerInfo - Removed broadcast_494_piece0 on localhost:44923 in memory (size: 1890.0 B, free: 1919.3 MiB)
14:51:53.722 INFO BlockManagerInfo - Removed broadcast_489_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.5 MiB)
14:51:53.722 INFO BlockManagerInfo - Removed broadcast_493_piece0 on localhost:44923 in memory (size: 1890.0 B, free: 1919.5 MiB)
14:51:53.723 INFO BlockManagerInfo - Removed broadcast_487_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:53.723 INFO BlockManagerInfo - Removed broadcast_486_piece0 on localhost:44923 in memory (size: 233.0 B, free: 1919.5 MiB)
14:51:53.724 INFO BlockManagerInfo - Removed broadcast_492_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:53.724 INFO BlockManagerInfo - Removed broadcast_480_piece0 on localhost:44923 in memory (size: 50.3 KiB, free: 1919.6 MiB)
14:51:53.725 INFO BlockManagerInfo - Removed broadcast_482_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.6 MiB)
14:51:53.725 INFO BlockManagerInfo - Removed broadcast_485_piece0 on localhost:44923 in memory (size: 67.1 KiB, free: 1919.7 MiB)
14:51:53.725 INFO BlockManagerInfo - Removed broadcast_488_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.7 MiB)
14:51:53.726 INFO BlockManagerInfo - Removed broadcast_495_piece0 on localhost:44923 in memory (size: 157.6 KiB, free: 1919.9 MiB)
14:51:53.726 INFO BlockManagerInfo - Removed broadcast_490_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1920.0 MiB)
14:51:53.728 INFO MemoryStore - Block broadcast_498_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.3 MiB)
14:51:53.728 INFO BlockManagerInfo - Added broadcast_498_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:53.728 INFO SparkContext - Created broadcast 498 from newAPIHadoopFile at PathSplitSource.java:96
14:51:53.737 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:53.737 INFO FileInputFormat - Total input files to process : 1
14:51:53.738 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:53.752 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:53.753 INFO DAGScheduler - Got job 186 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:53.753 INFO DAGScheduler - Final stage: ResultStage 247 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:53.753 INFO DAGScheduler - Parents of final stage: List()
14:51:53.753 INFO DAGScheduler - Missing parents: List()
14:51:53.753 INFO DAGScheduler - Submitting ResultStage 247 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:53.759 INFO MemoryStore - Block broadcast_499 stored as values in memory (estimated size 148.2 KiB, free 1919.2 MiB)
14:51:53.760 INFO MemoryStore - Block broadcast_499_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1919.1 MiB)
14:51:53.760 INFO BlockManagerInfo - Added broadcast_499_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.8 MiB)
14:51:53.760 INFO SparkContext - Created broadcast 499 from broadcast at DAGScheduler.scala:1580
14:51:53.761 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 247 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:53.761 INFO TaskSchedulerImpl - Adding task set 247.0 with 1 tasks resource profile 0
14:51:53.761 INFO TaskSetManager - Starting task 0.0 in stage 247.0 (TID 303) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:53.762 INFO Executor - Running task 0.0 in stage 247.0 (TID 303)
14:51:53.775 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam:0+236517
14:51:53.776 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:53.777 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:53.777 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=null proto=rpc
14:51:53.778 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=null proto=rpc
14:51:53.778 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=null proto=rpc
14:51:53.780 WARN DFSUtil - Unexpected value for data transfer bytes=1202 duration=0
14:51:53.781 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:53.782 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:53.783 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
14:51:53.784 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:53.787 INFO Executor - Finished task 0.0 in stage 247.0 (TID 303). 749513 bytes result sent to driver
14:51:53.790 INFO TaskSetManager - Finished task 0.0 in stage 247.0 (TID 303) in 28 ms on localhost (executor driver) (1/1)
14:51:53.790 INFO TaskSchedulerImpl - Removed TaskSet 247.0, whose tasks have all completed, from pool
14:51:53.790 INFO DAGScheduler - ResultStage 247 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.037 s
14:51:53.790 INFO DAGScheduler - Job 186 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:53.790 INFO TaskSchedulerImpl - Killing all running tasks in stage 247: Stage finished
14:51:53.790 INFO DAGScheduler - Job 186 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.037510 s
14:51:53.800 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:53.801 INFO DAGScheduler - Got job 187 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:53.801 INFO DAGScheduler - Final stage: ResultStage 248 (count at ReadsSparkSinkUnitTest.java:185)
14:51:53.801 INFO DAGScheduler - Parents of final stage: List()
14:51:53.801 INFO DAGScheduler - Missing parents: List()
14:51:53.801 INFO DAGScheduler - Submitting ResultStage 248 (MapPartitionsRDD[1173] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:53.823 INFO MemoryStore - Block broadcast_500 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
14:51:53.825 INFO MemoryStore - Block broadcast_500_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
14:51:53.825 INFO BlockManagerInfo - Added broadcast_500_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.7 MiB)
14:51:53.825 INFO SparkContext - Created broadcast 500 from broadcast at DAGScheduler.scala:1580
14:51:53.825 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 248 (MapPartitionsRDD[1173] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:53.825 INFO TaskSchedulerImpl - Adding task set 248.0 with 1 tasks resource profile 0
14:51:53.826 INFO TaskSetManager - Starting task 0.0 in stage 248.0 (TID 304) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7893 bytes)
14:51:53.826 INFO Executor - Running task 0.0 in stage 248.0 (TID 304)
14:51:53.866 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/CEUTrio.HiSeq.WGS.b37.ch20.1m-1m1k.NA12878.bam:0+211123
14:51:53.874 INFO Executor - Finished task 0.0 in stage 248.0 (TID 304). 989 bytes result sent to driver
14:51:53.874 INFO TaskSetManager - Finished task 0.0 in stage 248.0 (TID 304) in 48 ms on localhost (executor driver) (1/1)
14:51:53.874 INFO TaskSchedulerImpl - Removed TaskSet 248.0, whose tasks have all completed, from pool
14:51:53.875 INFO DAGScheduler - ResultStage 248 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.074 s
14:51:53.875 INFO DAGScheduler - Job 187 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:53.875 INFO TaskSchedulerImpl - Killing all running tasks in stage 248: Stage finished
14:51:53.875 INFO DAGScheduler - Job 187 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.074437 s
14:51:53.878 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:53.879 INFO DAGScheduler - Got job 188 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:53.879 INFO DAGScheduler - Final stage: ResultStage 249 (count at ReadsSparkSinkUnitTest.java:185)
14:51:53.879 INFO DAGScheduler - Parents of final stage: List()
14:51:53.879 INFO DAGScheduler - Missing parents: List()
14:51:53.879 INFO DAGScheduler - Submitting ResultStage 249 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:53.885 INFO MemoryStore - Block broadcast_501 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
14:51:53.886 INFO MemoryStore - Block broadcast_501_piece0 stored as bytes in memory (estimated size 54.6 KiB, free 1918.4 MiB)
14:51:53.886 INFO BlockManagerInfo - Added broadcast_501_piece0 in memory on localhost:44923 (size: 54.6 KiB, free: 1919.6 MiB)
14:51:53.886 INFO SparkContext - Created broadcast 501 from broadcast at DAGScheduler.scala:1580
14:51:53.886 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 249 (MapPartitionsRDD[1191] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:53.886 INFO TaskSchedulerImpl - Adding task set 249.0 with 1 tasks resource profile 0
14:51:53.887 INFO TaskSetManager - Starting task 0.0 in stage 249.0 (TID 305) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:53.887 INFO Executor - Running task 0.0 in stage 249.0 (TID 305)
14:51:53.900 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam:0+236517
14:51:53.901 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:53.901 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam dst=null perm=null proto=rpc
14:51:53.902 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=null proto=rpc
14:51:53.902 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=null proto=rpc
14:51:53.903 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest3_195088ac-e76d-41ce-8376-1ba8fc8dab50.bam.bai dst=null perm=null proto=rpc
14:51:53.905 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:53.906 WARN DFSUtil - Unexpected value for data transfer bytes=600 duration=0
14:51:53.907 WARN DFSUtil - Unexpected value for data transfer bytes=237139 duration=0
14:51:53.907 WARN DFSUtil - Unexpected value for data transfer bytes=32 duration=0
14:51:53.909 INFO Executor - Finished task 0.0 in stage 249.0 (TID 305). 989 bytes result sent to driver
14:51:53.909 INFO TaskSetManager - Finished task 0.0 in stage 249.0 (TID 305) in 22 ms on localhost (executor driver) (1/1)
14:51:53.909 INFO TaskSchedulerImpl - Removed TaskSet 249.0, whose tasks have all completed, from pool
14:51:53.909 INFO DAGScheduler - ResultStage 249 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.030 s
14:51:53.909 INFO DAGScheduler - Job 188 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:53.909 INFO TaskSchedulerImpl - Killing all running tasks in stage 249: Stage finished
14:51:53.909 INFO DAGScheduler - Job 188 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.031055 s
14:51:53.919 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:53.920 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:53.921 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:53.921 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:53.923 INFO MemoryStore - Block broadcast_502 stored as values in memory (estimated size 576.0 B, free 1918.4 MiB)
14:51:53.924 INFO MemoryStore - Block broadcast_502_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.4 MiB)
14:51:53.924 INFO BlockManagerInfo - Added broadcast_502_piece0 in memory on localhost:44923 (size: 228.0 B, free: 1919.6 MiB)
14:51:53.924 INFO SparkContext - Created broadcast 502 from broadcast at CramSource.java:114
14:51:53.925 INFO MemoryStore - Block broadcast_503 stored as values in memory (estimated size 297.9 KiB, free 1918.1 MiB)
14:51:53.931 INFO MemoryStore - Block broadcast_503_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.0 MiB)
14:51:53.931 INFO BlockManagerInfo - Added broadcast_503_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:53.932 INFO SparkContext - Created broadcast 503 from newAPIHadoopFile at PathSplitSource.java:96
14:51:53.948 INFO MemoryStore - Block broadcast_504 stored as values in memory (estimated size 576.0 B, free 1918.0 MiB)
14:51:53.948 INFO MemoryStore - Block broadcast_504_piece0 stored as bytes in memory (estimated size 228.0 B, free 1918.0 MiB)
14:51:53.948 INFO BlockManagerInfo - Added broadcast_504_piece0 in memory on localhost:44923 (size: 228.0 B, free: 1919.6 MiB)
14:51:53.949 INFO SparkContext - Created broadcast 504 from broadcast at CramSource.java:114
14:51:53.950 INFO MemoryStore - Block broadcast_505 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
14:51:53.956 INFO MemoryStore - Block broadcast_505_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.7 MiB)
14:51:53.956 INFO BlockManagerInfo - Added broadcast_505_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:53.956 INFO SparkContext - Created broadcast 505 from newAPIHadoopFile at PathSplitSource.java:96
14:51:53.971 INFO FileInputFormat - Total input files to process : 1
14:51:53.972 INFO MemoryStore - Block broadcast_506 stored as values in memory (estimated size 6.0 KiB, free 1917.7 MiB)
14:51:53.973 INFO MemoryStore - Block broadcast_506_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
14:51:53.973 INFO BlockManagerInfo - Added broadcast_506_piece0 in memory on localhost:44923 (size: 1473.0 B, free: 1919.5 MiB)
14:51:53.973 INFO SparkContext - Created broadcast 506 from broadcast at ReadsSparkSink.java:133
14:51:53.974 INFO MemoryStore - Block broadcast_507 stored as values in memory (estimated size 6.2 KiB, free 1917.7 MiB)
14:51:53.974 INFO MemoryStore - Block broadcast_507_piece0 stored as bytes in memory (estimated size 1473.0 B, free 1917.7 MiB)
14:51:53.974 INFO BlockManagerInfo - Added broadcast_507_piece0 in memory on localhost:44923 (size: 1473.0 B, free: 1919.5 MiB)
14:51:53.975 INFO SparkContext - Created broadcast 507 from broadcast at CramSink.java:76
14:51:53.977 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts dst=null perm=null proto=rpc
14:51:53.977 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:53.977 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:53.977 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:53.978 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:53.984 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:53.985 INFO DAGScheduler - Registering RDD 1203 (mapToPair at SparkUtils.java:161) as input to shuffle 50
14:51:53.985 INFO DAGScheduler - Got job 189 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:53.985 INFO DAGScheduler - Final stage: ResultStage 251 (runJob at SparkHadoopWriter.scala:83)
14:51:53.985 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 250)
14:51:53.985 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 250)
14:51:53.985 INFO DAGScheduler - Submitting ShuffleMapStage 250 (MapPartitionsRDD[1203] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:53.997 INFO MemoryStore - Block broadcast_508 stored as values in memory (estimated size 292.8 KiB, free 1917.4 MiB)
14:51:53.998 INFO MemoryStore - Block broadcast_508_piece0 stored as bytes in memory (estimated size 107.3 KiB, free 1917.3 MiB)
14:51:53.998 INFO BlockManagerInfo - Added broadcast_508_piece0 in memory on localhost:44923 (size: 107.3 KiB, free: 1919.4 MiB)
14:51:53.999 INFO SparkContext - Created broadcast 508 from broadcast at DAGScheduler.scala:1580
14:51:53.999 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 250 (MapPartitionsRDD[1203] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:53.999 INFO TaskSchedulerImpl - Adding task set 250.0 with 1 tasks resource profile 0
14:51:53.999 INFO TaskSetManager - Starting task 0.0 in stage 250.0 (TID 306) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7869 bytes)
14:51:54.000 INFO Executor - Running task 0.0 in stage 250.0 (TID 306)
14:51:54.023 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
14:51:54.033 INFO Executor - Finished task 0.0 in stage 250.0 (TID 306). 1148 bytes result sent to driver
14:51:54.033 INFO TaskSetManager - Finished task 0.0 in stage 250.0 (TID 306) in 34 ms on localhost (executor driver) (1/1)
14:51:54.033 INFO TaskSchedulerImpl - Removed TaskSet 250.0, whose tasks have all completed, from pool
14:51:54.034 INFO DAGScheduler - ShuffleMapStage 250 (mapToPair at SparkUtils.java:161) finished in 0.048 s
14:51:54.034 INFO DAGScheduler - looking for newly runnable stages
14:51:54.034 INFO DAGScheduler - running: HashSet()
14:51:54.034 INFO DAGScheduler - waiting: HashSet(ResultStage 251)
14:51:54.034 INFO DAGScheduler - failed: HashSet()
14:51:54.034 INFO DAGScheduler - Submitting ResultStage 251 (MapPartitionsRDD[1208] at mapToPair at CramSink.java:89), which has no missing parents
14:51:54.045 INFO MemoryStore - Block broadcast_509 stored as values in memory (estimated size 153.3 KiB, free 1917.1 MiB)
14:51:54.046 INFO MemoryStore - Block broadcast_509_piece0 stored as bytes in memory (estimated size 58.1 KiB, free 1917.1 MiB)
14:51:54.046 INFO BlockManagerInfo - Added broadcast_509_piece0 in memory on localhost:44923 (size: 58.1 KiB, free: 1919.4 MiB)
14:51:54.046 INFO SparkContext - Created broadcast 509 from broadcast at DAGScheduler.scala:1580
14:51:54.046 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 251 (MapPartitionsRDD[1208] at mapToPair at CramSink.java:89) (first 15 tasks are for partitions Vector(0))
14:51:54.046 INFO TaskSchedulerImpl - Adding task set 251.0 with 1 tasks resource profile 0
14:51:54.047 INFO TaskSetManager - Starting task 0.0 in stage 251.0 (TID 307) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:54.047 INFO Executor - Running task 0.0 in stage 251.0 (TID 307)
14:51:54.051 INFO ShuffleBlockFetcherIterator - Getting 1 (82.3 KiB) non-empty blocks including 1 (82.3 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:54.051 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:54.058 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:54.058 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:54.058 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:54.059 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:54.059 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:54.059 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:54.060 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/_temporary/attempt_202603041451534216970786347592322_1208_r_000000_0/part-r-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:54.085 INFO StateChange - BLOCK* allocate blk_1073741912_1088, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/_temporary/attempt_202603041451534216970786347592322_1208_r_000000_0/part-r-00000
14:51:54.086 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741912_1088 src: /127.0.0.1:54816 dest: /127.0.0.1:34059
14:51:54.088 INFO clienttrace - src: /127.0.0.1:54816, dest: /127.0.0.1:34059, bytes: 42659, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741912_1088, duration(ns): 527801
14:51:54.088 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741912_1088, type=LAST_IN_PIPELINE terminating
14:51:54.089 INFO FSNamesystem - BLOCK* blk_1073741912_1088 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/_temporary/attempt_202603041451534216970786347592322_1208_r_000000_0/part-r-00000
14:51:54.490 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/_temporary/attempt_202603041451534216970786347592322_1208_r_000000_0/part-r-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:54.491 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/_temporary/attempt_202603041451534216970786347592322_1208_r_000000_0 dst=null perm=null proto=rpc
14:51:54.491 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/_temporary/attempt_202603041451534216970786347592322_1208_r_000000_0 dst=null perm=null proto=rpc
14:51:54.491 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/task_202603041451534216970786347592322_1208_r_000000 dst=null perm=null proto=rpc
14:51:54.492 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/_temporary/attempt_202603041451534216970786347592322_1208_r_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/task_202603041451534216970786347592322_1208_r_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:54.492 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451534216970786347592322_1208_r_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/task_202603041451534216970786347592322_1208_r_000000
14:51:54.492 INFO SparkHadoopMapRedUtil - attempt_202603041451534216970786347592322_1208_r_000000_0: Committed. Elapsed time: 1 ms.
14:51:54.497 INFO Executor - Finished task 0.0 in stage 251.0 (TID 307). 1944 bytes result sent to driver
14:51:54.497 INFO BlockManagerInfo - Removed broadcast_499_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.4 MiB)
14:51:54.498 INFO BlockManagerInfo - Removed broadcast_505_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.5 MiB)
14:51:54.498 INFO TaskSetManager - Finished task 0.0 in stage 251.0 (TID 307) in 451 ms on localhost (executor driver) (1/1)
14:51:54.498 INFO TaskSchedulerImpl - Removed TaskSet 251.0, whose tasks have all completed, from pool
14:51:54.498 INFO DAGScheduler - ResultStage 251 (runJob at SparkHadoopWriter.scala:83) finished in 0.464 s
14:51:54.498 INFO DAGScheduler - Job 189 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:54.498 INFO TaskSchedulerImpl - Killing all running tasks in stage 251: Stage finished
14:51:54.498 INFO DAGScheduler - Job 189 finished: runJob at SparkHadoopWriter.scala:83, took 0.513488 s
14:51:54.498 INFO BlockManagerInfo - Removed broadcast_501_piece0 on localhost:44923 in memory (size: 54.6 KiB, free: 1919.5 MiB)
14:51:54.498 INFO SparkHadoopWriter - Start to commit write Job job_202603041451534216970786347592322_1208.
14:51:54.498 INFO BlockManagerInfo - Removed broadcast_508_piece0 on localhost:44923 in memory (size: 107.3 KiB, free: 1919.6 MiB)
14:51:54.499 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:54.499 INFO BlockManagerInfo - Removed broadcast_504_piece0 on localhost:44923 in memory (size: 228.0 B, free: 1919.6 MiB)
14:51:54.499 INFO BlockManagerInfo - Removed broadcast_500_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.8 MiB)
14:51:54.500 INFO BlockManagerInfo - Removed broadcast_498_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.8 MiB)
14:51:54.500 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts dst=null perm=null proto=rpc
14:51:54.500 INFO BlockManagerInfo - Removed broadcast_497_piece0 on localhost:44923 in memory (size: 231.0 B, free: 1919.8 MiB)
14:51:54.500 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/task_202603041451534216970786347592322_1208_r_000000 dst=null perm=null proto=rpc
14:51:54.501 INFO BlockManagerInfo - Removed broadcast_491_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.9 MiB)
14:51:54.501 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/part-r-00000 dst=null perm=null proto=rpc
14:51:54.502 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary/0/task_202603041451534216970786347592322_1208_r_000000/part-r-00000 dst=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/part-r-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:54.502 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_temporary dst=null perm=null proto=rpc
14:51:54.503 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:54.503 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:54.504 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/.spark-staging-1208 dst=null perm=null proto=rpc
14:51:54.504 INFO SparkHadoopWriter - Write Job job_202603041451534216970786347592322_1208 committed. Elapsed time: 5 ms.
14:51:54.505 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:54.507 INFO StateChange - BLOCK* allocate blk_1073741913_1089, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/header
14:51:54.508 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741913_1089 src: /127.0.0.1:54830 dest: /127.0.0.1:34059
14:51:54.509 INFO clienttrace - src: /127.0.0.1:54830, dest: /127.0.0.1:34059, bytes: 1016, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741913_1089, duration(ns): 453258
14:51:54.509 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741913_1089, type=LAST_IN_PIPELINE terminating
14:51:54.510 INFO FSNamesystem - BLOCK* blk_1073741913_1089 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/header
14:51:54.911 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:54.912 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/terminator dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:54.913 INFO StateChange - BLOCK* allocate blk_1073741914_1090, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/terminator
14:51:54.914 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741914_1090 src: /127.0.0.1:54840 dest: /127.0.0.1:34059
14:51:54.915 INFO clienttrace - src: /127.0.0.1:54840, dest: /127.0.0.1:34059, bytes: 38, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741914_1090, duration(ns): 439272
14:51:54.915 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741914_1090, type=LAST_IN_PIPELINE terminating
14:51:54.916 INFO FSNamesystem - BLOCK* blk_1073741914_1090 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/terminator
14:51:55.032 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741906_1082 replica FinalizedReplica, blk_1073741906_1082, FINALIZED
  getNumBytes()     = 204
  getBytesOnDisk()  = 204
  getVisibleLength()= 204
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data2
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741906 for deletion
14:51:55.032 INFO FsDatasetAsyncDiskService - Scheduling blk_1073741907_1083 replica FinalizedReplica, blk_1073741907_1083, FINALIZED
  getNumBytes()     = 592
  getBytesOnDisk()  = 592
  getVisibleLength()= 592
  getVolume()       = /tmp/minicluster_storage16268522075870465194/data/data1
  getBlockURI()     = file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741907 for deletion
14:51:55.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741906_1082 URI file:/tmp/minicluster_storage16268522075870465194/data/data2/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741906
14:51:55.032 INFO FsDatasetAsyncDiskService - Deleted BP-1768883704-10.1.0.125-1772635868711 blk_1073741907_1083 URI file:/tmp/minicluster_storage16268522075870465194/data/data1/current/BP-1768883704-10.1.0.125-1772635868711/current/finalized/subdir0/subdir0/blk_1073741907
14:51:55.317 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/terminator is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:55.317 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts dst=null perm=null proto=rpc
14:51:55.318 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:55.319 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:55.319 INFO HadoopFileSystemWrapper - Concatenating 3 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram
14:51:55.320 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/header, /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/part-r-00000, /user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/terminator] dst=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:55.320 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.320 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.321 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts/output dst=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:55.321 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram done
14:51:55.321 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.parts dst=null perm=null proto=rpc
14:51:55.322 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.322 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.322 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.323 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.323 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.crai dst=null perm=null proto=rpc
14:51:55.324 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.crai dst=null perm=null proto=rpc
14:51:55.326 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
14:51:55.326 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
14:51:55.327 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.327 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.328 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.crai dst=null perm=null proto=rpc
14:51:55.328 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.crai dst=null perm=null proto=rpc
14:51:55.328 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.329 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.330 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
14:51:55.330 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
14:51:55.331 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
14:51:55.331 INFO MemoryStore - Block broadcast_510 stored as values in memory (estimated size 528.0 B, free 1919.4 MiB)
14:51:55.332 INFO MemoryStore - Block broadcast_510_piece0 stored as bytes in memory (estimated size 187.0 B, free 1919.4 MiB)
14:51:55.332 INFO BlockManagerInfo - Added broadcast_510_piece0 in memory on localhost:44923 (size: 187.0 B, free: 1919.9 MiB)
14:51:55.332 INFO SparkContext - Created broadcast 510 from broadcast at CramSource.java:114
14:51:55.333 INFO MemoryStore - Block broadcast_511 stored as values in memory (estimated size 297.9 KiB, free 1919.1 MiB)
14:51:55.342 INFO MemoryStore - Block broadcast_511_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.1 MiB)
14:51:55.342 INFO BlockManagerInfo - Added broadcast_511_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.8 MiB)
14:51:55.342 INFO SparkContext - Created broadcast 511 from newAPIHadoopFile at PathSplitSource.java:96
14:51:55.357 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.357 INFO FileInputFormat - Total input files to process : 1
14:51:55.358 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.384 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:55.384 INFO DAGScheduler - Got job 190 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:55.384 INFO DAGScheduler - Final stage: ResultStage 252 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:55.384 INFO DAGScheduler - Parents of final stage: List()
14:51:55.384 INFO DAGScheduler - Missing parents: List()
14:51:55.384 INFO DAGScheduler - Submitting ResultStage 252 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:55.396 INFO MemoryStore - Block broadcast_512 stored as values in memory (estimated size 286.8 KiB, free 1918.8 MiB)
14:51:55.397 INFO MemoryStore - Block broadcast_512_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.7 MiB)
14:51:55.397 INFO BlockManagerInfo - Added broadcast_512_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.7 MiB)
14:51:55.397 INFO SparkContext - Created broadcast 512 from broadcast at DAGScheduler.scala:1580
14:51:55.398 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 252 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:55.398 INFO TaskSchedulerImpl - Adding task set 252.0 with 1 tasks resource profile 0
14:51:55.398 INFO TaskSetManager - Starting task 0.0 in stage 252.0 (TID 308) (localhost, executor driver, partition 0, ANY, 7853 bytes)
14:51:55.399 INFO Executor - Running task 0.0 in stage 252.0 (TID 308)
14:51:55.421 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram:0+43713
14:51:55.421 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.422 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.423 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.crai dst=null perm=null proto=rpc
14:51:55.423 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.crai dst=null perm=null proto=rpc
14:51:55.425 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
14:51:55.426 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
14:51:55.426 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
14:51:55.437 INFO Executor - Finished task 0.0 in stage 252.0 (TID 308). 154058 bytes result sent to driver
14:51:55.438 INFO TaskSetManager - Finished task 0.0 in stage 252.0 (TID 308) in 39 ms on localhost (executor driver) (1/1)
14:51:55.438 INFO TaskSchedulerImpl - Removed TaskSet 252.0, whose tasks have all completed, from pool
14:51:55.438 INFO DAGScheduler - ResultStage 252 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.053 s
14:51:55.438 INFO DAGScheduler - Job 190 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:55.438 INFO TaskSchedulerImpl - Killing all running tasks in stage 252: Stage finished
14:51:55.438 INFO DAGScheduler - Job 190 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.054096 s
14:51:55.445 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:55.446 INFO DAGScheduler - Got job 191 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:55.446 INFO DAGScheduler - Final stage: ResultStage 253 (count at ReadsSparkSinkUnitTest.java:185)
14:51:55.446 INFO DAGScheduler - Parents of final stage: List()
14:51:55.446 INFO DAGScheduler - Missing parents: List()
14:51:55.446 INFO DAGScheduler - Submitting ResultStage 253 (MapPartitionsRDD[1197] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:55.464 INFO MemoryStore - Block broadcast_513 stored as values in memory (estimated size 286.8 KiB, free 1918.4 MiB)
14:51:55.465 INFO MemoryStore - Block broadcast_513_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.3 MiB)
14:51:55.465 INFO BlockManagerInfo - Added broadcast_513_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.6 MiB)
14:51:55.465 INFO SparkContext - Created broadcast 513 from broadcast at DAGScheduler.scala:1580
14:51:55.465 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 253 (MapPartitionsRDD[1197] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:55.465 INFO TaskSchedulerImpl - Adding task set 253.0 with 1 tasks resource profile 0
14:51:55.466 INFO TaskSetManager - Starting task 0.0 in stage 253.0 (TID 309) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7880 bytes)
14:51:55.466 INFO Executor - Running task 0.0 in stage 253.0 (TID 309)
14:51:55.488 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/NA12878.chr17_69k_70k.dictFix.cram:0+50619
14:51:55.494 INFO Executor - Finished task 0.0 in stage 253.0 (TID 309). 989 bytes result sent to driver
14:51:55.494 INFO TaskSetManager - Finished task 0.0 in stage 253.0 (TID 309) in 28 ms on localhost (executor driver) (1/1)
14:51:55.494 INFO TaskSchedulerImpl - Removed TaskSet 253.0, whose tasks have all completed, from pool
14:51:55.495 INFO DAGScheduler - ResultStage 253 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.049 s
14:51:55.495 INFO DAGScheduler - Job 191 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:55.495 INFO TaskSchedulerImpl - Killing all running tasks in stage 253: Stage finished
14:51:55.495 INFO DAGScheduler - Job 191 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.049291 s
14:51:55.498 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:55.499 INFO DAGScheduler - Got job 192 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:55.499 INFO DAGScheduler - Final stage: ResultStage 254 (count at ReadsSparkSinkUnitTest.java:185)
14:51:55.499 INFO DAGScheduler - Parents of final stage: List()
14:51:55.499 INFO DAGScheduler - Missing parents: List()
14:51:55.499 INFO DAGScheduler - Submitting ResultStage 254 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:55.510 INFO MemoryStore - Block broadcast_514 stored as values in memory (estimated size 286.8 KiB, free 1918.1 MiB)
14:51:55.511 INFO MemoryStore - Block broadcast_514_piece0 stored as bytes in memory (estimated size 103.6 KiB, free 1918.0 MiB)
14:51:55.512 INFO BlockManagerInfo - Added broadcast_514_piece0 in memory on localhost:44923 (size: 103.6 KiB, free: 1919.5 MiB)
14:51:55.512 INFO SparkContext - Created broadcast 514 from broadcast at DAGScheduler.scala:1580
14:51:55.512 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 254 (MapPartitionsRDD[1214] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:55.512 INFO TaskSchedulerImpl - Adding task set 254.0 with 1 tasks resource profile 0
14:51:55.513 INFO TaskSetManager - Starting task 0.0 in stage 254.0 (TID 310) (localhost, executor driver, partition 0, ANY, 7853 bytes)
14:51:55.513 INFO Executor - Running task 0.0 in stage 254.0 (TID 310)
14:51:55.534 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram:0+43713
14:51:55.535 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.536 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram dst=null perm=null proto=rpc
14:51:55.536 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.cram.crai dst=null perm=null proto=rpc
14:51:55.537 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest5_e0e1c377-3edc-462e-961b-1533ab78e3b7.crai dst=null perm=null proto=rpc
14:51:55.538 WARN DFSUtil - Unexpected value for data transfer bytes=1024 duration=0
14:51:55.539 WARN DFSUtil - Unexpected value for data transfer bytes=42995 duration=0
14:51:55.539 WARN DFSUtil - Unexpected value for data transfer bytes=42 duration=0
14:51:55.548 INFO Executor - Finished task 0.0 in stage 254.0 (TID 310). 989 bytes result sent to driver
14:51:55.549 INFO TaskSetManager - Finished task 0.0 in stage 254.0 (TID 310) in 37 ms on localhost (executor driver) (1/1)
14:51:55.549 INFO TaskSchedulerImpl - Removed TaskSet 254.0, whose tasks have all completed, from pool
14:51:55.549 INFO DAGScheduler - ResultStage 254 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.050 s
14:51:55.549 INFO DAGScheduler - Job 192 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:55.549 INFO TaskSchedulerImpl - Killing all running tasks in stage 254: Stage finished
14:51:55.549 INFO DAGScheduler - Job 192 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.050577 s
14:51:55.559 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:55.560 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:55.560 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:55.561 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:55.563 INFO MemoryStore - Block broadcast_515 stored as values in memory (estimated size 297.9 KiB, free 1917.7 MiB)
14:51:55.570 INFO MemoryStore - Block broadcast_515_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.6 MiB)
14:51:55.570 INFO BlockManagerInfo - Added broadcast_515_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.5 MiB)
14:51:55.570 INFO SparkContext - Created broadcast 515 from newAPIHadoopFile at PathSplitSource.java:96
14:51:55.594 INFO MemoryStore - Block broadcast_516 stored as values in memory (estimated size 297.9 KiB, free 1917.3 MiB)
14:51:55.600 INFO MemoryStore - Block broadcast_516_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.3 MiB)
14:51:55.600 INFO BlockManagerInfo - Added broadcast_516_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.4 MiB)
14:51:55.600 INFO SparkContext - Created broadcast 516 from newAPIHadoopFile at PathSplitSource.java:96
14:51:55.623 INFO FileInputFormat - Total input files to process : 1
14:51:55.625 INFO MemoryStore - Block broadcast_517 stored as values in memory (estimated size 160.7 KiB, free 1917.1 MiB)
14:51:55.625 INFO MemoryStore - Block broadcast_517_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.1 MiB)
14:51:55.625 INFO BlockManagerInfo - Added broadcast_517_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.4 MiB)
14:51:55.626 INFO SparkContext - Created broadcast 517 from broadcast at ReadsSparkSink.java:133
14:51:55.630 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts dst=null perm=null proto=rpc
14:51:55.630 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
14:51:55.630 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:55.630 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:55.631 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=mkdirs src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0 dst=null perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:55.642 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:55.642 INFO DAGScheduler - Registering RDD 1228 (mapToPair at SparkUtils.java:161) as input to shuffle 51
14:51:55.642 INFO DAGScheduler - Got job 193 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:55.642 INFO DAGScheduler - Final stage: ResultStage 256 (runJob at SparkHadoopWriter.scala:83)
14:51:55.642 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 255)
14:51:55.642 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 255)
14:51:55.642 INFO DAGScheduler - Submitting ShuffleMapStage 255 (MapPartitionsRDD[1228] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:55.660 INFO MemoryStore - Block broadcast_518 stored as values in memory (estimated size 520.4 KiB, free 1916.6 MiB)
14:51:55.662 INFO MemoryStore - Block broadcast_518_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.4 MiB)
14:51:55.662 INFO BlockManagerInfo - Added broadcast_518_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.3 MiB)
14:51:55.662 INFO SparkContext - Created broadcast 518 from broadcast at DAGScheduler.scala:1580
14:51:55.662 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 255 (MapPartitionsRDD[1228] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:55.662 INFO TaskSchedulerImpl - Adding task set 255.0 with 1 tasks resource profile 0
14:51:55.663 INFO TaskSetManager - Starting task 0.0 in stage 255.0 (TID 311) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:55.663 INFO Executor - Running task 0.0 in stage 255.0 (TID 311)
14:51:55.697 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:55.713 INFO Executor - Finished task 0.0 in stage 255.0 (TID 311). 1148 bytes result sent to driver
14:51:55.713 INFO TaskSetManager - Finished task 0.0 in stage 255.0 (TID 311) in 50 ms on localhost (executor driver) (1/1)
14:51:55.713 INFO TaskSchedulerImpl - Removed TaskSet 255.0, whose tasks have all completed, from pool
14:51:55.713 INFO DAGScheduler - ShuffleMapStage 255 (mapToPair at SparkUtils.java:161) finished in 0.070 s
14:51:55.713 INFO DAGScheduler - looking for newly runnable stages
14:51:55.713 INFO DAGScheduler - running: HashSet()
14:51:55.713 INFO DAGScheduler - waiting: HashSet(ResultStage 256)
14:51:55.713 INFO DAGScheduler - failed: HashSet()
14:51:55.713 INFO DAGScheduler - Submitting ResultStage 256 (MapPartitionsRDD[1234] at saveAsTextFile at SamSink.java:65), which has no missing parents
14:51:55.720 INFO MemoryStore - Block broadcast_519 stored as values in memory (estimated size 241.1 KiB, free 1916.2 MiB)
14:51:55.721 INFO MemoryStore - Block broadcast_519_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.1 MiB)
14:51:55.721 INFO BlockManagerInfo - Added broadcast_519_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.2 MiB)
14:51:55.721 INFO SparkContext - Created broadcast 519 from broadcast at DAGScheduler.scala:1580
14:51:55.722 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 256 (MapPartitionsRDD[1234] at saveAsTextFile at SamSink.java:65) (first 15 tasks are for partitions Vector(0))
14:51:55.722 INFO TaskSchedulerImpl - Adding task set 256.0 with 1 tasks resource profile 0
14:51:55.722 INFO TaskSetManager - Starting task 0.0 in stage 256.0 (TID 312) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:55.723 INFO Executor - Running task 0.0 in stage 256.0 (TID 312)
14:51:55.727 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:55.727 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:55.739 INFO HadoopMapRedCommitProtocol - Using output committer class org.apache.hadoop.mapred.FileOutputCommitter
14:51:55.739 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:55.739 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:55.740 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/_temporary/attempt_202603041451551295878687725791912_1234_m_000000_0/part-00000 dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:55.742 INFO StateChange - BLOCK* allocate blk_1073741915_1091, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/_temporary/attempt_202603041451551295878687725791912_1234_m_000000_0/part-00000
14:51:55.743 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741915_1091 src: /127.0.0.1:47066 dest: /127.0.0.1:34059
14:51:55.748 INFO clienttrace - src: /127.0.0.1:47066, dest: /127.0.0.1:34059, bytes: 761729, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741915_1091, duration(ns): 4025917
14:51:55.748 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741915_1091, type=LAST_IN_PIPELINE terminating
14:51:55.749 INFO FSNamesystem - BLOCK* blk_1073741915_1091 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/_temporary/attempt_202603041451551295878687725791912_1234_m_000000_0/part-00000
14:51:56.149 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/_temporary/attempt_202603041451551295878687725791912_1234_m_000000_0/part-00000 is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:56.150 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/_temporary/attempt_202603041451551295878687725791912_1234_m_000000_0 dst=null perm=null proto=rpc
14:51:56.151 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/_temporary/attempt_202603041451551295878687725791912_1234_m_000000_0 dst=null perm=null proto=rpc
14:51:56.151 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/task_202603041451551295878687725791912_1234_m_000000 dst=null perm=null proto=rpc
14:51:56.152 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/_temporary/attempt_202603041451551295878687725791912_1234_m_000000_0 dst=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/task_202603041451551295878687725791912_1234_m_000000 perm=runner:supergroup:rwxr-xr-x proto=rpc
14:51:56.152 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451551295878687725791912_1234_m_000000_0' to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/task_202603041451551295878687725791912_1234_m_000000
14:51:56.152 INFO SparkHadoopMapRedUtil - attempt_202603041451551295878687725791912_1234_m_000000_0: Committed. Elapsed time: 1 ms.
14:51:56.152 INFO Executor - Finished task 0.0 in stage 256.0 (TID 312). 1858 bytes result sent to driver
14:51:56.153 INFO TaskSetManager - Finished task 0.0 in stage 256.0 (TID 312) in 431 ms on localhost (executor driver) (1/1)
14:51:56.153 INFO TaskSchedulerImpl - Removed TaskSet 256.0, whose tasks have all completed, from pool
14:51:56.153 INFO DAGScheduler - ResultStage 256 (runJob at SparkHadoopWriter.scala:83) finished in 0.439 s
14:51:56.153 INFO DAGScheduler - Job 193 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:56.153 INFO TaskSchedulerImpl - Killing all running tasks in stage 256: Stage finished
14:51:56.153 INFO DAGScheduler - Job 193 finished: runJob at SparkHadoopWriter.scala:83, took 0.511660 s
14:51:56.154 INFO SparkHadoopWriter - Start to commit write Job job_202603041451551295878687725791912_1234.
14:51:56.154 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0 dst=null perm=null proto=rpc
14:51:56.155 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts dst=null perm=null proto=rpc
14:51:56.155 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/task_202603041451551295878687725791912_1234_m_000000 dst=null perm=null proto=rpc
14:51:56.156 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/part-00000 dst=null perm=null proto=rpc
14:51:56.156 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary/0/task_202603041451551295878687725791912_1234_m_000000/part-00000 dst=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/part-00000 perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:56.157 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_temporary dst=null perm=null proto=rpc
14:51:56.157 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_SUCCESS dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:56.158 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/_SUCCESS is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:56.159 INFO audit - allowed=false ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/.spark-staging-1234 dst=null perm=null proto=rpc
14:51:56.159 INFO SparkHadoopWriter - Write Job job_202603041451551295878687725791912_1234 committed. Elapsed time: 5 ms.
14:51:56.160 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/header dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:56.161 INFO StateChange - BLOCK* allocate blk_1073741916_1092, replicas=127.0.0.1:34059 for /user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/header
14:51:56.162 INFO DataNode - Receiving BP-1768883704-10.1.0.125-1772635868711:blk_1073741916_1092 src: /127.0.0.1:47070 dest: /127.0.0.1:34059
14:51:56.163 INFO clienttrace - src: /127.0.0.1:47070, dest: /127.0.0.1:34059, bytes: 85829, op: HDFS_WRITE, cliID: DFSClient_NONMAPREDUCE_-951833550_1, offset: 0, srvID: 3c05d188-bc2c-46c8-958b-79f7184a02a2, blockid: BP-1768883704-10.1.0.125-1772635868711:blk_1073741916_1092, duration(ns): 606113
14:51:56.163 INFO DataNode - PacketResponder: BP-1768883704-10.1.0.125-1772635868711:blk_1073741916_1092, type=LAST_IN_PIPELINE terminating
14:51:56.164 INFO FSNamesystem - BLOCK* blk_1073741916_1092 is COMMITTED but not COMPLETE(numNodes= 0 < minimum = 1) in file /user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/header
14:51:56.565 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/header is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:56.566 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=listStatus src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts dst=null perm=null proto=rpc
14:51:56.567 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=create src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/output dst=null perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:56.567 INFO StateChange - DIR* completeFile: /user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/output is closed by DFSClient_NONMAPREDUCE_-951833550_1
14:51:56.567 INFO HadoopFileSystemWrapper - Concatenating 2 parts to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam
14:51:56.568 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=concat src=[/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/header, /user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/part-00000] dst=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/output perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:56.568 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.569 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.569 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=rename src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts/output dst=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam perm=runner:supergroup:rw-r--r-- proto=rpc
14:51:56.569 INFO HadoopFileSystemWrapper - Concatenating to hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam done
14:51:56.570 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=delete src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam.parts dst=null perm=null proto=rpc
14:51:56.570 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.570 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.571 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.571 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
WARNING 2026-03-04 14:51:56 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
14:51:56.573 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
14:51:56.574 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.575 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.575 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
WARNING 2026-03-04 14:51:56 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
14:51:56.576 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
14:51:56.577 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
14:51:56.578 INFO MemoryStore - Block broadcast_520 stored as values in memory (estimated size 160.7 KiB, free 1916.0 MiB)
14:51:56.579 INFO MemoryStore - Block broadcast_520_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1916.0 MiB)
14:51:56.579 INFO BlockManagerInfo - Added broadcast_520_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.2 MiB)
14:51:56.579 INFO SparkContext - Created broadcast 520 from broadcast at SamSource.java:78
14:51:56.580 INFO MemoryStore - Block broadcast_521 stored as values in memory (estimated size 297.9 KiB, free 1915.7 MiB)
14:51:56.586 INFO BlockManagerInfo - Removed broadcast_503_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.2 MiB)
14:51:56.587 INFO BlockManagerInfo - Removed broadcast_506_piece0 on localhost:44923 in memory (size: 1473.0 B, free: 1919.2 MiB)
14:51:56.587 INFO BlockManagerInfo - Removed broadcast_507_piece0 on localhost:44923 in memory (size: 1473.0 B, free: 1919.2 MiB)
14:51:56.588 INFO BlockManagerInfo - Removed broadcast_516_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.3 MiB)
14:51:56.588 INFO BlockManagerInfo - Removed broadcast_512_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.4 MiB)
14:51:56.588 INFO BlockManagerInfo - Removed broadcast_518_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.6 MiB)
14:51:56.589 INFO BlockManagerInfo - Removed broadcast_513_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.7 MiB)
14:51:56.589 INFO BlockManagerInfo - Removed broadcast_502_piece0 on localhost:44923 in memory (size: 228.0 B, free: 1919.7 MiB)
14:51:56.590 INFO BlockManagerInfo - Removed broadcast_511_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:56.590 INFO BlockManagerInfo - Removed broadcast_514_piece0 on localhost:44923 in memory (size: 103.6 KiB, free: 1919.8 MiB)
14:51:56.591 INFO BlockManagerInfo - Removed broadcast_517_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.8 MiB)
14:51:56.591 INFO BlockManagerInfo - Removed broadcast_519_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.9 MiB)
14:51:56.592 INFO BlockManagerInfo - Removed broadcast_509_piece0 on localhost:44923 in memory (size: 58.1 KiB, free: 1919.9 MiB)
14:51:56.593 INFO BlockManagerInfo - Removed broadcast_510_piece0 on localhost:44923 in memory (size: 187.0 B, free: 1919.9 MiB)
14:51:56.593 INFO MemoryStore - Block broadcast_521_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1919.2 MiB)
14:51:56.593 INFO BlockManagerInfo - Added broadcast_521_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.9 MiB)
14:51:56.594 INFO SparkContext - Created broadcast 521 from newAPIHadoopFile at SamSource.java:108
14:51:56.596 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=getfileinfo src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.597 INFO FileInputFormat - Total input files to process : 1
14:51:56.597 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.602 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:56.602 INFO DAGScheduler - Got job 194 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:56.602 INFO DAGScheduler - Final stage: ResultStage 257 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:56.602 INFO DAGScheduler - Parents of final stage: List()
14:51:56.602 INFO DAGScheduler - Missing parents: List()
14:51:56.602 INFO DAGScheduler - Submitting ResultStage 257 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:56.603 INFO MemoryStore - Block broadcast_522 stored as values in memory (estimated size 7.5 KiB, free 1919.1 MiB)
14:51:56.603 INFO MemoryStore - Block broadcast_522_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1919.1 MiB)
14:51:56.603 INFO BlockManagerInfo - Added broadcast_522_piece0 in memory on localhost:44923 (size: 3.8 KiB, free: 1919.9 MiB)
14:51:56.603 INFO SparkContext - Created broadcast 522 from broadcast at DAGScheduler.scala:1580
14:51:56.603 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 257 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:56.604 INFO TaskSchedulerImpl - Adding task set 257.0 with 1 tasks resource profile 0
14:51:56.604 INFO TaskSetManager - Starting task 0.0 in stage 257.0 (TID 313) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:56.604 INFO Executor - Running task 0.0 in stage 257.0 (TID 313)
14:51:56.605 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam:0+847558
14:51:56.607 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.618 INFO Executor - Finished task 0.0 in stage 257.0 (TID 313). 651483 bytes result sent to driver
14:51:56.620 INFO TaskSetManager - Finished task 0.0 in stage 257.0 (TID 313) in 16 ms on localhost (executor driver) (1/1)
14:51:56.620 INFO TaskSchedulerImpl - Removed TaskSet 257.0, whose tasks have all completed, from pool
14:51:56.620 INFO DAGScheduler - ResultStage 257 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.018 s
14:51:56.620 INFO DAGScheduler - Job 194 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:56.620 INFO TaskSchedulerImpl - Killing all running tasks in stage 257: Stage finished
14:51:56.620 INFO DAGScheduler - Job 194 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.018547 s
14:51:56.630 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:56.630 INFO DAGScheduler - Got job 195 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:56.630 INFO DAGScheduler - Final stage: ResultStage 258 (count at ReadsSparkSinkUnitTest.java:185)
14:51:56.630 INFO DAGScheduler - Parents of final stage: List()
14:51:56.630 INFO DAGScheduler - Missing parents: List()
14:51:56.631 INFO DAGScheduler - Submitting ResultStage 258 (MapPartitionsRDD[1221] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:56.648 INFO MemoryStore - Block broadcast_523 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
14:51:56.649 INFO MemoryStore - Block broadcast_523_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
14:51:56.649 INFO BlockManagerInfo - Added broadcast_523_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.7 MiB)
14:51:56.650 INFO SparkContext - Created broadcast 523 from broadcast at DAGScheduler.scala:1580
14:51:56.650 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 258 (MapPartitionsRDD[1221] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:56.650 INFO TaskSchedulerImpl - Adding task set 258.0 with 1 tasks resource profile 0
14:51:56.650 INFO TaskSetManager - Starting task 0.0 in stage 258.0 (TID 314) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:56.651 INFO Executor - Running task 0.0 in stage 258.0 (TID 314)
14:51:56.685 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:56.695 INFO Executor - Finished task 0.0 in stage 258.0 (TID 314). 989 bytes result sent to driver
14:51:56.696 INFO TaskSetManager - Finished task 0.0 in stage 258.0 (TID 314) in 46 ms on localhost (executor driver) (1/1)
14:51:56.696 INFO TaskSchedulerImpl - Removed TaskSet 258.0, whose tasks have all completed, from pool
14:51:56.696 INFO DAGScheduler - ResultStage 258 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.065 s
14:51:56.696 INFO DAGScheduler - Job 195 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:56.696 INFO TaskSchedulerImpl - Killing all running tasks in stage 258: Stage finished
14:51:56.696 INFO DAGScheduler - Job 195 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.065961 s
14:51:56.699 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:56.700 INFO DAGScheduler - Got job 196 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:56.700 INFO DAGScheduler - Final stage: ResultStage 259 (count at ReadsSparkSinkUnitTest.java:185)
14:51:56.700 INFO DAGScheduler - Parents of final stage: List()
14:51:56.700 INFO DAGScheduler - Missing parents: List()
14:51:56.700 INFO DAGScheduler - Submitting ResultStage 259 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:56.700 INFO MemoryStore - Block broadcast_524 stored as values in memory (estimated size 7.4 KiB, free 1918.6 MiB)
14:51:56.701 INFO MemoryStore - Block broadcast_524_piece0 stored as bytes in memory (estimated size 3.8 KiB, free 1918.6 MiB)
14:51:56.701 INFO BlockManagerInfo - Added broadcast_524_piece0 in memory on localhost:44923 (size: 3.8 KiB, free: 1919.7 MiB)
14:51:56.701 INFO SparkContext - Created broadcast 524 from broadcast at DAGScheduler.scala:1580
14:51:56.701 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 259 (MapPartitionsRDD[1239] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:56.701 INFO TaskSchedulerImpl - Adding task set 259.0 with 1 tasks resource profile 0
14:51:56.702 INFO TaskSetManager - Starting task 0.0 in stage 259.0 (TID 315) (localhost, executor driver, partition 0, ANY, 7852 bytes)
14:51:56.702 INFO Executor - Running task 0.0 in stage 259.0 (TID 315)
14:51:56.703 INFO NewHadoopRDD - Input split: hdfs://localhost:43595/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam:0+847558
14:51:56.705 INFO audit - allowed=true ugi=runner (auth:SIMPLE) ip=/127.0.0.1 cmd=open src=/user/runner/ReadsSparkSinkUnitTest6_f9cf6b32-8558-4e38-8bd0-e043a275e62a.sam dst=null perm=null proto=rpc
14:51:56.706 WARN DFSUtil - Unexpected value for data transfer bytes=86501 duration=0
14:51:56.711 WARN DFSUtil - Unexpected value for data transfer bytes=767681 duration=0
14:51:56.712 INFO Executor - Finished task 0.0 in stage 259.0 (TID 315). 946 bytes result sent to driver
14:51:56.712 INFO TaskSetManager - Finished task 0.0 in stage 259.0 (TID 315) in 10 ms on localhost (executor driver) (1/1)
14:51:56.712 INFO TaskSchedulerImpl - Removed TaskSet 259.0, whose tasks have all completed, from pool
14:51:56.712 INFO DAGScheduler - ResultStage 259 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.012 s
14:51:56.712 INFO DAGScheduler - Job 196 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:56.712 INFO TaskSchedulerImpl - Killing all running tasks in stage 259: Stage finished
14:51:56.712 INFO DAGScheduler - Job 196 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.013007 s
14:51:56.715 INFO MemoryStore - Block broadcast_525 stored as values in memory (estimated size 297.9 KiB, free 1918.3 MiB)
14:51:56.721 INFO MemoryStore - Block broadcast_525_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1918.2 MiB)
14:51:56.722 INFO BlockManagerInfo - Added broadcast_525_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.7 MiB)
14:51:56.722 INFO SparkContext - Created broadcast 525 from newAPIHadoopFile at PathSplitSource.java:96
14:51:56.744 INFO MemoryStore - Block broadcast_526 stored as values in memory (estimated size 297.9 KiB, free 1917.9 MiB)
14:51:56.750 INFO MemoryStore - Block broadcast_526_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1917.9 MiB)
14:51:56.750 INFO BlockManagerInfo - Added broadcast_526_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.6 MiB)
14:51:56.750 INFO SparkContext - Created broadcast 526 from newAPIHadoopFile at PathSplitSource.java:96
14:51:56.770 INFO FileInputFormat - Total input files to process : 1
14:51:56.772 INFO MemoryStore - Block broadcast_527 stored as values in memory (estimated size 160.7 KiB, free 1917.7 MiB)
14:51:56.773 INFO MemoryStore - Block broadcast_527_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.7 MiB)
14:51:56.773 INFO BlockManagerInfo - Added broadcast_527_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:56.773 INFO SparkContext - Created broadcast 527 from broadcast at ReadsSparkSink.java:133
14:51:56.774 INFO MemoryStore - Block broadcast_528 stored as values in memory (estimated size 163.2 KiB, free 1917.6 MiB)
14:51:56.775 INFO MemoryStore - Block broadcast_528_piece0 stored as bytes in memory (estimated size 9.6 KiB, free 1917.6 MiB)
14:51:56.775 INFO BlockManagerInfo - Added broadcast_528_piece0 in memory on localhost:44923 (size: 9.6 KiB, free: 1919.6 MiB)
14:51:56.775 INFO SparkContext - Created broadcast 528 from broadcast at BamSink.java:76
14:51:56.777 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:56.777 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:56.777 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:56.796 INFO SparkContext - Starting job: runJob at SparkHadoopWriter.scala:83
14:51:56.796 INFO DAGScheduler - Registering RDD 1253 (mapToPair at SparkUtils.java:161) as input to shuffle 52
14:51:56.797 INFO DAGScheduler - Got job 197 (runJob at SparkHadoopWriter.scala:83) with 1 output partitions
14:51:56.797 INFO DAGScheduler - Final stage: ResultStage 261 (runJob at SparkHadoopWriter.scala:83)
14:51:56.797 INFO DAGScheduler - Parents of final stage: List(ShuffleMapStage 260)
14:51:56.797 INFO DAGScheduler - Missing parents: List(ShuffleMapStage 260)
14:51:56.797 INFO DAGScheduler - Submitting ShuffleMapStage 260 (MapPartitionsRDD[1253] at mapToPair at SparkUtils.java:161), which has no missing parents
14:51:56.815 INFO MemoryStore - Block broadcast_529 stored as values in memory (estimated size 520.4 KiB, free 1917.0 MiB)
14:51:56.816 INFO MemoryStore - Block broadcast_529_piece0 stored as bytes in memory (estimated size 166.1 KiB, free 1916.9 MiB)
14:51:56.816 INFO BlockManagerInfo - Added broadcast_529_piece0 in memory on localhost:44923 (size: 166.1 KiB, free: 1919.5 MiB)
14:51:56.817 INFO SparkContext - Created broadcast 529 from broadcast at DAGScheduler.scala:1580
14:51:56.817 INFO DAGScheduler - Submitting 1 missing tasks from ShuffleMapStage 260 (MapPartitionsRDD[1253] at mapToPair at SparkUtils.java:161) (first 15 tasks are for partitions Vector(0))
14:51:56.817 INFO TaskSchedulerImpl - Adding task set 260.0 with 1 tasks resource profile 0
14:51:56.817 INFO TaskSetManager - Starting task 0.0 in stage 260.0 (TID 316) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7861 bytes)
14:51:56.818 INFO Executor - Running task 0.0 in stage 260.0 (TID 316)
14:51:56.850 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:56.866 INFO Executor - Finished task 0.0 in stage 260.0 (TID 316). 1148 bytes result sent to driver
14:51:56.866 INFO TaskSetManager - Finished task 0.0 in stage 260.0 (TID 316) in 49 ms on localhost (executor driver) (1/1)
14:51:56.866 INFO TaskSchedulerImpl - Removed TaskSet 260.0, whose tasks have all completed, from pool
14:51:56.866 INFO DAGScheduler - ShuffleMapStage 260 (mapToPair at SparkUtils.java:161) finished in 0.069 s
14:51:56.867 INFO DAGScheduler - looking for newly runnable stages
14:51:56.867 INFO DAGScheduler - running: HashSet()
14:51:56.867 INFO DAGScheduler - waiting: HashSet(ResultStage 261)
14:51:56.867 INFO DAGScheduler - failed: HashSet()
14:51:56.867 INFO DAGScheduler - Submitting ResultStage 261 (MapPartitionsRDD[1258] at mapToPair at BamSink.java:91), which has no missing parents
14:51:56.874 INFO MemoryStore - Block broadcast_530 stored as values in memory (estimated size 241.4 KiB, free 1916.6 MiB)
14:51:56.875 INFO MemoryStore - Block broadcast_530_piece0 stored as bytes in memory (estimated size 67.0 KiB, free 1916.6 MiB)
14:51:56.875 INFO BlockManagerInfo - Added broadcast_530_piece0 in memory on localhost:44923 (size: 67.0 KiB, free: 1919.4 MiB)
14:51:56.875 INFO SparkContext - Created broadcast 530 from broadcast at DAGScheduler.scala:1580
14:51:56.875 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 261 (MapPartitionsRDD[1258] at mapToPair at BamSink.java:91) (first 15 tasks are for partitions Vector(0))
14:51:56.875 INFO TaskSchedulerImpl - Adding task set 261.0 with 1 tasks resource profile 0
14:51:56.876 INFO TaskSetManager - Starting task 0.0 in stage 261.0 (TID 317) (localhost, executor driver, partition 0, NODE_LOCAL, 7513 bytes)
14:51:56.876 INFO Executor - Running task 0.0 in stage 261.0 (TID 317)
14:51:56.881 INFO ShuffleBlockFetcherIterator - Getting 1 (343.8 KiB) non-empty blocks including 1 (343.8 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks
14:51:56.881 INFO ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
14:51:56.895 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:56.895 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:56.895 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:56.895 INFO PathOutputCommitterFactory - No output committer factory defined, defaulting to FileOutputCommitterFactory
14:51:56.895 INFO FileOutputCommitter - File Output Committer Algorithm version is 1
14:51:56.895 INFO FileOutputCommitter - FileOutputCommitter skip cleanup _temporary folders under output directory:false, ignore cleanup failures: false
14:51:56.922 INFO FileOutputCommitter - Saved output of task 'attempt_202603041451565551090897566152388_1258_r_000000_0' to file:/tmp/ReadsSparkSinkUnitTest111232109014166244555.bam.parts/_temporary/0/task_202603041451565551090897566152388_1258_r_000000
14:51:56.922 INFO SparkHadoopMapRedUtil - attempt_202603041451565551090897566152388_1258_r_000000_0: Committed. Elapsed time: 0 ms.
14:51:56.922 INFO Executor - Finished task 0.0 in stage 261.0 (TID 317). 1858 bytes result sent to driver
14:51:56.923 INFO TaskSetManager - Finished task 0.0 in stage 261.0 (TID 317) in 47 ms on localhost (executor driver) (1/1)
14:51:56.923 INFO TaskSchedulerImpl - Removed TaskSet 261.0, whose tasks have all completed, from pool
14:51:56.923 INFO DAGScheduler - ResultStage 261 (runJob at SparkHadoopWriter.scala:83) finished in 0.056 s
14:51:56.923 INFO DAGScheduler - Job 197 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:56.923 INFO TaskSchedulerImpl - Killing all running tasks in stage 261: Stage finished
14:51:56.923 INFO DAGScheduler - Job 197 finished: runJob at SparkHadoopWriter.scala:83, took 0.127125 s
14:51:56.923 INFO SparkHadoopWriter - Start to commit write Job job_202603041451565551090897566152388_1258.
14:51:56.929 INFO SparkHadoopWriter - Write Job job_202603041451565551090897566152388_1258 committed. Elapsed time: 5 ms.
14:51:56.942 INFO HadoopFileSystemWrapper - Concatenating 3 parts to file:////tmp/ReadsSparkSinkUnitTest111232109014166244555.bam
14:51:56.947 INFO HadoopFileSystemWrapper - Concatenating to file:////tmp/ReadsSparkSinkUnitTest111232109014166244555.bam done
14:51:56.947 INFO IndexFileMerger - Merging .sbi files in temp directory file:////tmp/ReadsSparkSinkUnitTest111232109014166244555.bam.parts/ to file:////tmp/ReadsSparkSinkUnitTest111232109014166244555.bam.sbi
14:51:56.952 INFO IndexFileMerger - Done merging .sbi files
14:51:56.952 INFO IndexFileMerger - Merging .bai files in temp directory file:////tmp/ReadsSparkSinkUnitTest111232109014166244555.bam.parts/ to file:////tmp/ReadsSparkSinkUnitTest111232109014166244555.bam.bai
14:51:56.957 INFO IndexFileMerger - Done merging .bai files
14:51:56.959 INFO MemoryStore - Block broadcast_531 stored as values in memory (estimated size 320.0 B, free 1916.6 MiB)
14:51:56.960 INFO MemoryStore - Block broadcast_531_piece0 stored as bytes in memory (estimated size 233.0 B, free 1916.6 MiB)
14:51:56.960 INFO BlockManagerInfo - Added broadcast_531_piece0 in memory on localhost:44923 (size: 233.0 B, free: 1919.4 MiB)
14:51:56.960 INFO SparkContext - Created broadcast 531 from broadcast at BamSource.java:104
14:51:56.961 INFO MemoryStore - Block broadcast_532 stored as values in memory (estimated size 297.9 KiB, free 1916.3 MiB)
14:51:56.967 INFO MemoryStore - Block broadcast_532_piece0 stored as bytes in memory (estimated size 50.2 KiB, free 1916.2 MiB)
14:51:56.967 INFO BlockManagerInfo - Added broadcast_532_piece0 in memory on localhost:44923 (size: 50.2 KiB, free: 1919.3 MiB)
14:51:56.968 INFO SparkContext - Created broadcast 532 from newAPIHadoopFile at PathSplitSource.java:96
14:51:56.977 INFO FileInputFormat - Total input files to process : 1
14:51:56.991 INFO SparkContext - Starting job: collect at ReadsSparkSinkUnitTest.java:182
14:51:56.992 INFO DAGScheduler - Got job 198 (collect at ReadsSparkSinkUnitTest.java:182) with 1 output partitions
14:51:56.992 INFO DAGScheduler - Final stage: ResultStage 262 (collect at ReadsSparkSinkUnitTest.java:182)
14:51:56.992 INFO DAGScheduler - Parents of final stage: List()
14:51:56.992 INFO DAGScheduler - Missing parents: List()
14:51:56.992 INFO DAGScheduler - Submitting ResultStage 262 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:56.998 INFO MemoryStore - Block broadcast_533 stored as values in memory (estimated size 148.2 KiB, free 1916.1 MiB)
14:51:57.003 INFO BlockManagerInfo - Removed broadcast_521_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.4 MiB)
14:51:57.003 INFO MemoryStore - Block broadcast_533_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1916.4 MiB)
14:51:57.003 INFO BlockManagerInfo - Added broadcast_533_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.3 MiB)
14:51:57.003 INFO SparkContext - Created broadcast 533 from broadcast at DAGScheduler.scala:1580
14:51:57.003 INFO BlockManagerInfo - Removed broadcast_529_piece0 on localhost:44923 in memory (size: 166.1 KiB, free: 1919.5 MiB)
14:51:57.003 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 262 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:57.003 INFO TaskSchedulerImpl - Adding task set 262.0 with 1 tasks resource profile 0
14:51:57.004 INFO BlockManagerInfo - Removed broadcast_528_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.5 MiB)
14:51:57.004 INFO BlockManagerInfo - Removed broadcast_524_piece0 on localhost:44923 in memory (size: 3.8 KiB, free: 1919.5 MiB)
14:51:57.004 INFO TaskSetManager - Starting task 0.0 in stage 262.0 (TID 318) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:57.004 INFO Executor - Running task 0.0 in stage 262.0 (TID 318)
14:51:57.005 INFO BlockManagerInfo - Removed broadcast_526_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.6 MiB)
14:51:57.005 INFO BlockManagerInfo - Removed broadcast_530_piece0 on localhost:44923 in memory (size: 67.0 KiB, free: 1919.6 MiB)
14:51:57.005 INFO BlockManagerInfo - Removed broadcast_527_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.6 MiB)
14:51:57.006 INFO BlockManagerInfo - Removed broadcast_522_piece0 on localhost:44923 in memory (size: 3.8 KiB, free: 1919.6 MiB)
14:51:57.006 INFO BlockManagerInfo - Removed broadcast_515_piece0 on localhost:44923 in memory (size: 50.2 KiB, free: 1919.7 MiB)
14:51:57.006 INFO BlockManagerInfo - Removed broadcast_520_piece0 on localhost:44923 in memory (size: 9.6 KiB, free: 1919.7 MiB)
14:51:57.007 INFO BlockManagerInfo - Removed broadcast_523_piece0 on localhost:44923 in memory (size: 153.6 KiB, free: 1919.8 MiB)
14:51:57.022 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest111232109014166244555.bam:0+237038
14:51:57.027 INFO Executor - Finished task 0.0 in stage 262.0 (TID 318). 651483 bytes result sent to driver
14:51:57.029 INFO TaskSetManager - Finished task 0.0 in stage 262.0 (TID 318) in 24 ms on localhost (executor driver) (1/1)
14:51:57.029 INFO TaskSchedulerImpl - Removed TaskSet 262.0, whose tasks have all completed, from pool
14:51:57.029 INFO DAGScheduler - ResultStage 262 (collect at ReadsSparkSinkUnitTest.java:182) finished in 0.037 s
14:51:57.029 INFO DAGScheduler - Job 198 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:57.029 INFO TaskSchedulerImpl - Killing all running tasks in stage 262: Stage finished
14:51:57.029 INFO DAGScheduler - Job 198 finished: collect at ReadsSparkSinkUnitTest.java:182, took 0.037396 s
14:51:57.038 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:57.039 INFO DAGScheduler - Got job 199 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:57.039 INFO DAGScheduler - Final stage: ResultStage 263 (count at ReadsSparkSinkUnitTest.java:185)
14:51:57.039 INFO DAGScheduler - Parents of final stage: List()
14:51:57.039 INFO DAGScheduler - Missing parents: List()
14:51:57.039 INFO DAGScheduler - Submitting ResultStage 263 (MapPartitionsRDD[1246] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:57.056 INFO MemoryStore - Block broadcast_534 stored as values in memory (estimated size 426.1 KiB, free 1918.7 MiB)
14:51:57.057 INFO MemoryStore - Block broadcast_534_piece0 stored as bytes in memory (estimated size 153.6 KiB, free 1918.6 MiB)
14:51:57.057 INFO BlockManagerInfo - Added broadcast_534_piece0 in memory on localhost:44923 (size: 153.6 KiB, free: 1919.7 MiB)
14:51:57.058 INFO SparkContext - Created broadcast 534 from broadcast at DAGScheduler.scala:1580
14:51:57.058 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 263 (MapPartitionsRDD[1246] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:57.058 INFO TaskSchedulerImpl - Adding task set 263.0 with 1 tasks resource profile 0
14:51:57.058 INFO TaskSetManager - Starting task 0.0 in stage 263.0 (TID 319) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7872 bytes)
14:51:57.059 INFO Executor - Running task 0.0 in stage 263.0 (TID 319)
14:51:57.091 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/tools/BQSR/HiSeq.1mb.1RG.2k_lines.bam:0+222075
14:51:57.100 INFO Executor - Finished task 0.0 in stage 263.0 (TID 319). 989 bytes result sent to driver
14:51:57.101 INFO TaskSetManager - Finished task 0.0 in stage 263.0 (TID 319) in 43 ms on localhost (executor driver) (1/1)
14:51:57.101 INFO TaskSchedulerImpl - Removed TaskSet 263.0, whose tasks have all completed, from pool
14:51:57.101 INFO DAGScheduler - ResultStage 263 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.062 s
14:51:57.101 INFO DAGScheduler - Job 199 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:57.101 INFO TaskSchedulerImpl - Killing all running tasks in stage 263: Stage finished
14:51:57.101 INFO DAGScheduler - Job 199 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.062721 s
14:51:57.105 INFO SparkContext - Starting job: count at ReadsSparkSinkUnitTest.java:185
14:51:57.105 INFO DAGScheduler - Got job 200 (count at ReadsSparkSinkUnitTest.java:185) with 1 output partitions
14:51:57.105 INFO DAGScheduler - Final stage: ResultStage 264 (count at ReadsSparkSinkUnitTest.java:185)
14:51:57.105 INFO DAGScheduler - Parents of final stage: List()
14:51:57.105 INFO DAGScheduler - Missing parents: List()
14:51:57.105 INFO DAGScheduler - Submitting ResultStage 264 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96), which has no missing parents
14:51:57.111 INFO MemoryStore - Block broadcast_535 stored as values in memory (estimated size 148.1 KiB, free 1918.4 MiB)
14:51:57.112 INFO MemoryStore - Block broadcast_535_piece0 stored as bytes in memory (estimated size 54.5 KiB, free 1918.4 MiB)
14:51:57.112 INFO BlockManagerInfo - Added broadcast_535_piece0 in memory on localhost:44923 (size: 54.5 KiB, free: 1919.6 MiB)
14:51:57.113 INFO SparkContext - Created broadcast 535 from broadcast at DAGScheduler.scala:1580
14:51:57.113 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 264 (MapPartitionsRDD[1264] at filter at ReadsSparkSource.java:96) (first 15 tasks are for partitions Vector(0))
14:51:57.113 INFO TaskSchedulerImpl - Adding task set 264.0 with 1 tasks resource profile 0
14:51:57.113 INFO TaskSetManager - Starting task 0.0 in stage 264.0 (TID 320) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7810 bytes)
14:51:57.114 INFO Executor - Running task 0.0 in stage 264.0 (TID 320)
14:51:57.126 INFO NewHadoopRDD - Input split: file:/tmp/ReadsSparkSinkUnitTest111232109014166244555.bam:0+237038
14:51:57.129 INFO Executor - Finished task 0.0 in stage 264.0 (TID 320). 989 bytes result sent to driver
14:51:57.129 INFO TaskSetManager - Finished task 0.0 in stage 264.0 (TID 320) in 16 ms on localhost (executor driver) (1/1)
14:51:57.130 INFO TaskSchedulerImpl - Removed TaskSet 264.0, whose tasks have all completed, from pool
14:51:57.130 INFO DAGScheduler - ResultStage 264 (count at ReadsSparkSinkUnitTest.java:185) finished in 0.025 s
14:51:57.130 INFO DAGScheduler - Job 200 is finished. Cancelling potential speculative or zombie tasks for this job
14:51:57.130 INFO TaskSchedulerImpl - Killing all running tasks in stage 264: Stage finished
14:51:57.130 INFO DAGScheduler - Job 200 finished: count at ReadsSparkSinkUnitTest.java:185, took 0.025122 s