[June 12, 2025 at 8:46:33 PM UTC] org.broadinstitute.hellbender.tools.spark.pipelines.metrics.CollectMultipleMetricsSpark done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=1738539008
[June 12, 2025 at 8:46:34 PM UTC] org.broadinstitute.hellbender.tools.spark.pipelines.metrics.CollectMultipleMetricsSpark done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=1738539008
20:46:34.034 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
20:46:34.034 INFO CollectMultipleMetricsSpark - org.broadinstitute.hellbender.tools.spark.pipelines.metrics vUnavailable
20:46:34.034 INFO CollectMultipleMetricsSpark - For support and documentation go to https://software.broadinstitute.org/gatk/
20:46:34.034 INFO CollectMultipleMetricsSpark - Executing as runner@pkrvmxyh4eaekms on Linux v6.11.0-1015-azure amd64
20:46:34.034 INFO CollectMultipleMetricsSpark - Java runtime: OpenJDK 64-Bit Server VM v17.0.6+10
20:46:34.034 INFO CollectMultipleMetricsSpark - Start Date/Time: June 12, 2025 at 8:46:34 PM UTC
20:46:34.034 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
20:46:34.034 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
20:46:34.034 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.COMPRESSION_LEVEL : 2
20:46:34.034 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
20:46:34.034 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
20:46:34.034 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
20:46:34.034 INFO CollectMultipleMetricsSpark - Deflater: IntelDeflater
20:46:34.034 INFO CollectMultipleMetricsSpark - Inflater: IntelInflater
20:46:34.034 INFO CollectMultipleMetricsSpark - GCS max retries/reopens: 20
20:46:34.034 INFO CollectMultipleMetricsSpark - Requester pays: disabled
20:46:34.034 WARN CollectMultipleMetricsSpark -
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Warning: CollectMultipleMetricsSpark is a BETA tool and is not yet ready for use in production
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
20:46:34.034 INFO CollectMultipleMetricsSpark - Initializing engine
20:46:34.034 INFO CollectMultipleMetricsSpark - Done initializing engine
20:46:34.035 INFO CollectMultipleMetricsSpark - Spark verbosity set to INFO (see --spark-verbosity argument)
WARNING 2025-06-12 20:46:34 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-06-12 20:46:34 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:46:34.036 INFO MemoryStore - Block broadcast_3390 stored as values in memory (estimated size 37.8 KiB, free 1917.5 MiB)
20:46:34.037 INFO MemoryStore - Block broadcast_3390_piece0 stored as bytes in memory (estimated size 8.4 KiB, free 1917.5 MiB)
20:46:34.037 INFO BlockManagerInfo - Added broadcast_3390_piece0 in memory on localhost:34191 (size: 8.4 KiB, free: 1919.3 MiB)
20:46:34.037 INFO SparkContext - Created broadcast 3390 from broadcast at SamSource.java:78
20:46:34.038 INFO MemoryStore - Block broadcast_3391 stored as values in memory (estimated size 306.3 KiB, free 1917.2 MiB)
20:46:34.044 INFO MemoryStore - Block broadcast_3391_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1917.1 MiB)
20:46:34.044 INFO BlockManagerInfo - Added broadcast_3391_piece0 in memory on localhost:34191 (size: 64.4 KiB, free: 1919.2 MiB)
20:46:34.045 INFO SparkContext - Created broadcast 3391 from newAPIHadoopFile at SamSource.java:108
WARNING 2025-06-12 20:46:34 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-06-12 20:46:34 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
20:46:34.047 INFO MemoryStore - Block broadcast_3392 stored as values in memory (estimated size 37.8 KiB, free 1917.1 MiB)
20:46:34.048 INFO MemoryStore - Block broadcast_3392_piece0 stored as bytes in memory (estimated size 8.4 KiB, free 1917.1 MiB)
20:46:34.048 INFO BlockManagerInfo - Added broadcast_3392_piece0 in memory on localhost:34191 (size: 8.4 KiB, free: 1919.2 MiB)
20:46:34.048 INFO SparkContext - Created broadcast 3392 from broadcast at SamSource.java:78
20:46:34.048 INFO MemoryStore - Block broadcast_3393 stored as values in memory (estimated size 306.3 KiB, free 1916.8 MiB)
20:46:34.055 INFO MemoryStore - Block broadcast_3393_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1916.7 MiB)
20:46:34.055 INFO BlockManagerInfo - Added broadcast_3393_piece0 in memory on localhost:34191 (size: 64.4 KiB, free: 1919.1 MiB)
20:46:34.055 INFO SparkContext - Created broadcast 3393 from newAPIHadoopFile at SamSource.java:108
20:46:34.058 INFO FileInputFormat - Total input files to process : 1
20:46:34.061 INFO SparkContext - Starting job: count at CollectMultipleMetricsSparkIntegrationTest.java:130
20:46:34.061 INFO DAGScheduler - Got job 886 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) with 1 output partitions
20:46:34.061 INFO DAGScheduler - Final stage: ResultStage 2476 (count at CollectMultipleMetricsSparkIntegrationTest.java:130)
20:46:34.061 INFO DAGScheduler - Parents of final stage: List()
20:46:34.061 INFO DAGScheduler - Missing parents: List()
20:46:34.061 INFO DAGScheduler - Submitting ResultStage 2476 (MapPartitionsRDD[10216] at filter at CollectMultipleMetricsSpark.java:193), which has no missing parents
20:46:34.062 INFO MemoryStore - Block broadcast_3394 stored as values in memory (estimated size 34.8 KiB, free 1916.7 MiB)
20:46:34.062 INFO MemoryStore - Block broadcast_3394_piece0 stored as bytes in memory (estimated size 14.6 KiB, free 1916.7 MiB)
20:46:34.062 INFO BlockManagerInfo - Added broadcast_3394_piece0 in memory on localhost:34191 (size: 14.6 KiB, free: 1919.1 MiB)
20:46:34.063 INFO SparkContext - Created broadcast 3394 from broadcast at DAGScheduler.scala:1580
20:46:34.063 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 2476 (MapPartitionsRDD[10216] at filter at CollectMultipleMetricsSpark.java:193) (first 15 tasks are for partitions Vector(0))
20:46:34.063 INFO TaskSchedulerImpl - Adding task set 2476.0 with 1 tasks resource profile 0
20:46:34.063 INFO TaskSetManager - Starting task 0.0 in stage 2476.0 (TID 1907) (localhost, executor driver, partition 0, PROCESS_LOCAL, 9772 bytes)
20:46:34.063 INFO Executor - Running task 0.0 in stage 2476.0 (TID 1907)
20:46:34.064 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/metrics/analysis/CollectInsertSizeMetrics/insert_size_metrics_test.sam:0+44008
20:46:34.066 INFO Executor - Finished task 0.0 in stage 2476.0 (TID 1907). 989 bytes result sent to driver
20:46:34.066 INFO TaskSetManager - Finished task 0.0 in stage 2476.0 (TID 1907) in 3 ms on localhost (executor driver) (1/1)
20:46:34.066 INFO TaskSchedulerImpl - Removed TaskSet 2476.0, whose tasks have all completed, from pool
20:46:34.066 INFO DAGScheduler - ResultStage 2476 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) finished in 0.005 s
20:46:34.066 INFO DAGScheduler - Job 886 is finished. Cancelling potential speculative or zombie tasks for this job
20:46:34.066 INFO TaskSchedulerImpl - Killing all running tasks in stage 2476: Stage finished
20:46:34.066 INFO DAGScheduler - Job 886 finished: count at CollectMultipleMetricsSparkIntegrationTest.java:130, took 0.005176 s
20:46:34.066 INFO CollectMultipleMetricsSpark - Shutting down engine
[June 12, 2025 at 8:46:34 PM UTC] org.broadinstitute.hellbender.tools.spark.pipelines.metrics.CollectMultipleMetricsSpark done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=1738539008
20:46:34.071 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
20:46:34.071 INFO CollectMultipleMetricsSpark - org.broadinstitute.hellbender.tools.spark.pipelines.metrics vUnavailable
20:46:34.071 INFO CollectMultipleMetricsSpark - For support and documentation go to https://software.broadinstitute.org/gatk/
20:46:34.071 INFO CollectMultipleMetricsSpark - Executing as runner@pkrvmxyh4eaekms on Linux v6.11.0-1015-azure amd64
20:46:34.071 INFO CollectMultipleMetricsSpark - Java runtime: OpenJDK 64-Bit Server VM v17.0.6+10
20:46:34.071 INFO CollectMultipleMetricsSpark - Start Date/Time: June 12, 2025 at 8:46:34 PM UTC
20:46:34.071 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
20:46:34.071 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
20:46:34.072 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.COMPRESSION_LEVEL : 2
20:46:34.072 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
20:46:34.072 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
20:46:34.072 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
20:46:34.072 INFO CollectMultipleMetricsSpark - Deflater: IntelDeflater
20:46:34.072 INFO CollectMultipleMetricsSpark - Inflater: IntelInflater
20:46:34.072 INFO CollectMultipleMetricsSpark - GCS max retries/reopens: 20
20:46:34.072 INFO CollectMultipleMetricsSpark - Requester pays: disabled
20:46:34.072 WARN CollectMultipleMetricsSpark -
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Warning: CollectMultipleMetricsSpark is a BETA tool and is not yet ready for use in production
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
20:46:34.072 INFO CollectMultipleMetricsSpark - Initializing engine
20:46:34.072 INFO CollectMultipleMetricsSpark - Done initializing engine
20:46:34.072 INFO CollectMultipleMetricsSpark - Spark verbosity set to INFO (see --spark-verbosity argument)
20:46:34.074 INFO MemoryStore - Block broadcast_3395 stored as values in memory (estimated size 306.3 KiB, free 1916.4 MiB)
20:46:34.081 INFO MemoryStore - Block broadcast_3395_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1916.3 MiB)
20:46:34.081 INFO BlockManagerInfo - Added broadcast_3395_piece0 in memory on localhost:34191 (size: 64.4 KiB, free: 1919.0 MiB)
20:46:34.081 INFO SparkContext - Created broadcast 3395 from newAPIHadoopFile at PathSplitSource.java:96
20:46:34.101 INFO MemoryStore - Block broadcast_3396 stored as values in memory (estimated size 306.3 KiB, free 1916.0 MiB)
20:46:34.108 INFO MemoryStore - Block broadcast_3396_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1916.0 MiB)
20:46:34.108 INFO BlockManagerInfo - Added broadcast_3396_piece0 in memory on localhost:34191 (size: 64.4 KiB, free: 1919.0 MiB)
20:46:34.108 INFO SparkContext - Created broadcast 3396 from newAPIHadoopFile at PathSplitSource.java:96
20:46:34.128 INFO FileInputFormat - Total input files to process : 1
20:46:34.131 INFO SparkContext - Starting job: count at CollectMultipleMetricsSparkIntegrationTest.java:130
20:46:34.131 INFO DAGScheduler - Got job 887 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) with 1 output partitions
20:46:34.131 INFO DAGScheduler - Final stage: ResultStage 2477 (count at CollectMultipleMetricsSparkIntegrationTest.java:130)
20:46:34.131 INFO DAGScheduler - Parents of final stage: List()
20:46:34.131 INFO DAGScheduler - Missing parents: List()
20:46:34.131 INFO DAGScheduler - Submitting ResultStage 2477 (MapPartitionsRDD[10229] at filter at CollectMultipleMetricsSpark.java:193), which has no missing parents
20:46:34.148 INFO MemoryStore - Block broadcast_3397 stored as values in memory (estimated size 478.0 KiB, free 1915.5 MiB)
20:46:34.151 INFO MemoryStore - Block broadcast_3397_piece0 stored as bytes in memory (estimated size 208.0 KiB, free 1915.3 MiB)
20:46:34.151 INFO BlockManagerInfo - Added broadcast_3397_piece0 in memory on localhost:34191 (size: 208.0 KiB, free: 1918.8 MiB)
20:46:34.151 INFO SparkContext - Created broadcast 3397 from broadcast at DAGScheduler.scala:1580
20:46:34.151 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 2477 (MapPartitionsRDD[10229] at filter at CollectMultipleMetricsSpark.java:193) (first 15 tasks are for partitions Vector(0))
20:46:34.151 INFO TaskSchedulerImpl - Adding task set 2477.0 with 1 tasks resource profile 0
20:46:34.152 INFO TaskSetManager - Starting task 0.0 in stage 2477.0 (TID 1908) (localhost, executor driver, partition 0, PROCESS_LOCAL, 9772 bytes)
20:46:34.152 INFO Executor - Running task 0.0 in stage 2477.0 (TID 1908)
20:46:34.182 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/metrics/analysis/CollectInsertSizeMetrics/insert_size_metrics_test.bam:0+8071
20:46:34.186 INFO Executor - Finished task 0.0 in stage 2477.0 (TID 1908). 989 bytes result sent to driver
20:46:34.186 INFO TaskSetManager - Finished task 0.0 in stage 2477.0 (TID 1908) in 34 ms on localhost (executor driver) (1/1)
20:46:34.186 INFO TaskSchedulerImpl - Removed TaskSet 2477.0, whose tasks have all completed, from pool
20:46:34.186 INFO DAGScheduler - ResultStage 2477 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) finished in 0.054 s
20:46:34.186 INFO DAGScheduler - Job 887 is finished. Cancelling potential speculative or zombie tasks for this job
20:46:34.186 INFO TaskSchedulerImpl - Killing all running tasks in stage 2477: Stage finished
20:46:34.186 INFO DAGScheduler - Job 887 finished: count at CollectMultipleMetricsSparkIntegrationTest.java:130, took 0.055239 s
20:46:34.186 INFO CollectMultipleMetricsSpark - Shutting down engine
[June 12, 2025 at 8:46:34 PM UTC] org.broadinstitute.hellbender.tools.spark.pipelines.metrics.CollectMultipleMetricsSpark done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=1738539008
20:46:34.191 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
20:46:34.191 INFO CollectMultipleMetricsSpark - org.broadinstitute.hellbender.tools.spark.pipelines.metrics vUnavailable
20:46:34.191 INFO CollectMultipleMetricsSpark - For support and documentation go to https://software.broadinstitute.org/gatk/
20:46:34.191 INFO CollectMultipleMetricsSpark - Executing as runner@pkrvmxyh4eaekms on Linux v6.11.0-1015-azure amd64
20:46:34.191 INFO CollectMultipleMetricsSpark - Java runtime: OpenJDK 64-Bit Server VM v17.0.6+10
20:46:34.191 INFO CollectMultipleMetricsSpark - Start Date/Time: June 12, 2025 at 8:46:34 PM UTC
20:46:34.191 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
20:46:34.191 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
20:46:34.192 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.COMPRESSION_LEVEL : 2
20:46:34.192 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
20:46:34.192 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
20:46:34.192 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
20:46:34.192 INFO CollectMultipleMetricsSpark - Deflater: IntelDeflater
20:46:34.192 INFO CollectMultipleMetricsSpark - Inflater: IntelInflater
20:46:34.192 INFO CollectMultipleMetricsSpark - GCS max retries/reopens: 20
20:46:34.192 INFO CollectMultipleMetricsSpark - Requester pays: disabled
20:46:34.192 WARN CollectMultipleMetricsSpark -
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Warning: CollectMultipleMetricsSpark is a BETA tool and is not yet ready for use in production
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
20:46:34.192 INFO CollectMultipleMetricsSpark - Initializing engine
20:46:34.192 INFO CollectMultipleMetricsSpark - Done initializing engine
20:46:34.192 INFO CollectMultipleMetricsSpark - Spark verbosity set to INFO (see --spark-verbosity argument)
20:46:34.194 INFO MemoryStore - Block broadcast_3398 stored as values in memory (estimated size 600.0 B, free 1915.3 MiB)
20:46:34.194 INFO MemoryStore - Block broadcast_3398_piece0 stored as bytes in memory (estimated size 211.0 B, free 1915.3 MiB)
20:46:34.194 INFO BlockManagerInfo - Added broadcast_3398_piece0 in memory on localhost:34191 (size: 211.0 B, free: 1918.8 MiB)
20:46:34.195 INFO SparkContext - Created broadcast 3398 from broadcast at CramSource.java:114
20:46:34.195 INFO MemoryStore - Block broadcast_3399 stored as values in memory (estimated size 306.3 KiB, free 1915.0 MiB)
20:46:34.202 INFO MemoryStore - Block broadcast_3399_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1914.9 MiB)
20:46:34.202 INFO BlockManagerInfo - Added broadcast_3399_piece0 in memory on localhost:34191 (size: 64.4 KiB, free: 1918.7 MiB)
20:46:34.202 INFO SparkContext - Created broadcast 3399 from newAPIHadoopFile at PathSplitSource.java:96
20:46:34.217 INFO MemoryStore - Block broadcast_3400 stored as values in memory (estimated size 600.0 B, free 1914.9 MiB)
20:46:34.217 INFO MemoryStore - Block broadcast_3400_piece0 stored as bytes in memory (estimated size 211.0 B, free 1914.9 MiB)
20:46:34.217 INFO BlockManagerInfo - Added broadcast_3400_piece0 in memory on localhost:34191 (size: 211.0 B, free: 1918.7 MiB)
20:46:34.217 INFO SparkContext - Created broadcast 3400 from broadcast at CramSource.java:114
20:46:34.218 INFO MemoryStore - Block broadcast_3401 stored as values in memory (estimated size 306.3 KiB, free 1914.6 MiB)
20:46:34.225 INFO MemoryStore - Block broadcast_3401_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1914.6 MiB)
20:46:34.225 INFO BlockManagerInfo - Added broadcast_3401_piece0 in memory on localhost:34191 (size: 64.4 KiB, free: 1918.7 MiB)
20:46:34.225 INFO SparkContext - Created broadcast 3401 from newAPIHadoopFile at PathSplitSource.java:96
20:46:34.239 INFO FileInputFormat - Total input files to process : 1
20:46:34.242 INFO SparkContext - Starting job: count at CollectMultipleMetricsSparkIntegrationTest.java:130
20:46:34.242 INFO DAGScheduler - Got job 888 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) with 1 output partitions
20:46:34.242 INFO DAGScheduler - Final stage: ResultStage 2478 (count at CollectMultipleMetricsSparkIntegrationTest.java:130)
20:46:34.242 INFO DAGScheduler - Parents of final stage: List()
20:46:34.242 INFO DAGScheduler - Missing parents: List()
20:46:34.243 INFO DAGScheduler - Submitting ResultStage 2478 (MapPartitionsRDD[10240] at filter at CollectMultipleMetricsSpark.java:193), which has no missing parents
20:46:34.258 INFO MemoryStore - Block broadcast_3402 stored as values in memory (estimated size 330.5 KiB, free 1914.2 MiB)
20:46:34.259 INFO MemoryStore - Block broadcast_3402_piece0 stored as bytes in memory (estimated size 143.8 KiB, free 1914.1 MiB)
20:46:34.260 INFO BlockManagerInfo - Added broadcast_3402_piece0 in memory on localhost:34191 (size: 143.8 KiB, free: 1918.5 MiB)
20:46:34.260 INFO SparkContext - Created broadcast 3402 from broadcast at DAGScheduler.scala:1580
20:46:34.260 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 2478 (MapPartitionsRDD[10240] at filter at CollectMultipleMetricsSpark.java:193) (first 15 tasks are for partitions Vector(0))
20:46:34.260 INFO TaskSchedulerImpl - Adding task set 2478.0 with 1 tasks resource profile 0
20:46:34.260 INFO TaskSetManager - Starting task 0.0 in stage 2478.0 (TID 1909) (localhost, executor driver, partition 0, PROCESS_LOCAL, 9773 bytes)
20:46:34.260 INFO Executor - Running task 0.0 in stage 2478.0 (TID 1909)
20:46:34.281 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/metrics/analysis/CollectInsertSizeMetrics/insert_size_metrics_test.cram:0+8617
20:46:34.285 INFO Executor - Finished task 0.0 in stage 2478.0 (TID 1909). 989 bytes result sent to driver
20:46:34.286 INFO TaskSetManager - Finished task 0.0 in stage 2478.0 (TID 1909) in 26 ms on localhost (executor driver) (1/1)
20:46:34.286 INFO TaskSchedulerImpl - Removed TaskSet 2478.0, whose tasks have all completed, from pool
20:46:34.286 INFO DAGScheduler - ResultStage 2478 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) finished in 0.043 s
20:46:34.286 INFO DAGScheduler - Job 888 is finished. Cancelling potential speculative or zombie tasks for this job
20:46:34.286 INFO TaskSchedulerImpl - Killing all running tasks in stage 2478: Stage finished
20:46:34.286 INFO DAGScheduler - Job 888 finished: count at CollectMultipleMetricsSparkIntegrationTest.java:130, took 0.043674 s
20:46:34.286 INFO CollectMultipleMetricsSpark - Shutting down engine
[June 12, 2025 at 8:46:34 PM UTC] org.broadinstitute.hellbender.tools.spark.pipelines.metrics.CollectMultipleMetricsSpark done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=1738539008