[May 19, 2025 at 3:45:38 PM UTC] org.broadinstitute.hellbender.tools.spark.pipelines.metrics.CollectMultipleMetricsSpark done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=2032140288
[May 19, 2025 at 3:45:39 PM UTC] org.broadinstitute.hellbender.tools.spark.pipelines.metrics.CollectMultipleMetricsSpark done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=2032140288
15:45:39.182 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
15:45:39.182 INFO CollectMultipleMetricsSpark - org.broadinstitute.hellbender.tools.spark.pipelines.metrics vUnavailable
15:45:39.182 INFO CollectMultipleMetricsSpark - For support and documentation go to https://software.broadinstitute.org/gatk/
15:45:39.182 INFO CollectMultipleMetricsSpark - Executing as runner@pkrvmf6wy0o8zjz on Linux v6.11.0-1014-azure amd64
15:45:39.182 INFO CollectMultipleMetricsSpark - Java runtime: OpenJDK 64-Bit Server VM v17.0.6+10
15:45:39.182 INFO CollectMultipleMetricsSpark - Start Date/Time: May 19, 2025 at 3:45:39 PM UTC
15:45:39.182 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
15:45:39.182 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
15:45:39.182 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.COMPRESSION_LEVEL : 2
15:45:39.182 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
15:45:39.182 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
15:45:39.182 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
15:45:39.182 INFO CollectMultipleMetricsSpark - Deflater: IntelDeflater
15:45:39.182 INFO CollectMultipleMetricsSpark - Inflater: IntelInflater
15:45:39.182 INFO CollectMultipleMetricsSpark - GCS max retries/reopens: 20
15:45:39.182 INFO CollectMultipleMetricsSpark - Requester pays: disabled
15:45:39.182 WARN CollectMultipleMetricsSpark -
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Warning: CollectMultipleMetricsSpark is a BETA tool and is not yet ready for use in production
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
15:45:39.182 INFO CollectMultipleMetricsSpark - Initializing engine
15:45:39.182 INFO CollectMultipleMetricsSpark - Done initializing engine
15:45:39.182 INFO CollectMultipleMetricsSpark - Spark verbosity set to INFO (see --spark-verbosity argument)
WARNING 2025-05-19 15:45:39 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-05-19 15:45:39 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
15:45:39.184 INFO MemoryStore - Block broadcast_3482 stored as values in memory (estimated size 37.8 KiB, free 1908.3 MiB)
15:45:39.184 INFO MemoryStore - Block broadcast_3482_piece0 stored as bytes in memory (estimated size 8.4 KiB, free 1908.3 MiB)
15:45:39.184 INFO BlockManagerInfo - Added broadcast_3482_piece0 in memory on localhost:44421 (size: 8.4 KiB, free: 1918.0 MiB)
15:45:39.185 INFO SparkContext - Created broadcast 3482 from broadcast at SamSource.java:78
15:45:39.185 INFO MemoryStore - Block broadcast_3483 stored as values in memory (estimated size 306.3 KiB, free 1908.0 MiB)
15:45:39.192 INFO MemoryStore - Block broadcast_3483_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1908.0 MiB)
15:45:39.192 INFO BlockManagerInfo - Added broadcast_3483_piece0 in memory on localhost:44421 (size: 64.4 KiB, free: 1917.9 MiB)
15:45:39.192 INFO SparkContext - Created broadcast 3483 from newAPIHadoopFile at SamSource.java:108
WARNING 2025-05-19 15:45:39 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
WARNING 2025-05-19 15:45:39 SamReaderFactory Unable to detect file format from input URL or stream, assuming SAM format.
15:45:39.194 INFO MemoryStore - Block broadcast_3484 stored as values in memory (estimated size 37.8 KiB, free 1907.9 MiB)
15:45:39.195 INFO MemoryStore - Block broadcast_3484_piece0 stored as bytes in memory (estimated size 8.4 KiB, free 1907.9 MiB)
15:45:39.195 INFO BlockManagerInfo - Added broadcast_3484_piece0 in memory on localhost:44421 (size: 8.4 KiB, free: 1917.9 MiB)
15:45:39.195 INFO SparkContext - Created broadcast 3484 from broadcast at SamSource.java:78
15:45:39.196 INFO MemoryStore - Block broadcast_3485 stored as values in memory (estimated size 306.3 KiB, free 1907.6 MiB)
15:45:39.202 INFO MemoryStore - Block broadcast_3485_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1907.6 MiB)
15:45:39.202 INFO BlockManagerInfo - Added broadcast_3485_piece0 in memory on localhost:44421 (size: 64.4 KiB, free: 1917.8 MiB)
15:45:39.202 INFO SparkContext - Created broadcast 3485 from newAPIHadoopFile at SamSource.java:108
15:45:39.205 INFO FileInputFormat - Total input files to process : 1
15:45:39.208 INFO SparkContext - Starting job: count at CollectMultipleMetricsSparkIntegrationTest.java:130
15:45:39.208 INFO DAGScheduler - Got job 912 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) with 1 output partitions
15:45:39.208 INFO DAGScheduler - Final stage: ResultStage 2505 (count at CollectMultipleMetricsSparkIntegrationTest.java:130)
15:45:39.208 INFO DAGScheduler - Parents of final stage: List()
15:45:39.208 INFO DAGScheduler - Missing parents: List()
15:45:39.208 INFO DAGScheduler - Submitting ResultStage 2505 (MapPartitionsRDD[10461] at filter at CollectMultipleMetricsSpark.java:193), which has no missing parents
15:45:39.209 INFO MemoryStore - Block broadcast_3486 stored as values in memory (estimated size 34.8 KiB, free 1907.5 MiB)
15:45:39.209 INFO MemoryStore - Block broadcast_3486_piece0 stored as bytes in memory (estimated size 14.6 KiB, free 1907.5 MiB)
15:45:39.209 INFO BlockManagerInfo - Added broadcast_3486_piece0 in memory on localhost:44421 (size: 14.6 KiB, free: 1917.8 MiB)
15:45:39.209 INFO SparkContext - Created broadcast 3486 from broadcast at DAGScheduler.scala:1580
15:45:39.210 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 2505 (MapPartitionsRDD[10461] at filter at CollectMultipleMetricsSpark.java:193) (first 15 tasks are for partitions Vector(0))
15:45:39.210 INFO TaskSchedulerImpl - Adding task set 2505.0 with 1 tasks resource profile 0
15:45:39.210 INFO TaskSetManager - Starting task 0.0 in stage 2505.0 (TID 1936) (localhost, executor driver, partition 0, PROCESS_LOCAL, 9772 bytes)
15:45:39.210 INFO Executor - Running task 0.0 in stage 2505.0 (TID 1936)
15:45:39.211 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/metrics/analysis/CollectInsertSizeMetrics/insert_size_metrics_test.sam:0+44008
15:45:39.212 INFO Executor - Finished task 0.0 in stage 2505.0 (TID 1936). 946 bytes result sent to driver
15:45:39.213 INFO TaskSetManager - Finished task 0.0 in stage 2505.0 (TID 1936) in 3 ms on localhost (executor driver) (1/1)
15:45:39.213 INFO TaskSchedulerImpl - Removed TaskSet 2505.0, whose tasks have all completed, from pool
15:45:39.213 INFO DAGScheduler - ResultStage 2505 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) finished in 0.005 s
15:45:39.213 INFO DAGScheduler - Job 912 is finished. Cancelling potential speculative or zombie tasks for this job
15:45:39.213 INFO TaskSchedulerImpl - Killing all running tasks in stage 2505: Stage finished
15:45:39.213 INFO DAGScheduler - Job 912 finished: count at CollectMultipleMetricsSparkIntegrationTest.java:130, took 0.004923 s
15:45:39.213 INFO CollectMultipleMetricsSpark - Shutting down engine
[May 19, 2025 at 3:45:39 PM UTC] org.broadinstitute.hellbender.tools.spark.pipelines.metrics.CollectMultipleMetricsSpark done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=2032140288
15:45:39.217 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
15:45:39.217 INFO CollectMultipleMetricsSpark - org.broadinstitute.hellbender.tools.spark.pipelines.metrics vUnavailable
15:45:39.217 INFO CollectMultipleMetricsSpark - For support and documentation go to https://software.broadinstitute.org/gatk/
15:45:39.217 INFO CollectMultipleMetricsSpark - Executing as runner@pkrvmf6wy0o8zjz on Linux v6.11.0-1014-azure amd64
15:45:39.217 INFO CollectMultipleMetricsSpark - Java runtime: OpenJDK 64-Bit Server VM v17.0.6+10
15:45:39.217 INFO CollectMultipleMetricsSpark - Start Date/Time: May 19, 2025 at 3:45:39 PM UTC
15:45:39.217 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
15:45:39.217 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
15:45:39.218 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.COMPRESSION_LEVEL : 2
15:45:39.218 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
15:45:39.218 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
15:45:39.218 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
15:45:39.218 INFO CollectMultipleMetricsSpark - Deflater: IntelDeflater
15:45:39.218 INFO CollectMultipleMetricsSpark - Inflater: IntelInflater
15:45:39.218 INFO CollectMultipleMetricsSpark - GCS max retries/reopens: 20
15:45:39.218 INFO CollectMultipleMetricsSpark - Requester pays: disabled
15:45:39.218 WARN CollectMultipleMetricsSpark -
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Warning: CollectMultipleMetricsSpark is a BETA tool and is not yet ready for use in production
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
15:45:39.218 INFO CollectMultipleMetricsSpark - Initializing engine
15:45:39.218 INFO CollectMultipleMetricsSpark - Done initializing engine
15:45:39.218 INFO CollectMultipleMetricsSpark - Spark verbosity set to INFO (see --spark-verbosity argument)
15:45:39.219 INFO MemoryStore - Block broadcast_3487 stored as values in memory (estimated size 306.3 KiB, free 1907.2 MiB)
15:45:39.226 INFO MemoryStore - Block broadcast_3487_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1907.2 MiB)
15:45:39.226 INFO BlockManagerInfo - Added broadcast_3487_piece0 in memory on localhost:44421 (size: 64.4 KiB, free: 1917.7 MiB)
15:45:39.226 INFO SparkContext - Created broadcast 3487 from newAPIHadoopFile at PathSplitSource.java:96
15:45:39.246 INFO MemoryStore - Block broadcast_3488 stored as values in memory (estimated size 306.3 KiB, free 1906.9 MiB)
15:45:39.253 INFO MemoryStore - Block broadcast_3488_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1906.8 MiB)
15:45:39.253 INFO BlockManagerInfo - Added broadcast_3488_piece0 in memory on localhost:44421 (size: 64.4 KiB, free: 1917.7 MiB)
15:45:39.253 INFO SparkContext - Created broadcast 3488 from newAPIHadoopFile at PathSplitSource.java:96
15:45:39.273 INFO FileInputFormat - Total input files to process : 1
15:45:39.276 INFO SparkContext - Starting job: count at CollectMultipleMetricsSparkIntegrationTest.java:130
15:45:39.276 INFO DAGScheduler - Got job 913 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) with 1 output partitions
15:45:39.276 INFO DAGScheduler - Final stage: ResultStage 2506 (count at CollectMultipleMetricsSparkIntegrationTest.java:130)
15:45:39.276 INFO DAGScheduler - Parents of final stage: List()
15:45:39.276 INFO DAGScheduler - Missing parents: List()
15:45:39.276 INFO DAGScheduler - Submitting ResultStage 2506 (MapPartitionsRDD[10474] at filter at CollectMultipleMetricsSpark.java:193), which has no missing parents
15:45:39.292 INFO MemoryStore - Block broadcast_3489 stored as values in memory (estimated size 478.0 KiB, free 1906.3 MiB)
15:45:39.295 INFO MemoryStore - Block broadcast_3489_piece0 stored as bytes in memory (estimated size 208.0 KiB, free 1906.1 MiB)
15:45:39.295 INFO BlockManagerInfo - Added broadcast_3489_piece0 in memory on localhost:44421 (size: 208.0 KiB, free: 1917.5 MiB)
15:45:39.295 INFO SparkContext - Created broadcast 3489 from broadcast at DAGScheduler.scala:1580
15:45:39.295 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 2506 (MapPartitionsRDD[10474] at filter at CollectMultipleMetricsSpark.java:193) (first 15 tasks are for partitions Vector(0))
15:45:39.295 INFO TaskSchedulerImpl - Adding task set 2506.0 with 1 tasks resource profile 0
15:45:39.296 INFO TaskSetManager - Starting task 0.0 in stage 2506.0 (TID 1937) (localhost, executor driver, partition 0, PROCESS_LOCAL, 9772 bytes)
15:45:39.296 INFO Executor - Running task 0.0 in stage 2506.0 (TID 1937)
15:45:39.324 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/metrics/analysis/CollectInsertSizeMetrics/insert_size_metrics_test.bam:0+8071
15:45:39.327 INFO Executor - Finished task 0.0 in stage 2506.0 (TID 1937). 989 bytes result sent to driver
15:45:39.327 INFO TaskSetManager - Finished task 0.0 in stage 2506.0 (TID 1937) in 32 ms on localhost (executor driver) (1/1)
15:45:39.327 INFO TaskSchedulerImpl - Removed TaskSet 2506.0, whose tasks have all completed, from pool
15:45:39.327 INFO DAGScheduler - ResultStage 2506 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) finished in 0.051 s
15:45:39.328 INFO DAGScheduler - Job 913 is finished. Cancelling potential speculative or zombie tasks for this job
15:45:39.328 INFO TaskSchedulerImpl - Killing all running tasks in stage 2506: Stage finished
15:45:39.328 INFO DAGScheduler - Job 913 finished: count at CollectMultipleMetricsSparkIntegrationTest.java:130, took 0.051885 s
15:45:39.328 INFO CollectMultipleMetricsSpark - Shutting down engine
[May 19, 2025 at 3:45:39 PM UTC] org.broadinstitute.hellbender.tools.spark.pipelines.metrics.CollectMultipleMetricsSpark done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=2032140288
15:45:39.332 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
15:45:39.332 INFO CollectMultipleMetricsSpark - org.broadinstitute.hellbender.tools.spark.pipelines.metrics vUnavailable
15:45:39.332 INFO CollectMultipleMetricsSpark - For support and documentation go to https://software.broadinstitute.org/gatk/
15:45:39.332 INFO CollectMultipleMetricsSpark - Executing as runner@pkrvmf6wy0o8zjz on Linux v6.11.0-1014-azure amd64
15:45:39.332 INFO CollectMultipleMetricsSpark - Java runtime: OpenJDK 64-Bit Server VM v17.0.6+10
15:45:39.332 INFO CollectMultipleMetricsSpark - Start Date/Time: May 19, 2025 at 3:45:39 PM UTC
15:45:39.332 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
15:45:39.332 INFO CollectMultipleMetricsSpark - ------------------------------------------------------------
15:45:39.332 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.COMPRESSION_LEVEL : 2
15:45:39.332 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
15:45:39.332 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
15:45:39.332 INFO CollectMultipleMetricsSpark - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
15:45:39.332 INFO CollectMultipleMetricsSpark - Deflater: IntelDeflater
15:45:39.332 INFO CollectMultipleMetricsSpark - Inflater: IntelInflater
15:45:39.332 INFO CollectMultipleMetricsSpark - GCS max retries/reopens: 20
15:45:39.332 INFO CollectMultipleMetricsSpark - Requester pays: disabled
15:45:39.332 WARN CollectMultipleMetricsSpark -
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Warning: CollectMultipleMetricsSpark is a BETA tool and is not yet ready for use in production
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
15:45:39.332 INFO CollectMultipleMetricsSpark - Initializing engine
15:45:39.332 INFO CollectMultipleMetricsSpark - Done initializing engine
15:45:39.332 INFO CollectMultipleMetricsSpark - Spark verbosity set to INFO (see --spark-verbosity argument)
15:45:39.334 INFO MemoryStore - Block broadcast_3490 stored as values in memory (estimated size 600.0 B, free 1906.1 MiB)
15:45:39.334 INFO MemoryStore - Block broadcast_3490_piece0 stored as bytes in memory (estimated size 211.0 B, free 1906.1 MiB)
15:45:39.335 INFO BlockManagerInfo - Added broadcast_3490_piece0 in memory on localhost:44421 (size: 211.0 B, free: 1917.5 MiB)
15:45:39.335 INFO SparkContext - Created broadcast 3490 from broadcast at CramSource.java:114
15:45:39.335 INFO MemoryStore - Block broadcast_3491 stored as values in memory (estimated size 306.3 KiB, free 1905.8 MiB)
15:45:39.342 INFO MemoryStore - Block broadcast_3491_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1905.8 MiB)
15:45:39.342 INFO BlockManagerInfo - Added broadcast_3491_piece0 in memory on localhost:44421 (size: 64.4 KiB, free: 1917.4 MiB)
15:45:39.342 INFO SparkContext - Created broadcast 3491 from newAPIHadoopFile at PathSplitSource.java:96
15:45:39.356 INFO MemoryStore - Block broadcast_3492 stored as values in memory (estimated size 600.0 B, free 1905.8 MiB)
15:45:39.357 INFO MemoryStore - Block broadcast_3492_piece0 stored as bytes in memory (estimated size 211.0 B, free 1905.8 MiB)
15:45:39.357 INFO BlockManagerInfo - Added broadcast_3492_piece0 in memory on localhost:44421 (size: 211.0 B, free: 1917.4 MiB)
15:45:39.357 INFO SparkContext - Created broadcast 3492 from broadcast at CramSource.java:114
15:45:39.358 INFO MemoryStore - Block broadcast_3493 stored as values in memory (estimated size 306.3 KiB, free 1905.5 MiB)
15:45:39.364 INFO MemoryStore - Block broadcast_3493_piece0 stored as bytes in memory (estimated size 64.4 KiB, free 1905.4 MiB)
15:45:39.364 INFO BlockManagerInfo - Added broadcast_3493_piece0 in memory on localhost:44421 (size: 64.4 KiB, free: 1917.3 MiB)
15:45:39.364 INFO SparkContext - Created broadcast 3493 from newAPIHadoopFile at PathSplitSource.java:96
15:45:39.378 INFO FileInputFormat - Total input files to process : 1
15:45:39.381 INFO SparkContext - Starting job: count at CollectMultipleMetricsSparkIntegrationTest.java:130
15:45:39.382 INFO DAGScheduler - Got job 914 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) with 1 output partitions
15:45:39.382 INFO DAGScheduler - Final stage: ResultStage 2507 (count at CollectMultipleMetricsSparkIntegrationTest.java:130)
15:45:39.382 INFO DAGScheduler - Parents of final stage: List()
15:45:39.382 INFO DAGScheduler - Missing parents: List()
15:45:39.382 INFO DAGScheduler - Submitting ResultStage 2507 (MapPartitionsRDD[10485] at filter at CollectMultipleMetricsSpark.java:193), which has no missing parents
15:45:39.393 INFO MemoryStore - Block broadcast_3494 stored as values in memory (estimated size 330.5 KiB, free 1905.1 MiB)
15:45:39.395 INFO MemoryStore - Block broadcast_3494_piece0 stored as bytes in memory (estimated size 143.8 KiB, free 1904.9 MiB)
15:45:39.395 INFO BlockManagerInfo - Added broadcast_3494_piece0 in memory on localhost:44421 (size: 143.8 KiB, free: 1917.2 MiB)
15:45:39.395 INFO SparkContext - Created broadcast 3494 from broadcast at DAGScheduler.scala:1580
15:45:39.395 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 2507 (MapPartitionsRDD[10485] at filter at CollectMultipleMetricsSpark.java:193) (first 15 tasks are for partitions Vector(0))
15:45:39.395 INFO TaskSchedulerImpl - Adding task set 2507.0 with 1 tasks resource profile 0
15:45:39.395 INFO TaskSetManager - Starting task 0.0 in stage 2507.0 (TID 1938) (localhost, executor driver, partition 0, PROCESS_LOCAL, 9773 bytes)
15:45:39.396 INFO Executor - Running task 0.0 in stage 2507.0 (TID 1938)
15:45:39.415 INFO NewHadoopRDD - Input split: file:/home/runner/work/gatk/gatk/src/test/resources/org/broadinstitute/hellbender/metrics/analysis/CollectInsertSizeMetrics/insert_size_metrics_test.cram:0+8617
15:45:39.419 INFO Executor - Finished task 0.0 in stage 2507.0 (TID 1938). 989 bytes result sent to driver
15:45:39.419 INFO TaskSetManager - Finished task 0.0 in stage 2507.0 (TID 1938) in 24 ms on localhost (executor driver) (1/1)
15:45:39.419 INFO TaskSchedulerImpl - Removed TaskSet 2507.0, whose tasks have all completed, from pool
15:45:39.419 INFO DAGScheduler - ResultStage 2507 (count at CollectMultipleMetricsSparkIntegrationTest.java:130) finished in 0.037 s
15:45:39.419 INFO DAGScheduler - Job 914 is finished. Cancelling potential speculative or zombie tasks for this job
15:45:39.419 INFO TaskSchedulerImpl - Killing all running tasks in stage 2507: Stage finished
15:45:39.419 INFO DAGScheduler - Job 914 finished: count at CollectMultipleMetricsSparkIntegrationTest.java:130, took 0.037959 s
15:45:39.419 INFO CollectMultipleMetricsSpark - Shutting down engine
[May 19, 2025 at 3:45:39 PM UTC] org.broadinstitute.hellbender.tools.spark.pipelines.metrics.CollectMultipleMetricsSpark done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=2032140288