WARNING 2025-07-15 20:47:58 AsciiLineReader Creating an indexable source for an AsciiFeatureCodec using a stream that is neither a PositionalBufferedStream nor a BlockCompressedInputStream
20:47:58.076 INFO MemoryStore - Block broadcast_102 stored as values in memory (estimated size 81.1 KiB, free 1919.9 MiB)
20:47:58.077 INFO MemoryStore - Block broadcast_103 stored as values in memory (estimated size 81.1 KiB, free 1919.8 MiB)
20:47:58.080 INFO MemoryStore - Block broadcast_102_piece0 stored as bytes in memory (estimated size 7.2 KiB, free 1919.8 MiB)
20:47:58.080 INFO BlockManagerInfo - Added broadcast_102_piece0 in memory on localhost:39529 (size: 7.2 KiB, free: 1920.0 MiB)
20:47:58.081 INFO SparkContext - Created broadcast 102 from broadcast at VcfSource.java:129
20:47:58.083 INFO MemoryStore - Block broadcast_104 stored as values in memory (estimated size 298.2 KiB, free 1919.5 MiB)
20:47:58.080 INFO MemoryStore - Block broadcast_103_piece0 stored as bytes in memory (estimated size 7.1 KiB, free 1919.8 MiB)
20:47:58.083 INFO BlockManagerInfo - Added broadcast_103_piece0 in memory on localhost:39529 (size: 7.1 KiB, free: 1920.0 MiB)
20:47:58.084 INFO SparkContext - Created broadcast 103 from broadcast at VcfSource.java:129
20:47:58.085 INFO MemoryStore - Block broadcast_105 stored as values in memory (estimated size 298.2 KiB, free 1919.2 MiB)
20:47:58.098 INFO MemoryStore - Block broadcast_104_piece0 stored as bytes in memory (estimated size 50.4 KiB, free 1919.2 MiB)
20:47:58.098 INFO BlockManagerInfo - Added broadcast_104_piece0 in memory on localhost:39529 (size: 50.4 KiB, free: 1919.9 MiB)
20:47:58.099 INFO SparkContext - Created broadcast 104 from newAPIHadoopFile at VcfSource.java:168
20:47:58.099 INFO MemoryStore - Block broadcast_105_piece0 stored as bytes in memory (estimated size 50.4 KiB, free 1919.1 MiB)
20:47:58.100 INFO BlockManagerInfo - Added broadcast_105_piece0 in memory on localhost:39529 (size: 50.4 KiB, free: 1919.9 MiB)
20:47:58.100 INFO SparkContext - Created broadcast 105 from newAPIHadoopFile at VcfSource.java:168
20:47:58.108 INFO FileInputFormat - Total input files to process : 1
20:47:58.109 INFO FileInputFormat - Total input files to process : 1
20:47:58.115 INFO FeatureManager - Using codec VCFCodec to read file file:///gatkCloneMountPoint/src/test/resources/Homo_sapiens_assembly19.dbsnp135.chr1_1M.exome_intervals.vcf
20:47:58.128 INFO FeatureManager - Using codec VCFCodec to read file file:///gatkCloneMountPoint/src/test/resources/HSA19.dbsnp135.chr1_1M.exome_intervals.modified.vcf
20:47:58.144 INFO SparkContext - Starting job: collect at VariantsSparkSourceUnitTest.java:78
20:47:58.146 INFO DAGScheduler - Got job 43 (collect at VariantsSparkSourceUnitTest.java:78) with 2 output partitions
20:47:58.146 INFO DAGScheduler - Final stage: ResultStage 52 (collect at VariantsSparkSourceUnitTest.java:78)
20:47:58.146 INFO DAGScheduler - Parents of final stage: List()
20:47:58.146 INFO DAGScheduler - Missing parents: List()
20:47:58.146 INFO DAGScheduler - Submitting ResultStage 52 (UnionRDD[266] at union at VariantsSparkSource.java:51), which has no missing parents
20:47:58.148 INFO MemoryStore - Block broadcast_106 stored as values in memory (estimated size 9.0 KiB, free 1919.1 MiB)
20:47:58.149 INFO MemoryStore - Block broadcast_106_piece0 stored as bytes in memory (estimated size 4.3 KiB, free 1919.1 MiB)
20:47:58.149 INFO BlockManagerInfo - Added broadcast_106_piece0 in memory on localhost:39529 (size: 4.3 KiB, free: 1919.9 MiB)
20:47:58.149 INFO SparkContext - Created broadcast 106 from broadcast at DAGScheduler.scala:1580
20:47:58.149 INFO DAGScheduler - Submitting 2 missing tasks from ResultStage 52 (UnionRDD[266] at union at VariantsSparkSource.java:51) (first 15 tasks are for partitions Vector(0, 1))
20:47:58.149 INFO TaskSchedulerImpl - Adding task set 52.0 with 2 tasks resource profile 0
20:47:58.152 INFO TaskSetManager - Starting task 0.0 in stage 52.0 (TID 121) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7982 bytes)
20:47:58.153 INFO TaskSetManager - Starting task 1.0 in stage 52.0 (TID 122) (localhost, executor driver, partition 1, PROCESS_LOCAL, 7973 bytes)
20:47:58.153 INFO Executor - Running task 0.0 in stage 52.0 (TID 121)
20:47:58.154 INFO Executor - Running task 1.0 in stage 52.0 (TID 122)
20:47:58.162 INFO NewHadoopRDD - Input split: file:/gatkCloneMountPoint/src/test/resources/Homo_sapiens_assembly19.dbsnp135.chr1_1M.exome_intervals.vcf:0+111531
20:47:58.163 INFO NewHadoopRDD - Input split: file:/gatkCloneMountPoint/src/test/resources/HSA19.dbsnp135.chr1_1M.exome_intervals.modified.vcf:0+111529
20:47:58.199 INFO Executor - Finished task 0.0 in stage 52.0 (TID 121). 12151 bytes result sent to driver
20:47:58.199 INFO Executor - Finished task 1.0 in stage 52.0 (TID 122). 12151 bytes result sent to driver
20:47:58.203 INFO TaskSetManager - Finished task 0.0 in stage 52.0 (TID 121) in 53 ms on localhost (executor driver) (1/2)
20:47:58.204 INFO TaskSetManager - Finished task 1.0 in stage 52.0 (TID 122) in 52 ms on localhost (executor driver) (2/2)
20:47:58.204 INFO TaskSchedulerImpl - Removed TaskSet 52.0, whose tasks have all completed, from pool
20:47:58.205 INFO DAGScheduler - ResultStage 52 (collect at VariantsSparkSourceUnitTest.java:78) finished in 0.058 s
20:47:58.205 INFO DAGScheduler - Job 43 is finished. Cancelling potential speculative or zombie tasks for this job
20:47:58.205 INFO TaskSchedulerImpl - Killing all running tasks in stage 52: Stage finished
20:47:58.205 INFO DAGScheduler - Job 43 finished: collect at VariantsSparkSourceUnitTest.java:78, took 0.060761 s
WARNING 2025-07-15 20:47:58 AsciiLineReader Creating an indexable source for an AsciiFeatureCodec using a stream that is neither a PositionalBufferedStream nor a BlockCompressedInputStream
20:47:58.218 INFO MemoryStore - Block broadcast_107 stored as values in memory (estimated size 81.1 KiB, free 1919.1 MiB)
20:47:58.219 INFO MemoryStore - Block broadcast_107_piece0 stored as bytes in memory (estimated size 7.2 KiB, free 1919.0 MiB)
20:47:58.220 INFO BlockManagerInfo - Added broadcast_107_piece0 in memory on localhost:39529 (size: 7.2 KiB, free: 1919.9 MiB)
20:47:58.220 INFO SparkContext - Created broadcast 107 from broadcast at VcfSource.java:129
20:47:58.222 INFO MemoryStore - Block broadcast_108 stored as values in memory (estimated size 298.5 KiB, free 1918.8 MiB)
20:47:58.233 INFO MemoryStore - Block broadcast_108_piece0 stored as bytes in memory (estimated size 50.4 KiB, free 1918.7 MiB)
20:47:58.233 INFO BlockManagerInfo - Added broadcast_108_piece0 in memory on localhost:39529 (size: 50.4 KiB, free: 1919.8 MiB)
20:47:58.234 INFO SparkContext - Created broadcast 108 from newAPIHadoopFile at VcfSource.java:168
20:47:58.238 INFO FeatureManager - Using codec VCFCodec to read file file:///gatkCloneMountPoint/src/test/resources/Homo_sapiens_assembly19.dbsnp135.chr1_1M.exome_intervals.vcf
20:47:58.250 INFO FileInputFormat - Total input files to process : 1
20:47:58.258 INFO SparkContext - Starting job: collect at VariantsSparkSourceUnitTest.java:43
20:47:58.259 INFO DAGScheduler - Got job 44 (collect at VariantsSparkSourceUnitTest.java:43) with 1 output partitions
20:47:58.259 INFO DAGScheduler - Final stage: ResultStage 53 (collect at VariantsSparkSourceUnitTest.java:43)
20:47:58.259 INFO DAGScheduler - Parents of final stage: List()
20:47:58.259 INFO DAGScheduler - Missing parents: List()
20:47:58.259 INFO DAGScheduler - Submitting ResultStage 53 (MapPartitionsRDD[271] at map at VariantsSparkSource.java:38), which has no missing parents
20:47:58.260 INFO MemoryStore - Block broadcast_109 stored as values in memory (estimated size 7.4 KiB, free 1918.7 MiB)
20:47:58.261 INFO MemoryStore - Block broadcast_109_piece0 stored as bytes in memory (estimated size 3.7 KiB, free 1918.7 MiB)
20:47:58.261 INFO BlockManagerInfo - Added broadcast_109_piece0 in memory on localhost:39529 (size: 3.7 KiB, free: 1919.8 MiB)
20:47:58.261 INFO SparkContext - Created broadcast 109 from broadcast at DAGScheduler.scala:1580
20:47:58.262 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 53 (MapPartitionsRDD[271] at map at VariantsSparkSource.java:38) (first 15 tasks are for partitions Vector(0))
20:47:58.262 INFO TaskSchedulerImpl - Adding task set 53.0 with 1 tasks resource profile 0
20:47:58.262 INFO TaskSetManager - Starting task 0.0 in stage 53.0 (TID 123) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7858 bytes)
20:47:58.263 INFO Executor - Running task 0.0 in stage 53.0 (TID 123)
20:47:58.265 INFO NewHadoopRDD - Input split: file:/gatkCloneMountPoint/src/test/resources/Homo_sapiens_assembly19.dbsnp135.chr1_1M.exome_intervals.vcf:0+111531
20:47:58.280 INFO Executor - Finished task 0.0 in stage 53.0 (TID 123). 12151 bytes result sent to driver
20:47:58.282 INFO TaskSetManager - Finished task 0.0 in stage 53.0 (TID 123) in 20 ms on localhost (executor driver) (1/1)
20:47:58.282 INFO TaskSchedulerImpl - Removed TaskSet 53.0, whose tasks have all completed, from pool
20:47:58.283 INFO DAGScheduler - ResultStage 53 (collect at VariantsSparkSourceUnitTest.java:43) finished in 0.022 s
20:47:58.283 INFO DAGScheduler - Job 44 is finished. Cancelling potential speculative or zombie tasks for this job
20:47:58.283 INFO TaskSchedulerImpl - Killing all running tasks in stage 53: Stage finished
20:47:58.283 INFO DAGScheduler - Job 44 finished: collect at VariantsSparkSourceUnitTest.java:43, took 0.024474 s
WARNING 2025-07-15 20:47:58 AsciiLineReader Creating an indexable source for an AsciiFeatureCodec using a stream that is neither a PositionalBufferedStream nor a BlockCompressedInputStream
20:47:58.293 INFO MemoryStore - Block broadcast_110 stored as values in memory (estimated size 81.1 KiB, free 1918.6 MiB)
20:47:58.296 INFO MemoryStore - Block broadcast_110_piece0 stored as bytes in memory (estimated size 7.1 KiB, free 1918.6 MiB)
20:47:58.296 INFO BlockManagerInfo - Added broadcast_110_piece0 in memory on localhost:39529 (size: 7.1 KiB, free: 1919.8 MiB)
20:47:58.296 INFO SparkContext - Created broadcast 110 from broadcast at VcfSource.java:129
20:47:58.298 INFO MemoryStore - Block broadcast_111 stored as values in memory (estimated size 298.8 KiB, free 1918.3 MiB)
20:47:58.309 INFO MemoryStore - Block broadcast_111_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1918.3 MiB)
20:47:58.309 INFO BlockManagerInfo - Added broadcast_111_piece0 in memory on localhost:39529 (size: 50.3 KiB, free: 1919.8 MiB)
20:47:58.310 INFO SparkContext - Created broadcast 111 from newAPIHadoopFile at VcfSource.java:168
20:47:58.314 INFO FeatureManager - Using codec VCFCodec to read file file:///gatkCloneMountPoint/src/test/resources/HSA19.dbsnp135.chr1_1M.exome_intervals.modified.vcf
20:47:58.325 INFO FileInputFormat - Total input files to process : 1
20:47:58.330 INFO SparkContext - Starting job: collect at VariantsSparkSourceUnitTest.java:43
20:47:58.331 INFO DAGScheduler - Got job 45 (collect at VariantsSparkSourceUnitTest.java:43) with 1 output partitions
20:47:58.331 INFO DAGScheduler - Final stage: ResultStage 54 (collect at VariantsSparkSourceUnitTest.java:43)
20:47:58.331 INFO DAGScheduler - Parents of final stage: List()
20:47:58.331 INFO DAGScheduler - Missing parents: List()
20:47:58.331 INFO DAGScheduler - Submitting ResultStage 54 (MapPartitionsRDD[276] at map at VariantsSparkSource.java:38), which has no missing parents
20:47:58.333 INFO MemoryStore - Block broadcast_112 stored as values in memory (estimated size 7.4 KiB, free 1918.3 MiB)
20:47:58.333 INFO MemoryStore - Block broadcast_112_piece0 stored as bytes in memory (estimated size 3.7 KiB, free 1918.3 MiB)
20:47:58.333 INFO BlockManagerInfo - Added broadcast_112_piece0 in memory on localhost:39529 (size: 3.7 KiB, free: 1919.8 MiB)
20:47:58.334 INFO SparkContext - Created broadcast 112 from broadcast at DAGScheduler.scala:1580
20:47:58.334 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 54 (MapPartitionsRDD[276] at map at VariantsSparkSource.java:38) (first 15 tasks are for partitions Vector(0))
20:47:58.334 INFO TaskSchedulerImpl - Adding task set 54.0 with 1 tasks resource profile 0
20:47:58.335 INFO TaskSetManager - Starting task 0.0 in stage 54.0 (TID 124) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7849 bytes)
20:47:58.337 INFO Executor - Running task 0.0 in stage 54.0 (TID 124)
20:47:58.339 INFO NewHadoopRDD - Input split: file:/gatkCloneMountPoint/src/test/resources/HSA19.dbsnp135.chr1_1M.exome_intervals.modified.vcf:0+111529
20:47:58.351 INFO Executor - Finished task 0.0 in stage 54.0 (TID 124). 12108 bytes result sent to driver
20:47:58.354 INFO TaskSetManager - Finished task 0.0 in stage 54.0 (TID 124) in 20 ms on localhost (executor driver) (1/1)
20:47:58.354 INFO TaskSchedulerImpl - Removed TaskSet 54.0, whose tasks have all completed, from pool
20:47:58.354 INFO DAGScheduler - ResultStage 54 (collect at VariantsSparkSourceUnitTest.java:43) finished in 0.022 s
20:47:58.355 INFO DAGScheduler - Job 45 is finished. Cancelling potential speculative or zombie tasks for this job
20:47:58.355 INFO TaskSchedulerImpl - Killing all running tasks in stage 54: Stage finished
20:47:58.355 INFO DAGScheduler - Job 45 finished: collect at VariantsSparkSourceUnitTest.java:43, took 0.024332 s
WARNING 2025-07-15 20:47:58 AsciiLineReader Creating an indexable source for an AsciiFeatureCodec using a stream that is neither a PositionalBufferedStream nor a BlockCompressedInputStream
20:47:58.365 INFO MemoryStore - Block broadcast_113 stored as values in memory (estimated size 81.1 KiB, free 1918.2 MiB)
20:47:58.368 INFO MemoryStore - Block broadcast_113_piece0 stored as bytes in memory (estimated size 7.2 KiB, free 1918.2 MiB)
20:47:58.368 INFO BlockManagerInfo - Added broadcast_113_piece0 in memory on localhost:39529 (size: 7.2 KiB, free: 1919.8 MiB)
20:47:58.368 INFO SparkContext - Created broadcast 113 from broadcast at VcfSource.java:129
20:47:58.369 INFO MemoryStore - Block broadcast_114 stored as values in memory (estimated size 299.1 KiB, free 1917.9 MiB)
20:47:58.376 INFO MemoryStore - Block broadcast_114_piece0 stored as bytes in memory (estimated size 50.3 KiB, free 1917.8 MiB)
20:47:58.376 INFO BlockManagerInfo - Added broadcast_114_piece0 in memory on localhost:39529 (size: 50.3 KiB, free: 1919.7 MiB)
20:47:58.377 INFO SparkContext - Created broadcast 114 from newAPIHadoopFile at VcfSource.java:168
20:47:58.380 INFO FeatureManager - Using codec VCFCodec to read file file:///gatkCloneMountPoint/src/test/resources/Homo_sapiens_assembly19.dbsnp135.chr1_1M.exome_intervals.vcf
20:47:58.386 INFO FileInputFormat - Total input files to process : 1
20:47:58.390 INFO SparkContext - Starting job: collect at VariantsSparkSourceUnitTest.java:55
20:47:58.390 INFO DAGScheduler - Got job 46 (collect at VariantsSparkSourceUnitTest.java:55) with 1 output partitions
20:47:58.390 INFO DAGScheduler - Final stage: ResultStage 55 (collect at VariantsSparkSourceUnitTest.java:55)
20:47:58.390 INFO DAGScheduler - Parents of final stage: List()
20:47:58.391 INFO DAGScheduler - Missing parents: List()
20:47:58.391 INFO DAGScheduler - Submitting ResultStage 55 (MapPartitionsRDD[279] at mapPartitions at VcfSource.java:134), which has no missing parents
20:47:58.392 INFO MemoryStore - Block broadcast_115 stored as values in memory (estimated size 6.3 KiB, free 1917.8 MiB)
20:47:58.392 INFO MemoryStore - Block broadcast_115_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1917.8 MiB)
20:47:58.392 INFO BlockManagerInfo - Added broadcast_115_piece0 in memory on localhost:39529 (size: 3.4 KiB, free: 1919.7 MiB)
20:47:58.393 INFO SparkContext - Created broadcast 115 from broadcast at DAGScheduler.scala:1580
20:47:58.393 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 55 (MapPartitionsRDD[279] at mapPartitions at VcfSource.java:134) (first 15 tasks are for partitions Vector(0))
20:47:58.393 INFO TaskSchedulerImpl - Adding task set 55.0 with 1 tasks resource profile 0
20:47:58.394 INFO TaskSetManager - Starting task 0.0 in stage 55.0 (TID 125) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7858 bytes)
20:47:58.394 INFO Executor - Running task 0.0 in stage 55.0 (TID 125)
20:47:58.396 INFO NewHadoopRDD - Input split: file:/gatkCloneMountPoint/src/test/resources/Homo_sapiens_assembly19.dbsnp135.chr1_1M.exome_intervals.vcf:0+111531
20:47:58.415 INFO Executor - Finished task 0.0 in stage 55.0 (TID 125). 153274 bytes result sent to driver
20:47:58.430 INFO TaskSetManager - Finished task 0.0 in stage 55.0 (TID 125) in 36 ms on localhost (executor driver) (1/1)
20:47:58.430 INFO DAGScheduler - ResultStage 55 (collect at VariantsSparkSourceUnitTest.java:55) finished in 0.039 s
20:47:58.430 INFO DAGScheduler - Job 46 is finished. Cancelling potential speculative or zombie tasks for this job
20:47:58.430 INFO TaskSchedulerImpl - Removed TaskSet 55.0, whose tasks have all completed, from pool
20:47:58.431 INFO TaskSchedulerImpl - Killing all running tasks in stage 55: Stage finished
20:47:58.431 INFO DAGScheduler - Job 46 finished: collect at VariantsSparkSourceUnitTest.java:55, took 0.040868 s
WARNING 2025-07-15 20:47:58 AsciiLineReader Creating an indexable source for an AsciiFeatureCodec using a stream that is neither a PositionalBufferedStream nor a BlockCompressedInputStream
20:47:58.453 INFO MemoryStore - Block broadcast_116 stored as values in memory (estimated size 81.1 KiB, free 1917.7 MiB)
20:47:58.454 INFO MemoryStore - Block broadcast_116_piece0 stored as bytes in memory (estimated size 7.1 KiB, free 1917.7 MiB)
20:47:58.454 INFO BlockManagerInfo - Added broadcast_116_piece0 in memory on localhost:39529 (size: 7.1 KiB, free: 1919.7 MiB)
20:47:58.455 INFO SparkContext - Created broadcast 116 from broadcast at VcfSource.java:129
20:47:58.456 INFO MemoryStore - Block broadcast_117 stored as values in memory (estimated size 299.3 KiB, free 1917.4 MiB)
20:47:58.463 INFO MemoryStore - Block broadcast_117_piece0 stored as bytes in memory (estimated size 50.4 KiB, free 1917.4 MiB)
20:47:58.463 INFO BlockManagerInfo - Added broadcast_117_piece0 in memory on localhost:39529 (size: 50.4 KiB, free: 1919.6 MiB)
20:47:58.464 INFO SparkContext - Created broadcast 117 from newAPIHadoopFile at VcfSource.java:168
20:47:58.466 INFO FeatureManager - Using codec VCFCodec to read file file:///gatkCloneMountPoint/src/test/resources/HSA19.dbsnp135.chr1_1M.exome_intervals.modified.vcf
20:47:58.473 INFO FileInputFormat - Total input files to process : 1
20:47:58.479 INFO SparkContext - Starting job: collect at VariantsSparkSourceUnitTest.java:55
20:47:58.479 INFO DAGScheduler - Got job 47 (collect at VariantsSparkSourceUnitTest.java:55) with 1 output partitions
20:47:58.479 INFO DAGScheduler - Final stage: ResultStage 56 (collect at VariantsSparkSourceUnitTest.java:55)
20:47:58.479 INFO DAGScheduler - Parents of final stage: List()
20:47:58.479 INFO DAGScheduler - Missing parents: List()
20:47:58.479 INFO DAGScheduler - Submitting ResultStage 56 (MapPartitionsRDD[282] at mapPartitions at VcfSource.java:134), which has no missing parents
20:47:58.480 INFO MemoryStore - Block broadcast_118 stored as values in memory (estimated size 6.3 KiB, free 1917.4 MiB)
20:47:58.481 INFO MemoryStore - Block broadcast_118_piece0 stored as bytes in memory (estimated size 3.4 KiB, free 1917.4 MiB)
20:47:58.482 INFO BlockManagerInfo - Added broadcast_118_piece0 in memory on localhost:39529 (size: 3.4 KiB, free: 1919.6 MiB)
20:47:58.483 INFO SparkContext - Created broadcast 118 from broadcast at DAGScheduler.scala:1580
20:47:58.483 INFO DAGScheduler - Submitting 1 missing tasks from ResultStage 56 (MapPartitionsRDD[282] at mapPartitions at VcfSource.java:134) (first 15 tasks are for partitions Vector(0))
20:47:58.483 INFO TaskSchedulerImpl - Adding task set 56.0 with 1 tasks resource profile 0
20:47:58.484 INFO TaskSetManager - Starting task 0.0 in stage 56.0 (TID 126) (localhost, executor driver, partition 0, PROCESS_LOCAL, 7849 bytes)
20:47:58.485 INFO Executor - Running task 0.0 in stage 56.0 (TID 126)
20:47:58.487 INFO NewHadoopRDD - Input split: file:/gatkCloneMountPoint/src/test/resources/HSA19.dbsnp135.chr1_1M.exome_intervals.modified.vcf:0+111529
20:47:58.517 INFO Executor - Finished task 0.0 in stage 56.0 (TID 126). 153274 bytes result sent to driver
20:47:58.534 INFO TaskSetManager - Finished task 0.0 in stage 56.0 (TID 126) in 50 ms on localhost (executor driver) (1/1)
20:47:58.535 INFO TaskSchedulerImpl - Removed TaskSet 56.0, whose tasks have all completed, from pool
20:47:58.535 INFO DAGScheduler - ResultStage 56 (collect at VariantsSparkSourceUnitTest.java:55) finished in 0.055 s
20:47:58.535 INFO DAGScheduler - Job 47 is finished. Cancelling potential speculative or zombie tasks for this job
20:47:58.535 INFO TaskSchedulerImpl - Killing all running tasks in stage 56: Stage finished
20:47:58.536 INFO DAGScheduler - Job 47 finished: collect at VariantsSparkSourceUnitTest.java:55, took 0.056919 s