Sample does not have a non-negative sample median when running DenoiseReadCounts on the example data (Tutorial 11682)
Using GATK jar /gatk/gatk-package-4.1.2.0-local.jar
The Genome Analysis Toolkit (GATK) v4.1.2.0
HTSJDK Version: 2.19.0
Picard Version: 2.19.0
Command:
Using GATK jar /gatk/gatk-package-4.1.2.0-local.jar
Running:
java -Dsamjdk.use_async_io_read_samtools=false -Dsamjdk.use_async_io_write_samtools=true -Dsamjdk.use_async_io_write_tribble=false -Dsamjdk.compression_level=2 -Xmx16G -jar /gatk/gatk-package-4.1.2.0-local.jar DenoiseReadCounts --input /scratch/ywu244/training/fastqc/Michael_Cavnar/12042020_CNV/tutorial_11682/test/tumor.counts.hdf5 --count-panel-of-normals /scratch/ywu244/training/fastqc/Michael_Cavnar/12042020_CNV/tutorial_11682/test/cnvponC.pon.hdf5 --standardized-copy-ratios /scratch/ywu244/training/fastqc/Michael_Cavnar/12042020_CNV/tutorial_11682/test/hcc1143_T_clean.standardizedCR.tsv --denoised-copy-ratios /scratch/ywu244/training/fastqc/Michael_Cavnar/12042020_CNV/tutorial_11682/test/hcc1143_T_clean.denoisedCR.tsv
16:59:08.236 INFO NativeLibraryLoader - Loading libgkl_compression.so from jar:file:/gatk/gatk-package-4.1.2.0-local.jar!/com/intel/gkl/native/libgkl_compression.so
Jan 13, 2021 4:59:09 PM shaded.cloud_nio.com.google.auth.oauth2.ComputeEngineCredentials runningOnComputeEngine
INFO: Failed to detect whether we are running on Google Compute Engine.
16:59:09.900 INFO DenoiseReadCounts - ------------------------------------------------------------
16:59:09.901 INFO DenoiseReadCounts - The Genome Analysis Toolkit (GATK) v4.1.2.0
16:59:09.901 INFO DenoiseReadCounts - For support and documentation go to https://software.broadinstitute.org/gatk/
16:59:09.901 INFO DenoiseReadCounts - Executing as ywu244@login004 on Linux v3.10.0-957.21.3.el7.x86_64 amd64
16:59:09.901 INFO DenoiseReadCounts - Java runtime: OpenJDK 64-Bit Server VM v1.8.0_191-8u191-b12-0ubuntu0.16.04.1-b12
16:59:09.901 INFO DenoiseReadCounts - Start Date/Time: January 13, 2021 4:59:08 PM UTC
16:59:09.901 INFO DenoiseReadCounts - ------------------------------------------------------------
16:59:09.901 INFO DenoiseReadCounts - ------------------------------------------------------------
16:59:09.902 INFO DenoiseReadCounts - HTSJDK Version: 2.19.0
16:59:09.902 INFO DenoiseReadCounts - Picard Version: 2.19.0
16:59:09.903 INFO DenoiseReadCounts - HTSJDK Defaults.COMPRESSION_LEVEL : 2
16:59:09.903 INFO DenoiseReadCounts - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
16:59:09.903 INFO DenoiseReadCounts - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
16:59:09.903 INFO DenoiseReadCounts - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
16:59:09.903 INFO DenoiseReadCounts - Deflater: IntelDeflater
16:59:09.903 INFO DenoiseReadCounts - Inflater: IntelInflater
16:59:09.903 INFO DenoiseReadCounts - GCS max retries/reopens: 20
16:59:09.903 INFO DenoiseReadCounts - Requester pays: disabled
16:59:09.903 INFO DenoiseReadCounts - Initializing engine
16:59:09.903 INFO DenoiseReadCounts - Done initializing engine
log4j:WARN No appenders could be found for logger (org.broadinstitute.hdf5.HDF5Library).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
16:59:09.944 INFO DenoiseReadCounts - Reading read-counts file (/scratch/ywu244/training/fastqc/Michael_Cavnar/12042020_CNV/tutorial_11682/test/tumor.counts.hdf5)...
16:59:10.235 INFO SVDDenoisingUtils - Validating sample intervals against original intervals used to build panel of normals...
16:59:10.277 INFO SVDDenoisingUtils - Preprocessing and standardizing sample read counts...
16:59:10.301 INFO SVDDenoisingUtils - Preprocessing read counts...
16:59:10.301 INFO SVDDenoisingUtils - Transforming read counts to fractional coverage...
16:59:10.310 INFO SVDDenoisingUtils - Subsetting sample intervals to post-filter panel intervals...
16:59:10.367 INFO SVDDenoisingUtils - Dividing by interval medians from the panel of normals...
16:59:10.376 INFO SVDDenoisingUtils - Sample read counts preprocessed.
16:59:10.376 INFO SVDDenoisingUtils - Standardizing read counts...
16:59:10.376 INFO SVDDenoisingUtils - Dividing by sample medians and transforming to log2 space...
16:59:10.396 INFO DenoiseReadCounts - Shutting down engine
[January 13, 2021 4:59:10 PM UTC] org.broadinstitute.hellbender.tools.copynumber.DenoiseReadCounts done. Elapsed time: 0.04 minutes.
Runtime.totalMemory()=1210580992
java.lang.IllegalArgumentException: Sample does not have a non-negative sample median.
at org.broadinstitute.hellbender.utils.Utils.validateArg(Utils.java:724)
at org.broadinstitute.hellbender.utils.param.ParamUtils.isPositive(ParamUtils.java:165)
at org.broadinstitute.hellbender.tools.copynumber.denoising.SVDDenoisingUtils.lambda$divideBySampleMedianAndTransformToLog2$26(SVDDenoisingUtils.java:483)
at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110)
at java.util.stream.IntPipeline$Head.forEach(IntPipeline.java:557)
at org.broadinstitute.hellbender.tools.copynumber.denoising.SVDDenoisingUtils.divideBySampleMedianAndTransformToLog2(SVDDenoisingUtils.java:482)
at org.broadinstitute.hellbender.tools.copynumber.denoising.SVDDenoisingUtils.preprocessAndStandardizeSample(SVDDenoisingUtils.java:406)
at org.broadinstitute.hellbender.tools.copynumber.denoising.SVDDenoisingUtils.denoise(SVDDenoisingUtils.java:124)
at org.broadinstitute.hellbender.tools.copynumber.denoising.SVDReadCountPanelOfNormals.denoise(SVDReadCountPanelOfNormals.java:88)
at org.broadinstitute.hellbender.tools.copynumber.DenoiseReadCounts.doWork(DenoiseReadCounts.java:200)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:139)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:191)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:210)
at org.broadinstitute.hellbender.Main.runCommandLineProgram(Main.java:162)
at org.broadinstitute.hellbender.Main.mainEntry(Main.java:205)
at org.broadinstitute.hellbender.Main.main(Main.java:291)
-
Hi Yuanyuan Wu,
Could you confirm that you are following all the steps in the tutorial you listed? Please give the previous commands as well.
Thank you,
Genevieve
-
Yes, I can confirm that. The only difference is that I used "CollectReadCounts" instead of "CollectFragmentCounts", because the GATK version is different.
The previous steps:
gatk --java-options "-Xmx16G" PreprocessIntervals \
-L /tutorial_11682/targets_C.interval_list \
--reference /reference/Homo_sapiens_assembly38.fasta \
--bin-length 0 \
--interval-merging-rule OVERLAPPING_ONLY \
--output /tutorial_11682/test/targets_C.preprocessed.interval_list

gatk --java-options "-Xmx16G" CollectReadCounts \
-L /tutorial_11682/tutorial_11682/targets_C.preprocessed.interval_list \
-I /tutorial_11682/tutorial_11682/tumor.bam \
--interval-merging-rule OVERLAPPING_ONLY \
--output /tutorial_11682/test/tumor.counts.hdf5

gatk --java-options "-Xmx16G" CreateReadCountPanelOfNormals \
--input /tutorial_11682/HG00133.alt_bwamem_GRCh38DH.20150826.GBR.exome.counts.hdf5 \
--input /tutorial_11682/HG00733.alt_bwamem_GRCh38DH.20150826.PUR.exome.counts.hdf5 \
--input /tutorial_11682/NA19654.alt_bwamem_GRCh38DH.20150826.MXL.exome.counts.hdf5 \
--minimum-interval-median-percentile 5.0 \
--output /tutorial_11682/test/cnvponC.pon.hdf5
-
Hi Yuanyuan Wu,
The error message is reporting that your sample does not have a non-negative sample median (it is most likely zero); the median is computed from your tumor.counts.hdf5 file. You can examine the count file to verify whether that is true.
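For reference, one way to do that check with standard shell tools, assuming the counts are re-collected as TSV (CollectReadCounts also accepts --format TSV) and follow the usual GATK layout of SAM-style @ header lines, a CONTIG/START/END/COUNT header row, then one interval per row:

```shell
# Toy file standing in for tumor.counts.tsv (a stand-in, not the real data):
printf '@HD\tVN:1.6\nCONTIG\tSTART\tEND\tCOUNT\nchr1\t1\t1000\t0\nchr1\t1001\t2000\t0\nchr1\t2001\t3000\t5\n' > toy.counts.tsv

# Median of the COUNT column (4th field): strip the @ headers and the column
# header row, sort numerically, and pick the middle value with awk.
grep -v '^@' toy.counts.tsv | tail -n +2 | cut -f4 | sort -n |
  awk '{a[NR]=$1} END {print (NR % 2 ? a[(NR+1)/2] : (a[NR/2] + a[NR/2+1]) / 2)}'
# prints 0 for this toy file
```

A median of 0 here is exactly the condition that makes DenoiseReadCounts throw the "non-negative sample median" error.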
Could you post the stack trace from when you ran CollectReadCounts?
Thank you,
Genevieve
-
Please see below, which I got after running CollectReadCounts. How can I examine the count file to verify whether that is true?
Using GATK jar /gatk/gatk-package-4.1.2.0-local.jar
Running:
java -Dsamjdk.use_async_io_read_samtools=false -Dsamjdk.use_async_io_write_samtools=true -Dsamjdk.use_async_io_write_tribble=false -Dsamjdk.compression_level=2 -Xmx16G -jar /gatk/gatk-package-4.1.2.0-local.jar CollectReadCounts -L /tutorial_11682/targets_C.preprocessed.interval_list -I /test/tumor.counts.hdf5
23:23:56.580 INFO NativeLibraryLoader - Loading libgkl_compression.so from jar:file:/gatk/gatk-package-4.1.2.0-local.jar!/com/intel/gkl/native/libgkl_compression.so
Jan 14, 2021 11:23:58 PM shaded.cloud_nio.com.google.auth.oauth2.ComputeEngineCredentials runningOnComputeEngine
INFO: Failed to detect whether we are running on Google Compute Engine.
23:23:58.331 INFO CollectReadCounts - ------------------------------------------------------------
23:23:58.332 INFO CollectReadCounts - The Genome Analysis Toolkit (GATK) v4.1.2.0
23:23:58.332 INFO CollectReadCounts - For support and documentation go to https://software.broadinstitute.org/gatk/
23:23:58.332 INFO CollectReadCounts - Executing as ywu244@login005 on Linux v3.10.0-957.21.3.el7.x86_64 amd64
23:23:58.332 INFO CollectReadCounts - Java runtime: OpenJDK 64-Bit Server VM v1.8.0_191-8u191-b12-0ubuntu0.16.04.1-b12
23:23:58.332 INFO CollectReadCounts - Start Date/Time: January 14, 2021 11:23:56 PM UTC
23:23:58.332 INFO CollectReadCounts - ------------------------------------------------------------
23:23:58.332 INFO CollectReadCounts - ------------------------------------------------------------
23:23:58.333 INFO CollectReadCounts - HTSJDK Version: 2.19.0
23:23:58.333 INFO CollectReadCounts - Picard Version: 2.19.0
23:23:58.333 INFO CollectReadCounts - HTSJDK Defaults.COMPRESSION_LEVEL : 2
23:23:58.333 INFO CollectReadCounts - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
23:23:58.333 INFO CollectReadCounts - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
23:23:58.333 INFO CollectReadCounts - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
23:23:58.333 INFO CollectReadCounts - Deflater: IntelDeflater
23:23:58.333 INFO CollectReadCounts - Inflater: IntelInflater
23:23:58.333 INFO CollectReadCounts - GCS max retries/reopens: 20
23:23:58.333 INFO CollectReadCounts - Requester pays: disabled
23:23:58.334 INFO CollectReadCounts - Initializing engine
WARNING: BAM index file tutorial_11682/tumor.bai is older than BAM /tutorial_11682/tumor.bam
23:23:59.162 INFO IntervalArgumentCollection - Processing 115110897 bp from intervals
23:23:59.197 INFO CollectReadCounts - Done initializing engine
23:23:59.200 INFO CollectReadCounts - Collecting read counts...
23:23:59.201 INFO ProgressMeter - Starting traversal
23:23:59.201 INFO ProgressMeter - Current Locus Elapsed Minutes Reads Processed Reads/Minute
23:24:09.205 INFO ProgressMeter - chr17:59963133 0.2 2475000 14847030.6
23:24:13.057 INFO CollectReadCounts - 1180729 read(s) filtered by: ((((WellformedReadFilter AND MappedReadFilter) AND NonZeroReferenceLengthAlignmentReadFilter) AND NotDuplicateReadFilter) AND MappingQualityReadFilter)
920554 read(s) filtered by: (((WellformedReadFilter AND MappedReadFilter) AND NonZeroReferenceLengthAlignmentReadFilter) AND NotDuplicateReadFilter)
28682 read(s) filtered by: ((WellformedReadFilter AND MappedReadFilter) AND NonZeroReferenceLengthAlignmentReadFilter)
28682 read(s) filtered by: (WellformedReadFilter AND MappedReadFilter)
28682 read(s) filtered by: MappedReadFilter
891872 read(s) filtered by: NotDuplicateReadFilter
260175 read(s) filtered by: MappingQualityReadFilter
23:24:13.057 INFO ProgressMeter - chr22:26297032 0.2 3608572 15626033.5
23:24:13.057 INFO ProgressMeter - Traversal complete. Processed 3608572 total reads in 0.2 minutes.
23:24:13.058 INFO CollectReadCounts - Writing read counts to /tutorial_11682/test/tumor.counts.hdf5...
log4j:WARN No appenders could be found for logger (org.broadinstitute.hdf5.HDF5Library).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
23:24:13.499 INFO CollectReadCounts - CollectReadCounts complete.
23:24:13.500 INFO CollectReadCounts - Shutting down engine
[January 14, 2021 11:24:13 PM UTC] org.broadinstitute.hellbender.tools.copynumber.CollectReadCounts done. Elapsed time: 0.28 minutes.
Runtime.totalMemory()=2520776704
-
Hi Yuanyuan Wu,
I brought this up with our developer team to confirm it is not a bug, and I think we have found the issue! The input for DenoiseReadCounts should be a file downloaded from the tutorial, hcc1143_T_clean.counts.hdf5 (not the file tutorial_11682/test/tumor.counts.hdf5).
For CollectFragmentCounts, there is a sentence we missed in the tutorial: "The tutorial does not use the resulting file in subsequent steps."
So, the CollectReadCounts command is run as an example, but it is not meant to be used in the next steps. You can use the provided file (hcc1143_T_clean.counts.hdf5) as input to DenoiseReadCounts.
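So, for the tutorial data, the DenoiseReadCounts invocation from the top of this thread would change only in its --input. A sketch (the long /scratch/... prefixes are abbreviated here, and the download location of hcc1143_T_clean.counts.hdf5 is whatever directory you saved the tutorial files to):

```
gatk --java-options "-Xmx16G" DenoiseReadCounts \
--input /tutorial_11682/hcc1143_T_clean.counts.hdf5 \
--count-panel-of-normals /tutorial_11682/test/cnvponC.pon.hdf5 \
--standardized-copy-ratios /tutorial_11682/test/hcc1143_T_clean.standardizedCR.tsv \
--denoised-copy-ratios /tutorial_11682/test/hcc1143_T_clean.denoisedCR.tsv
```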
Hope this resolves the problem! Have a good weekend,
Genevieve
-
Ok, great. Thanks for your clarification. I need to run CollectReadCounts on the tumor samples from my own project anyway, am I right?
Besides that, I got two other errors when plotting figures with PlotDenoisedCopyRatios and PlotModeledSegments. I can get the results from this step, but it still throws an error. Please see below.
gatk PlotDenoisedCopyRatios \
--standardized-copy-ratios /tutorial_11682/test_01122021/hcc1143_T_clean.standardizedCR.tsv \
--denoised-copy-ratios /test_01122021/hcc1143_T_clean.denoisedCR.tsv \
--sequence-dictionary /reference/Homo_sapiens_assembly38.dict \
--minimum-contig-length 46709983 \
--output /tutorial_11682/test/plots \
--output-prefix hcc1143_T_clean
Using GATK jar /gatk/gatk-package-4.1.2.0-local.jar
Running:
java -Dsamjdk.use_async_io_read_samtools=false -Dsamjdk.use_async_io_write_samtools=true -Dsamjdk.use_async_io_write_tribble=false -Dsamjdk.compression_level=2 -jar /gatk/gatk-package-4.1.2.0-local.jar PlotDenoisedCopyRatios --standardized-copy-ratios /tutorial_11682/test/hcc1143_T_clean.standardizedCR.tsv --denoised-copy-ratios /tutorial_11682/test/hcc1143_T_clean.denoisedCR.tsv --sequence-dictionary /reference/Homo_sapiens_assembly38.dict --minimum-contig-length 46709983 --output tutorial_11682/test/plots --output-prefix hcc1143_T_clean
17:02:14.660 INFO NativeLibraryLoader - Loading libgkl_compression.so from jar:file:/gatk/gatk-package-4.1.2.0-local.jar!/com/intel/gkl/native/libgkl_compression.so
Jan 13, 2021 5:02:16 PM shaded.cloud_nio.com.google.auth.oauth2.ComputeEngineCredentials runningOnComputeEngine
INFO: Failed to detect whether we are running on Google Compute Engine.
17:02:16.325 INFO PlotDenoisedCopyRatios - ------------------------------------------------------------
17:02:16.325 INFO PlotDenoisedCopyRatios - The Genome Analysis Toolkit (GATK) v4.1.2.0
17:02:16.325 INFO PlotDenoisedCopyRatios - For support and documentation go to https://software.broadinstitute.org/gatk/
17:02:16.326 INFO PlotDenoisedCopyRatios - Executing as ywu244@login004 on Linux v3.10.0-957.21.3.el7.x86_64 amd64
17:02:16.326 INFO PlotDenoisedCopyRatios - Java runtime: OpenJDK 64-Bit Server VM v1.8.0_191-8u191-b12-0ubuntu0.16.04.1-b12
17:02:16.326 INFO PlotDenoisedCopyRatios - Start Date/Time: January 13, 2021 5:02:14 PM UTC
17:02:16.326 INFO PlotDenoisedCopyRatios - ------------------------------------------------------------
17:02:16.326 INFO PlotDenoisedCopyRatios - ------------------------------------------------------------
17:02:16.327 INFO PlotDenoisedCopyRatios - HTSJDK Version: 2.19.0
17:02:16.327 INFO PlotDenoisedCopyRatios - Picard Version: 2.19.0
17:02:16.327 INFO PlotDenoisedCopyRatios - HTSJDK Defaults.COMPRESSION_LEVEL : 2
17:02:16.327 INFO PlotDenoisedCopyRatios - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
17:02:16.327 INFO PlotDenoisedCopyRatios - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
17:02:16.327 INFO PlotDenoisedCopyRatios - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
17:02:16.328 INFO PlotDenoisedCopyRatios - Deflater: IntelDeflater
17:02:16.328 INFO PlotDenoisedCopyRatios - Inflater: IntelInflater
17:02:16.328 INFO PlotDenoisedCopyRatios - GCS max retries/reopens: 20
17:02:16.328 INFO PlotDenoisedCopyRatios - Requester pays: disabled
17:02:16.328 INFO PlotDenoisedCopyRatios - Initializing engine
17:02:16.328 INFO PlotDenoisedCopyRatios - Done initializing engine
17:02:16.373 INFO PlotDenoisedCopyRatios - Reading and validating input files...
17:02:17.461 INFO PlotDenoisedCopyRatios - Contigs above length threshold: {chr1=248956422, chr2=242193529, chr3=198295559, chr4=190214555, chr5=181538259, chr6=170805979, chr7=159345973, chr8=145138636, chr9=138394717, chr10=133797422, chr11=135086622, chr12=133275309, chr13=114364328, chr14=107043718, chr15=101991189, chr16=90338345, chr17=83257441, chr18=80373285, chr19=58617616, chr20=64444167, chr21=46709983, chr22=50818468, chrX=156040895, chrY=57227415}
17:02:17.558 INFO PlotDenoisedCopyRatios - Writing plots to /tutorial_11682/test/plots...
17:02:20.487 INFO PlotDenoisedCopyRatios - PlotDenoisedCopyRatios complete.
17:02:20.487 INFO PlotDenoisedCopyRatios - Shutting down engine
[January 13, 2021 5:02:20 PM UTC] org.broadinstitute.hellbender.tools.copynumber.plotting.PlotDenoisedCopyRatios done. Elapsed time: 0.10 minutes.
Runtime.totalMemory()=1269301248
Exception in thread "Thread-1" htsjdk.samtools.util.RuntimeIOException: java.nio.file.NoSuchFileException: /tmp/Rlib.7048755237054159199
at htsjdk.samtools.util.IOUtil.recursiveDelete(IOUtil.java:1346)
at org.broadinstitute.hellbender.utils.io.IOUtils.deleteRecursively(IOUtils.java:1061)
at org.broadinstitute.hellbender.utils.io.DeleteRecursivelyOnExitPathHook.runHooks(DeleteRecursivelyOnExitPathHook.java:56)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.nio.file.NoSuchFileException: /tmp/Rlib.7048755237054159199
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileAttributeViews$Basic.readAttributes(UnixFileAttributeViews.java:55)
at sun.nio.fs.UnixFileSystemProvider.readAttributes(UnixFileSystemProvider.java:144)
at sun.nio.fs.LinuxFileSystemProvider.readAttributes(LinuxFileSystemProvider.java:99)
at java.nio.file.Files.readAttributes(Files.java:1737)
at java.nio.file.FileTreeWalker.getAttributes(FileTreeWalker.java:219)
at java.nio.file.FileTreeWalker.visit(FileTreeWalker.java:276)
at java.nio.file.FileTreeWalker.walk(FileTreeWalker.java:322)
at java.nio.file.Files.walkFileTree(Files.java:2662)
at java.nio.file.Files.walkFileTree(Files.java:2742)
at htsjdk.samtools.util.IOUtil.recursiveDelete(IOUtil.java:1344)
... 3 more
gatk PlotModeledSegments \
--denoised-copy-ratios /tutorial_11682/test/hcc1143_T_clean.denoisedCR.tsv \
--allelic-counts /tutorial_11683/test/hcc1143_T_clean.hets.tsv \
--segments /tutorial_11683/test/hcc1143_T_clean.modelFinal.seg \
--sequence-dictionary /reference/Homo_sapiens_assembly38.dict \
--minimum-contig-length 46709983 \
--output /tutorial_11683/test/plots \
--output-prefix hcc1143_T_clean
Using GATK jar /gatk/gatk-package-4.1.2.0-local.jar
Running:
java -Dsamjdk.use_async_io_read_samtools=false -Dsamjdk.use_async_io_write_samtools=true -Dsamjdk.use_async_io_write_tribble=false -Dsamjdk.compression_level=2 -jar /gatk/gatk-package-4.1.2.0-local.jar PlotModeledSegments --denoised-copy-ratios /scratch/ywu244/training/fastqc/Michael_Cavnar/12042020_CNV/tutorial_11682/test/hcc1143_T_clean.denoisedCR.tsv --allelic-counts /tutorial_11683/test/hcc1143_T_clean.hets.tsv --segments /scratch/ywu244/training/fastqc/Michael_Cavnar/12042020_CNV/tutorial_11683/test/hcc1143_T_clean.modelFinal.seg --sequence-dictionary /reference/Homo_sapiens_assembly38.dict --minimum-contig-length 46709983 --output /tutorial_11683/test/plots --output-prefix hcc1143_T_clean
20:35:51.183 INFO NativeLibraryLoader - Loading libgkl_compression.so from jar:file:/gatk/gatk-package-4.1.2.0-local.jar!/com/intel/gkl/native/libgkl_compression.so
Jan 13, 2021 8:35:52 PM shaded.cloud_nio.com.google.auth.oauth2.ComputeEngineCredentials runningOnComputeEngine
INFO: Failed to detect whether we are running on Google Compute Engine.
20:35:52.846 INFO PlotModeledSegments - ------------------------------------------------------------
20:35:52.847 INFO PlotModeledSegments - The Genome Analysis Toolkit (GATK) v4.1.2.0
20:35:52.847 INFO PlotModeledSegments - For support and documentation go to https://software.broadinstitute.org/gatk/
20:35:52.847 INFO PlotModeledSegments - Executing as ywu244@login002 on Linux v3.10.0-957.21.3.el7.x86_64 amd64
20:35:52.847 INFO PlotModeledSegments - Java runtime: OpenJDK 64-Bit Server VM v1.8.0_191-8u191-b12-0ubuntu0.16.04.1-b12
20:35:52.847 INFO PlotModeledSegments - Start Date/Time: January 13, 2021 8:35:51 PM UTC
20:35:52.847 INFO PlotModeledSegments - ------------------------------------------------------------
20:35:52.847 INFO PlotModeledSegments - ------------------------------------------------------------
20:35:52.848 INFO PlotModeledSegments - HTSJDK Version: 2.19.0
20:35:52.848 INFO PlotModeledSegments - Picard Version: 2.19.0
20:35:52.849 INFO PlotModeledSegments - HTSJDK Defaults.COMPRESSION_LEVEL : 2
20:35:52.849 INFO PlotModeledSegments - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
20:35:52.849 INFO PlotModeledSegments - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
20:35:52.849 INFO PlotModeledSegments - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
20:35:52.849 INFO PlotModeledSegments - Deflater: IntelDeflater
20:35:52.849 INFO PlotModeledSegments - Inflater: IntelInflater
20:35:52.849 INFO PlotModeledSegments - GCS max retries/reopens: 20
20:35:52.849 INFO PlotModeledSegments - Requester pays: disabled
20:35:52.849 INFO PlotModeledSegments - Initializing engine
20:35:52.849 INFO PlotModeledSegments - Done initializing engine
20:35:52.851 INFO PlotModeledSegments - Reading and validating input files...
20:35:53.747 INFO PlotModeledSegments - Contigs above length threshold: {chr1=248956422, chr2=242193529, chr3=198295559, chr4=190214555, chr5=181538259, chr6=170805979, chr7=159345973, chr8=145138636, chr9=138394717, chr10=133797422, chr11=135086622, chr12=133275309, chr13=114364328, chr14=107043718, chr15=101991189, chr16=90338345, chr17=83257441, chr18=80373285, chr19=58617616, chr20=64444167, chr21=46709983, chr22=50818468, chrX=156040895, chrY=57227415}
20:35:53.852 INFO PlotModeledSegments - Writing plot to /12042020_CNV/tutorial_11683/test/plots/hcc1143_T_clean.modeled.png...
20:35:56.358 INFO PlotModeledSegments - PlotModeledSegments complete.
20:35:56.358 INFO PlotModeledSegments - Shutting down engine
[January 13, 2021 8:35:56 PM UTC] org.broadinstitute.hellbender.tools.copynumber.plotting.PlotModeledSegments done. Elapsed time: 0.09 minutes.
Runtime.totalMemory()=1221066752
Exception in thread "Thread-1" htsjdk.samtools.util.RuntimeIOException: java.nio.file.NoSuchFileException: /tmp/Rlib.3280229804452061538
at htsjdk.samtools.util.IOUtil.recursiveDelete(IOUtil.java:1346)
at org.broadinstitute.hellbender.utils.io.IOUtils.deleteRecursively(IOUtils.java:1061)
at org.broadinstitute.hellbender.utils.io.DeleteRecursivelyOnExitPathHook.runHooks(DeleteRecursivelyOnExitPathHook.java:56)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.nio.file.NoSuchFileException: /tmp/Rlib.3280229804452061538
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixFileAttributeViews$Basic.readAttributes(UnixFileAttributeViews.java:55)
at sun.nio.fs.UnixFileSystemProvider.readAttributes(UnixFileSystemProvider.java:144)
at sun.nio.fs.LinuxFileSystemProvider.readAttributes(LinuxFileSystemProvider.java:99)
at java.nio.file.Files.readAttributes(Files.java:1737)
at java.nio.file.FileTreeWalker.getAttributes(FileTreeWalker.java:219)
at java.nio.file.FileTreeWalker.visit(FileTreeWalker.java:276)
at java.nio.file.FileTreeWalker.walk(FileTreeWalker.java:322)
at java.nio.file.Files.walkFileTree(Files.java:2662)
at java.nio.file.Files.walkFileTree(Files.java:2742)
at htsjdk.samtools.util.IOUtil.recursiveDelete(IOUtil.java:1344)
... 3 more
-
You may want to verify that R is working correctly; please see the Tools Involved section in the tutorial again. If you are still having issues, try running GATK in a Docker container.
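A quick sanity check of the R setup might look like the following. The package names (optparse, data.table, ggplot2) are an assumption based on the dependencies the GATK plotting tools typically load; confirm them against the tutorial's Tools Involved section.

```shell
# Check that Rscript is on PATH and that the assumed plotting packages load.
# Prints one status line either way and always exits 0.
if command -v Rscript >/dev/null 2>&1; then
  Rscript -e 'for (p in c("optparse", "data.table", "ggplot2"))
                if (!requireNamespace(p, quietly = TRUE)) stop(p, " is missing")' \
    && echo "R plotting dependencies OK" \
    || echo "Rscript found, but one or more plotting packages are missing"
else
  echo "Rscript not found on PATH; consider running GATK in the docker container"
fi
```

If this check fails, the plot files may still be produced by GATK but the cleanup of the temporary /tmp/Rlib.* directory can race or fail, which is consistent with the NoSuchFileException seen above.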