Error while generating a CNV panel of normals with CreateReadCountPanelOfNormals
a) GATK version used: gatk-4.1.7.0
b) Exact GATK commands used:
gatk --java-options "-Xmx6500m" CreateReadCountPanelOfNormals \
-I SRR2017727.counts.hdf5 \
-I SRR2017793.counts.hdf5 \
-I SRR2017794.counts.hdf5 \
-I SRR2017795.counts.hdf5 \
-I SRR2017802.counts.hdf5 \
-I SRR2017908.counts.hdf5 \
-I SRR2017915.counts.hdf5 \
-I SRR2017936.counts.hdf5 \
-I SRR2017993.counts.hdf5 \
-I SRR2017994.counts.hdf5 \
-I SRR2017996.counts.hdf5 \
-I SRR2018104.counts.hdf5 \
-I SRR2018123.counts.hdf5 \
-I SRR2018227.counts.hdf5 \
-I SRR2018230.counts.hdf5 \
-I SRR2064151.counts.hdf5 \
-I SRR2064166.counts.hdf5 \
-I SRR2064168.counts.hdf5 \
-I SRR2064169.counts.hdf5 \
-I SRR2064170.counts.hdf5 \
--minimum-interval-median-percentile 5.0 \
--number-of-eigensamples 4 \
-O GSE75885_cnvponC.pon.hdf5
c) The entire error log if applicable:
20/05/26 12:07:06 INFO H5: HDF5 library:
20/05/26 12:07:06 INFO H5: successfully loaded.
12:07:06.894 INFO CreateReadCountPanelOfNormals - Retrieving intervals from first read-counts file (SRR2017727.counts.hdf5)...
12:07:07.078 INFO CreateReadCountPanelOfNormals - No annotated intervals were provided...
12:07:07.078 INFO CreateReadCountPanelOfNormals - Validating and aggregating input read-counts files...
12:07:07.085 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017727.counts.hdf5 (1 / 20)
12:07:07.163 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017793.counts.hdf5 (2 / 20)
12:07:07.215 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017794.counts.hdf5 (3 / 20)
12:07:07.266 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017795.counts.hdf5 (4 / 20)
12:07:07.316 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017802.counts.hdf5 (5 / 20)
12:07:07.431 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017908.counts.hdf5 (6 / 20)
12:07:07.497 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017915.counts.hdf5 (7 / 20)
12:07:07.533 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017936.counts.hdf5 (8 / 20)
12:07:07.592 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017993.counts.hdf5 (9 / 20)
12:07:07.659 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017994.counts.hdf5 (10 / 20)
12:07:07.702 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2017996.counts.hdf5 (11 / 20)
12:07:07.742 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2018104.counts.hdf5 (12 / 20)
12:07:07.822 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2018123.counts.hdf5 (13 / 20)
12:07:07.857 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2018227.counts.hdf5 (14 / 20)
12:07:07.891 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2018230.counts.hdf5 (15 / 20)
12:07:07.970 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2064151.counts.hdf5 (16 / 20)
12:07:08.003 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2064166.counts.hdf5 (17 / 20)
12:07:08.042 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2064168.counts.hdf5 (18 / 20)
12:07:08.127 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2064169.counts.hdf5 (19 / 20)
12:07:08.160 INFO CreateReadCountPanelOfNormals - Aggregating read-counts file SRR2064170.counts.hdf5 (20 / 20)
12:07:08.198 INFO CreateReadCountPanelOfNormals - Creating the panel of normals...
12:07:08.199 INFO HDF5SVDReadCountPanelOfNormals - Creating read-count panel of normals at /media/jchen/BioPC01.01/sandbox_ForcopyNumber/GSE75885_cnvponC.pon.hdf5...
12:07:08.200 INFO HDF5SVDReadCountPanelOfNormals - Writing version number (7.0)...
12:07:08.202 INFO HDF5SVDReadCountPanelOfNormals - Writing command line...
12:07:08.205 INFO HDF5SVDReadCountPanelOfNormals - Writing sequence dictionary...
12:07:08.207 INFO HDF5SVDReadCountPanelOfNormals - Writing original read counts (242211 x 20)...
12:07:08.279 INFO HDF5SVDReadCountPanelOfNormals - Writing original sample filenames (20)...
12:07:08.280 INFO HDF5SVDReadCountPanelOfNormals - Writing original intervals (242211)...
12:07:08.316 INFO HDF5SVDReadCountPanelOfNormals - Preprocessing and standardizing read counts...
12:07:08.317 INFO SVDDenoisingUtils - Preprocessing read counts...
12:07:08.317 INFO SVDDenoisingUtils - Transforming read counts to fractional coverage...
12:07:08.571 INFO SVDDenoisingUtils - Filtering intervals with median (across samples) less than or equal to the 5.00 percentile...
12:07:08.580 INFO SVDDenoisingUtils - After filtering, 169381 out of 242211 intervals remain...
12:07:08.581 INFO SVDDenoisingUtils - Dividing by interval medians...
12:07:08.646 INFO SVDDenoisingUtils - Filtering samples with a fraction of zero-coverage intervals above 5.00 percent...
12:07:08.695 INFO SVDDenoisingUtils - After filtering, 4 out of 20 samples remain...
12:07:08.695 INFO SVDDenoisingUtils - Filtering intervals with a fraction of zero-coverage samples above 5.00 percent...
12:07:08.733 INFO SVDDenoisingUtils - After filtering, 163904 out of 242211 intervals remain...
12:07:08.733 INFO SVDDenoisingUtils - Filtering samples with a median (across intervals) below the 2.50 percentile or above the 97.50 percentile...
12:07:08.837 INFO SVDDenoisingUtils - After filtering, 4 out of 20 samples remain...
12:07:08.866 INFO SVDDenoisingUtils - Heap utilization statistics [MB]:
12:07:08.866 INFO SVDDenoisingUtils - Used memory: 457
12:07:08.866 INFO SVDDenoisingUtils - Free memory: 2263
12:07:08.866 INFO SVDDenoisingUtils - Total memory: 2721
12:07:08.866 INFO SVDDenoisingUtils - Maximum memory: 6500
12:07:08.866 INFO SVDDenoisingUtils - Performing garbage collection...
12:07:08.910 INFO SVDDenoisingUtils - Heap utilization statistics [MB]:
12:07:08.910 INFO SVDDenoisingUtils - Used memory: 107
12:07:08.910 INFO SVDDenoisingUtils - Free memory: 269
12:07:08.910 INFO SVDDenoisingUtils - Total memory: 377
12:07:08.910 INFO SVDDenoisingUtils - Maximum memory: 6500
12:07:09.033 INFO SVDDenoisingUtils - 13632 zero-coverage values were imputed to the median of the non-zero values in the corresponding interval...
12:07:09.047 INFO SVDDenoisingUtils - 1310 values below the 0.10 percentile or above the 99.90 percentile were truncated to the corresponding value...
12:07:09.047 INFO SVDDenoisingUtils - Panel read counts preprocessed.
12:07:09.047 INFO SVDDenoisingUtils - Standardizing read counts...
12:07:09.047 INFO SVDDenoisingUtils - Dividing by sample medians and transforming to log2 space...
12:07:09.063 INFO SVDDenoisingUtils - Subtracting median of sample medians...
12:07:09.073 INFO SVDDenoisingUtils - Panel read counts standardized.
12:07:09.079 INFO HDF5SVDReadCountPanelOfNormals - Writing panel sample filenames (4)...
12:07:09.080 INFO HDF5SVDReadCountPanelOfNormals - Writing panel intervals (163904)...
12:07:09.089 INFO HDF5SVDReadCountPanelOfNormals - Writing panel interval fractional medians (163904)...
12:07:09.090 INFO HDF5SVDReadCountPanelOfNormals - Performing SVD (truncated at 4 eigensamples) of standardized counts (transposed to 163904 x 4)...
12:07:09.104 INFO SparkConverter - Converting matrix to distributed Spark matrix...
12:07:09.208 INFO SparkConverter - Done converting matrix to distributed Spark matrix...
12:07:09.241 WARN HDF5SVDReadCountPanelOfNormals - Exception encountered during creation of panel of normals (java.lang.IllegalArgumentException: Unsupported class file major version 55). Attempting to delete partial output in /media/jchen/BioPC01.01/sandbox_ForcopyNumber/GSE75885_cnvponC.pon.hdf5...
20/05/26 12:07:09 INFO SparkUI: Stopped Spark web UI at http://192.168.71.128:4040
20/05/26 12:07:09 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/05/26 12:07:09 INFO MemoryStore: MemoryStore cleared
20/05/26 12:07:09 INFO BlockManager: BlockManager stopped
20/05/26 12:07:09 INFO BlockManagerMaster: BlockManagerMaster stopped
20/05/26 12:07:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/05/26 12:07:09 INFO SparkContext: Successfully stopped SparkContext
12:07:09.316 INFO CreateReadCountPanelOfNormals - Shutting down engine
[May 26, 2020 at 12:07:09 PM PDT] org.broadinstitute.hellbender.tools.copynumber.CreateReadCountPanelOfNormals done. Elapsed time: 0.06 minutes.
Runtime.totalMemory()=708837376
org.broadinstitute.hellbender.exceptions.GATKException: Could not create panel of normals. It may be necessary to use stricter parameters for filtering. For example, use a larger value of minimum-interval-median-percentile.
at org.broadinstitute.hellbender.tools.copynumber.denoising.HDF5SVDReadCountPanelOfNormals.create(HDF5SVDReadCountPanelOfNormals.java:354)
at org.broadinstitute.hellbender.tools.copynumber.CreateReadCountPanelOfNormals.runPipeline(CreateReadCountPanelOfNormals.java:290)
at org.broadinstitute.hellbender.engine.spark.SparkCommandLineProgram.doWork(SparkCommandLineProgram.java:31)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:139)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:191)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:210)
at org.broadinstitute.hellbender.Main.runCommandLineProgram(Main.java:163)
at org.broadinstitute.hellbender.Main.mainEntry(Main.java:206)
at org.broadinstitute.hellbender.Main.main(Main.java:292)
Caused by: java.lang.IllegalArgumentException: Unsupported class file major version 55
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1364)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.take(RDD.scala:1337)
at org.apache.spark.rdd.RDD$$anonfun$first$1.apply(RDD.scala:1378)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.first(RDD.scala:1377)
at org.apache.spark.mllib.linalg.distributed.RowMatrix.numCols(RowMatrix.scala:61)
at org.apache.spark.mllib.linalg.distributed.RowMatrix.computeSVD(RowMatrix.scala:232)
at org.apache.spark.mllib.linalg.distributed.RowMatrix.computeSVD(RowMatrix.scala:208)
at org.broadinstitute.hellbender.tools.copynumber.denoising.HDF5SVDReadCountPanelOfNormals.create(HDF5SVDReadCountPanelOfNormals.java:326)
... 8 more
20/05/26 12:07:09 INFO ShutdownHookManager: Shutdown hook called
20/05/26 12:07:09 INFO ShutdownHookManager: Deleting directory /tmp/spark-5a0f58e7-2472-4117-bebc-9eb3a48e2297
-
I think I'm hitting the Java version issue; my Java is too new :)
java --version
openjdk 11.0.7 2020-04-14
OpenJDK Runtime Environment (build 11.0.7+10-post-Ubuntu-2ubuntu218.04)
OpenJDK 64-Bit Server VM (build 11.0.7+10-post-Ubuntu-2ubuntu218.04, mixed mode, sharing)

What is the recommended way to proceed?
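A quick sanity check on this diagnosis (just a sketch): the "class file major version" in the stack trace maps directly to a Java release. Version 52 corresponds to Java 8 and 55 to Java 11, so "Unsupported class file major version 55" means the tool ran under Java 11, which the Spark bundled with GATK 4.1.x cannot parse.

```shell
# Map a class file major version N to its Java release (N - 44 for Java 2+),
# e.g. 52 -> Java 8, 55 -> Java 11.
class_file_to_java_release() {
  echo $(( $1 - 44 ))
}

class_file_to_java_release 55   # -> 11 (the version in the stack trace)
class_file_to_java_release 52   # -> 8  (the version GATK 4.1.x expects)
```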
Joan
-
Hi,
Take a look at this doc: https://gatk.broadinstitute.org/hc/en-us/articles/360035532332-Java-version-issues
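In short, GATK 4.1.x needs a Java 8 runtime. One way to switch for a single session is to point JAVA_HOME at a Java 8 installation before invoking gatk; a minimal sketch, assuming an OpenJDK 8 package installed at the usual Ubuntu path (the path is an assumption; adjust to your system):

```shell
# Point this shell at a Java 8 JRE (path is an assumption for Ubuntu's
# openjdk-8-jdk package); the gatk wrapper script picks up the java on PATH.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"

java -version   # should now report 1.8.x
gatk --java-options "-Xmx6500m" CreateReadCountPanelOfNormals ...
```

Alternatively, `sudo update-alternatives --config java` switches the system default interactively.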