MarkDuplicatesSpark failing on a server accessed via PuTTY (UnknownHostException)
Dear All,
I am trying to run MarkDuplicatesSpark on a server over PuTTY, but it fails with the errors in the log below:
a) GATK version is gatk4.1.9.0
b) command used
./gatk MarkDuplicatesSpark -I Kaito.fixrg.bam -M Kaitodedupmetrics.txt -O Kaito_sorted_dedup_reads.bam
c) Error log:
Running:
java -Dsamjdk.use_async_io_read_samtools=false -Dsamjdk.use_async_io_write_samtools=true -Dsamjdk.use_async_io_write_tribble=false -Dsamjdk.compression_level=2 -jar /home/naika/gatk4/gatk-package-4.1.9.0-local.jar MarkDuplicatesSpark -I Kaito.fixrg.bam -M Kaitodedupmetrics.txt -O Kaito_sorted_dedup_reads.bam
12:30:53.128 INFO NativeLibraryLoader - Loading libgkl_compression.so from jar:file:/home/naika/gatk4/gatk-package-4.1.9.0-local.jar!/com/intel/gkl/native/libgkl_compression.so
Dec 15, 2020 12:30:53 PM shaded.cloud_nio.com.google.auth.oauth2.ComputeEngineCredentials runningOnComputeEngine
INFO: Failed to detect whether we are running on Google Compute Engine.
12:30:53.772 INFO MarkDuplicatesSpark - ------------------------------------------------------------
12:30:53.773 INFO MarkDuplicatesSpark - The Genome Analysis Toolkit (GATK) v4.1.9.0
12:30:53.773 INFO MarkDuplicatesSpark - For support and documentation go to https://software.broadinstitute.org/gatk/
12:30:53.775 INFO MarkDuplicatesSpark - Initializing engine
12:30:53.775 INFO MarkDuplicatesSpark - Done initializing engine
12:30:54.796 INFO MarkDuplicatesSpark - Shutting down engine
[2020/12/15 12:30:54 JST] org.broadinstitute.hellbender.tools.spark.transforms.markduplicates.MarkDuplicatesSpark done. Elapsed time: 0.03 minutes.
Runtime.totalMemory()=2361917440
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.SparkConf$.<init>(SparkConf.scala:716)
at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
at org.apache.spark.SparkConf.set(SparkConf.scala:95)
at org.apache.spark.SparkConf$$anonfun$loadFromSystemProperties$3.apply(SparkConf.scala:77)
at org.apache.spark.SparkConf$$anonfun$loadFromSystemProperties$3.apply(SparkConf.scala:76)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)
at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:76)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:71)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
at org.broadinstitute.hellbender.engine.spark.SparkContextFactory.setupSparkConf(SparkContextFactory.java:173)
at org.broadinstitute.hellbender.engine.spark.SparkContextFactory.createSparkContext(SparkContextFactory.java:183)
at org.broadinstitute.hellbender.engine.spark.SparkContextFactory.getSparkContext(SparkContextFactory.java:117)
at org.broadinstitute.hellbender.engine.spark.SparkCommandLineProgram.doWork(SparkCommandLineProgram.java:28)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:140)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:192)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:211)
at org.broadinstitute.hellbender.Main.runCommandLineProgram(Main.java:160)
at org.broadinstitute.hellbender.Main.mainEntry(Main.java:203)
at org.broadinstitute.hellbender.Main.main(Main.java:289)
Caused by: java.net.UnknownHostException: ororon: ororon: 名前またはサービスが不明です
at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:946)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:939)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:939)
at org.apache.spark.util.Utils$$anonfun$localCanonicalHostName$1.apply(Utils.scala:996)
at org.apache.spark.util.Utils$$anonfun$localCanonicalHostName$1.apply(Utils.scala:996)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.localCanonicalHostName(Utils.scala:996)
at org.apache.spark.internal.config.package$.<init>(package.scala:302)
at org.apache.spark.internal.config.package$.<clinit>(package.scala)
... 23 more
Caused by: java.net.UnknownHostException: ororon: 名前またはサービスが不明です
at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
... 32 more
(The Japanese message 名前またはサービスが不明です translates to "Name or service not known".)
I have tried changing the host name, but I do not own the server and have no control over its host names. Is there anything I can do about this?
I have no problem running MarkDuplicatesSpark on BAM files under 50 GB on my own desktop, though.
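For context, the stack trace fails inside Spark's `Utils.findLocalInetAddress`, which first consults the `SPARK_LOCAL_IP` environment variable before trying to resolve the machine's hostname. So one possible workaround that needs no admin rights is to set that variable before launching GATK. This is a sketch, not a verified fix for this server; the `getent` diagnostic and the loopback address are assumptions about a typical Linux host:

```shell
# Diagnose: does the machine's own hostname resolve?
hostname
getent hosts "$(hostname)" || echo "hostname does not resolve (matches the UnknownHostException)"

# Workaround without root: tell Spark which local address to bind,
# so it skips the failing hostname lookup entirely.
export SPARK_LOCAL_IP=127.0.0.1

# Then re-run the original command in the same shell session:
# ./gatk MarkDuplicatesSpark -I Kaito.fixrg.bam -M Kaitodedupmetrics.txt -O Kaito_sorted_dedup_reads.bam
```

If the diagnostic shows the hostname (here `ororon`) is simply missing from `/etc/hosts`, an administrator adding a line such as `127.0.0.1 ororon` would be the permanent fix.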
Thank you!
-
Hi, please note that your question was posted while the GATK Team was Out of Office.
Please repost any outstanding GATK issues and we will get to them if possible. Our first priority is solving GATK issues and abnormal results; see our support policy for more details.