Genome Analysis Toolkit

Variant Discovery in High-Throughput Sequencing Data


Developed in the Data Sciences Platform at the Broad Institute, the toolkit offers a wide variety of tools with a primary focus on variant discovery and genotyping. Its powerful processing engine and high-performance computing features make it capable of taking on projects of any size.

Change the memory when running SVPreprocess


2 comments

  • Genevieve Brandt (she/her)

    Thank you for your post! Bob Handsaker has been tagged and will get back to you shortly.

  • zzq

    Dear all,

    Here are the comments from Bob; it worked for me after setting "-memLimit 8".

    Hi,

    One approach is to pass -memLimit N (gigabyte units, e.g. -memLimit 4), which will set the default java memory limit for all of the individual jobs (not just ReduceInsertSizeHistograms). This will only affect jobs that do not specifically set a different memory limit in the Q script.
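    For orientation, a command-line sketch of where -memLimit fits (this is illustrative only: the jar paths, the ${SV_DIR} variable, and the input arguments are placeholders for whatever your own SVPreprocess invocation uses; only the placement of -memLimit as a Q-script argument is the point):

```shell
# Sketch only -- paths and inputs are placeholders for your setup.
# -memLimit sets the default per-job Java heap, in gigabyte units.
java -Xmx4g -cp ${SV_DIR}/lib/SVToolkit.jar:${SV_DIR}/lib/gatk/Queue.jar \
    org.broadinstitute.gatk.queue.QCommandLine \
    -S ${SV_DIR}/qscript/SVPreprocess.q \
    -jobRunner ParallelShell \
    -memLimit 8 \
    ...
```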

    The other option is to edit the Q script itself to set a higher memory limit for ReduceInsertSizeHistograms only. In qscript/SVQScript.q see the definition for ReduceInsertSizeHistograms and add 'this.memoryLimit = Some(4)' or whatever you need. You will see other examples of commands that override the default memory limits in this way (e.g. MergeSamFiles requests 8g).
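    As an illustration of that override pattern, the edit in qscript/SVQScript.q looks roughly like this (a sketch, not the real definition: the parent class name and other members are abbreviated placeholders; only the memoryLimit line is the change Bob describes):

```scala
// In qscript/SVQScript.q (sketch; the real class has more members
// and a different parent). memoryLimit is in gigabyte units.
class ReduceInsertSizeHistograms extends JavaCommand {
  // Raise the per-job memory limit for this step only, the same way
  // MergeSamFiles overrides its default elsewhere in the script:
  this.memoryLimit = Some(4)
}
```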

    Since you are using -jobRunner ParallelShell, giving large amounts of memory to the parent Queue process with

    ms="-Xms40g"
    

    is actually counter-productive. The parent process is just tracking what needs to be done and forking child processes to do the work. So you should just give it 4g (or maybe 8g if you are processing thousands of input files) and that should be plenty.

    -Bob


