Useful Information
You also do not need to define an executor as you might for some other clouds that Nextflow supports. By default the executor is 'local'. However, if you will for instance be running Nextflow in multiple locations and want different settings per location, you can define a DNAnexus profile in your nextflow.config that explicitly sets the executor and options such as the default queueSize.
Here is an example DNAnexus executor profile which also enables docker.
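A minimal sketch of such a profile (the profile name matches the usage below; the queueSize value is illustrative):

```groovy
// nextflow.config — illustrative DNAnexus profile
profiles {
    dnanexus {
        process.executor   = 'local'  // Nextflow itself runs on the DNAnexus head node
        docker.enabled     = true     // run tasks inside Docker containers
        executor.queueSize = 20       // example cap on concurrently queued tasks
    }
}
```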
When running on DNAnexus you would then pass '-profile dnanexus' to 'nextflow_run_opts' in the UI; on the CLI it would be -inextflow_run_opts='-profile dnanexus'
You could also create a test profile for testing on your own servers/cloud workstation and on DNAnexus.
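A hedged sketch of what such a test profile might look like (the parameter name and dataset path are hypothetical):

```groovy
// nextflow.config — illustrative test profile for local runs
profiles {
    test {
        process.executor = 'local'
        docker.enabled   = false          // assume tools are installed on the workstation
        params.input     = 'test_data/*'  // hypothetical small test dataset
    }
}
```

Locally you would select it with `nextflow run main.nf -profile test`; on DNAnexus you would pass `-profile test` via 'nextflow_run_opts' as above.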
If the pipeline contains inputs from external sources (such as S3, FTP, or HTTPS), those files are staged on the head node and may exhaust its storage space (inputs sourced from DNAnexus are not staged in this way).
The instance size of the head node can be customized in "Applet Settings" on the UI, or with the --instance-type flag on the CLI.
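For example (the applet name and instance type here are illustrative; check `dx run --help` for exact usage):

```
# hypothetical applet name and instance type
dx run my-nextflow-applet --instance-type mem1_ssd1_v2_x16
```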
Up to 20 sessions can be cached per project
Each cached session can be resumed an unlimited number of times
To create a support ticket if there are technical issues:
Go to the Help header (same section where Projects and Tools are) inside the platform
Select "Contact Support"
Fill in the Subject and Message to submit a support ticket.
Some of the links on these pages will take the user to pages maintained by third parties. The accuracy and IP rights of the information on these third-party pages are the responsibility of those third parties.
Sessions can be deleted to free up slots, or development/running can be migrated to another project, which will have its own 20-session limit. Private S3 buckets can be referenced by adding an AWS scope to your configs.
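A minimal sketch of such an AWS scope in nextflow.config (the region is illustrative; reading credentials from environment variables is shown here as one option rather than hard-coding keys):

```groovy
// nextflow.config — illustrative AWS scope for private S3 access
aws {
    region    = 'us-east-1'                              // example region
    accessKey = System.getenv('AWS_ACCESS_KEY_ID')       // read from environment
    secretKey = System.getenv('AWS_SECRET_ACCESS_KEY')
}
```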