SUBMIT COMPRESSED FILES USING SAVED CONFIGURATION

Submits one or more compressed simulation files to an HPC cluster using a pre-saved HPC configuration, automatically locating the solver input deck via a keyword identifier. Use this worker when the full simulation package is already archived and a stored HPC settings profile is available for job dispatch.

When to use

Use this worker when the full simulation package is already compressed into one or more archives and a pre-saved HPC settings profile is available for dispatch; the solver input deck is located inside the archive by keyword.

Tags: archive, compressed_files, hpc, job_submission, saved_config, solver_submit.

Inputs

Study Name (ID: study_name, type: text, optional)
  Human-readable label for the submitted study, used to identify the job in the platform UI. Leave blank to inherit the default job name.

HPC Settings (ID: hpc_settings, type: remote_lookup, required)
  One or more pre-saved HPC configuration profiles (remote lookup keyed by 'remote_lookup_hpc_settings') defining cluster, queue, resource limits, and solver environment. At least one entry is required.

Input File Keyword (ID: include_file_keyword, type: text, required)
  Keyword string used to identify the primary solver input (include/deck) file within the compressed archive, e.g. 'input.key' or a pattern matched by the HPC dispatcher.

Input File (ID: input_file, type: file, repeatable)
  One or more compressed archive files (e.g. .zip, .tar.gz) containing the full simulation package to be submitted; repeatable to support multi-part archives.
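As an illustration, the inputs above could be assembled into a submission payload as sketched below. Only the worker id and input IDs (study_name, hpc_settings, include_file_keyword, input_file) come from this page; the helper function, payload shape, and example values are assumptions, not a documented platform API.

```python
# Sketch of a payload for the job_submit_compressed_files worker.
# The dict shape and helper are hypothetical; only the input IDs
# are taken from the schema documented above.

def build_submission_payload(archives, hpc_profiles, keyword, study_name=""):
    """Assemble the worker inputs as a plain dict, validating required fields."""
    if not hpc_profiles:
        raise ValueError("at least one HPC settings profile is required")
    if not keyword:
        raise ValueError("include_file_keyword is required")
    if not archives:
        raise ValueError("at least one compressed archive is required")
    return {
        "worker_id": "job_submit_compressed_files",
        "inputs": {
            "study_name": study_name,          # optional label shown in the UI
            "hpc_settings": list(hpc_profiles),  # pre-saved profile references
            "include_file_keyword": keyword,   # e.g. "input.key"
            "input_file": list(archives),      # one or more .zip / .tar.gz parts
        },
    }

payload = build_submission_payload(
    archives=["run_part1.tar.gz", "run_part2.tar.gz"],
    hpc_profiles=[42],
    keyword="input.key",
    study_name="frontal_crash_v3",
)
```

A multi-part archive is passed as several entries in input_file, as shown; leaving study_name empty would fall back to the platform's default job name.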

Outputs

Job Submit By Config Output 1 (ID: job_submit_by_config_output_1, type: integer)
  Integer primary key of the simulation record created in the d3VIEW database upon successful job submission.

Job Submit By Config Output 2 (ID: job_submit_by_config_output_2, type: integer)
  Integer primary key of the HPC job record tracking queue status, resource usage, and completion state for the submitted job.
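A downstream step would typically capture both integer keys for later lookups. The sketch below assumes the worker result arrives as a dict keyed by the output IDs above; that response shape is an assumption, only the two output IDs come from this page.

```python
# Hypothetical extraction of the two output keys from a worker result.
# Only the output IDs are documented; the result dict shape is assumed.

def extract_job_ids(result):
    """Return (simulation record id, HPC job record id) as ints."""
    sim_id = int(result["job_submit_by_config_output_1"])
    hpc_job_id = int(result["job_submit_by_config_output_2"])
    return sim_id, hpc_job_id

sim_id, hpc_job_id = extract_job_ids({
    "job_submit_by_config_output_1": 1001,
    "job_submit_by_config_output_2": 2002,
})
```

The simulation record id identifies the study in the database, while the HPC job record id is the handle for polling queue status and completion state.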

Disciplines

  • data.io.archive
  • platform.hpc_config
  • platform.job_submission

Auto-generated from platform schema. Worker id: job_submit_compressed_files. Schema hash: 0f9b085eb1da. Hand-curated docs in workerexamples/ override this page when present.