We use the PBS GUI to queue our HPC jobs for solving, in our case LS-DYNA jobs.
Within HyperMesh we set up and configure the different load cases that will be applied
to our models via the sub-system browser, which lets us quickly switch between load
case configurations of our models. This works very well.
We have also created a .tcl script that exports the selected LS-DYNA jobs (.k files) (i.e. we pick which configurations we want exported), each into its own directory (folder) on our HPC, as a single batch export run.
So within an overall parent directory we might end up with 7 different folders, each containing its own input .k deck file for the configuration that was written out.
Each folder is automatically named after its load case.
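For illustration, the exported structure looks something like this (folder and file names here are just made-up examples):

parent_export_dir/
    frontal_impact/
        frontal_impact.k
    side_impact/
        side_impact.k
    roof_crush/
        roof_crush.k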
We use this export process all of the time and it saves us a huge amount of time in the export part of our workflow.
At this stage, however, we still have to submit each job to our HPC for solving manually
through the PBS GUI, i.e. set the output folder to each load case folder and select the
corresponding .k file that lives in that folder.
Now to the actual question…
What we would like is a script or some automated process where we simply select the parent directory, and that script/process looks into that folder, finds all sub-folders, finds all .k solution decks within those folders, and automatically queues those jobs into the PBS queue so they start solving.
Ideally the resources for each job (i.e. CPU cores) would be the same across all jobs and might be derived from the settings currently applied in the PBS GUI, although they may simply have to be hard-coded into the script or whatever the automated process turns out to be.
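To show roughly what we have in mind, here is a minimal sketch, assuming PBS Professional (qsub with select/ncpus syntax) and an SMP-style LS-DYNA command line of the form "lsdyna i=<deck> ncpu=<n>"; the solver command, core count, and PBS directives are assumptions that would need adjusting to the actual cluster setup:

#!/usr/bin/env python
# Hypothetical sketch: walk a parent directory, find every .k deck in its
# sub-folders, and submit one PBS job per deck via qsub.
import os
import subprocess
import sys

NCPUS = 8          # assumed hard-coded core count per job
SOLVER = "lsdyna"  # assumed LS-DYNA executable / wrapper name on the cluster

def submit_all(parent_dir):
    for root, dirs, files in os.walk(parent_dir):
        for f in files:
            if not f.lower().endswith(".k"):
                continue
            deck = os.path.join(root, f)
            job_name = os.path.basename(root)  # load case folder name becomes the job name
            script_path = os.path.join(root, "run_job.pbs")
            # Write a small PBS job script next to the deck, then submit it.
            with open(script_path, "w") as s:
                s.write("#!/bin/bash\n")
                s.write("#PBS -N %s\n" % job_name)
                s.write("#PBS -l select=1:ncpus=%d\n" % NCPUS)
                s.write("cd %s\n" % root)  # solve inside the load case folder
                s.write("%s i=%s ncpu=%d\n" % (SOLVER, f, NCPUS))
            subprocess.check_call(["qsub", script_path])
            print("Submitted %s" % deck)

if __name__ == "__main__":
    submit_all(sys.argv[1])  # parent directory passed on the command line

Something along those lines, run once against the parent directory, is the kind of automation we are after.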
How can we achieve this workflow in PBS? Any help would be greatly appreciated.