To run a job on SGE I need to submit a script containing the SGE parameters and the commands of my job. For example, suppose I have a script named myscript.sge containing:
#$ -some_sge_parameter
#$ -another_sge_parameter
#$ -so_many_sge_parameters
my_first_command
second_command -i some_input_here -p some_parameter_here -x also_here
Then I'd simply run qsub myscript.sge and be done with it.
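For concreteness, a filled-in version might look something like this (the job name, queue, and memory value are just placeholders, not my actual setup):

#!/bin/bash
#$ -N example_job    # job name (placeholder)
#$ -cwd              # run from the current working directory
#$ -q all.q          # queue name (placeholder)
#$ -l h_vmem=4G      # memory request (placeholder value)
my_first_command
second_command -i some_input_here -p some_parameter_here -x also_here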
But I have a bunch of different samples and parameters, so I wrote a Python script that generates many scripts which are basically the same but differ in either SGE or job parameters. That Python script saves the generated string to a bash file and then submits that file to SGE. For example:
import os
...  # some code above
string_to_write = generate_script_contents(sample, argument1, argument2)
script_name = write_script_to_file_and_return_scriptname(sample, string_to_write)
os.system(f"qsub {script_name}")
Putting the above block in a loop of some kind lets me submit many jobs with minimal effort. Now the problem is that I have to write the contents of the script to an actual file first - I get the extra step of creating a "physical" bash script. Can I somehow circumvent writing the script to disk and just submit the script contents to SGE directly?
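In other words, what I'm imagining is something along these lines (a sketch of the behavior I want, assuming qsub can read a job script from stdin - I haven't confirmed that my installation supports this):

import subprocess

script_contents = generate_script_contents(sample, argument1, argument2)
# Feed the script text to qsub on stdin instead of passing a file name.
subprocess.run(["qsub"], input=script_contents, text=True, check=True)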
The reason I'm doing this is that each file I have will be split into an unknown number of mini-files that will each be subjected to some process. So for, say, 8 files I can get 40 mini-files - meaning 40 scripts physically present in my directory. It's an eyesore. I know I can delete each script after submitting it, but that's just another extra step.