From: Ashar Malik (
Date: Sat Jan 21 2023 - 05:36:52 CST

Let me simplify what I have gathered from this.

You have, say, 5 Python scripts ….

You want to run them in an automated manner?

Why not make a master Python script and use the 'os' or 'subprocess'
library to trigger the 5 scripts?
Sorry, I'm not a bash person - so I'll suggest an all-Python solution.

My master script would read something like this:

import os, subprocess
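
A minimal sketch of such a master script (the script names s1.py … s5.py
and the helper name run_in_sequence are placeholders of mine, not your
actual filenames):

```python
import subprocess
import sys

def run_in_sequence(scripts):
    """Run each script in order. subprocess.run() blocks until the
    child process exits, so each script starts only after the previous
    one has finished; check=True raises CalledProcessError if a script
    fails, stopping the chain instead of silently continuing."""
    for script in scripts:
        subprocess.run([sys.executable, script], check=True)

# Placeholder names for your colleague's analysis scripts:
# run_in_sequence(["s1.py", "s2.py", "s3.py", "s4.py", "s5.py"])
```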


The above structure will run s2 only when s1 finishes whatever it’s doing.

Also, I would structure the VMD command via subprocess like this:

subprocess.Popen("vmd -dispdev text -eofexit < tclscript.tcl", ….)

The above structure allows you to capture the stdout dump from the shell
using subprocess.PIPE - if you don't want that, and you are writing to
disk in your tcl script, you can simply run it through
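
A sketch of that capture pattern (the helper name run_and_capture is
mine; the vmd command line is as in the post, and shell=True is what
lets the shell interpret the "< tclscript.tcl" redirection):

```python
import subprocess

def run_and_capture(cmd):
    """Run a shell command line and return everything it printed.
    shell=True makes a shell interpret any redirections in the command;
    communicate() blocks until the process exits and drains the pipe,
    avoiding a deadlock on a full stdout buffer."""
    proc = subprocess.Popen(
        cmd,
        shell=True,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,  # fold stderr into the same stream
        text=True,
    )
    output, _ = proc.communicate()
    return output

# e.g. log = run_and_capture("vmd -dispdev text -eofexit < tclscript.tcl")
```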

os.system("vmd -dispdev text -eofexit < tclscript.tcl > /dev/null 2>&1")

Not sure if I understood your problem well enough - but the above
structure works for me, and I have opened and closed VMD within one and
across many Python scripts this way.

Hope this helps.
Otherwise write back and someone can also suggest a bash solution. Good
luck.

On Sat, 21 Jan 2023 at 7:19 PM, Ryan Woltz <> wrote:

> Dear community,
> I'm not exactly sure where to go with this question but hopefully
> someone can give me some direction. My colleague made multiple python
> scripts that open vmd and do a series of analysis that take 1-2 hours. I'd
> like to combine them into a single master script and run the full
> analysis for a continuous 24-48 hours instead of micromanaging 20+
> analyses which need to be restarted at staggered intervals. I do not
> write in python and do most scripting in bash.
> The problem is when I use a bash master script that calls multiple
> python scripts that open vmd, it gets "stopped" on the second opening of vmd