VMD-L Mailing List
From: Oliver Beckstein (orbeckst_at_jhmi.edu)
Date: Tue Nov 20 2007 - 14:28:45 CST
 
Hi,
is there a way to analyse trajectories that are larger than the
available RAM? For instance, I have trajectories > 5 GiB in size that I
would like to analyse with VolMap, but they can't be loaded because VMD
insists on keeping the whole trajectory in memory.
A cumbersome workaround would be to split the trajectory into smaller
chunks, run volmap on each chunk, and then average the resulting dx files.
However, I can think of situations where a simple average is not enough
(for instance, time correlation functions), and it would be very
convenient if one could just have a (Python-style) iterator over a
trajectory, similar to the 'for timestep in universe.dcd: ...' idiom
in http://code.google.com/p/mdanalysis/ (two sketches of what I mean
follow below).
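
To make the averaging step concrete: each per-chunk density is itself a
mean over that chunk's frames, so if the chunks differ in length they
have to be weighted by frame count rather than averaged naively. A
minimal sketch in Python (the file names and frame counts are made up,
and I am assuming the gridData module from the GridDataFormats package
for dx I/O; any dx reader would do):

    from gridData import Grid   # assumed dx reader/writer (GridDataFormats)

    chunk_files = ["chunk1.dx", "chunk2.dx", "chunk3.dx"]  # hypothetical volmap outputs
    nframes = [1000, 1000, 842]                            # frames per chunk (hypothetical)

    grids = [Grid(f) for f in chunk_files]
    # weight each chunk's mean density by the number of frames it represents
    avg = sum(n * g.grid for n, g in zip(nframes, grids)) / sum(nframes)
    Grid(avg, edges=grids[0].edges).export("average.dx")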
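
And to illustrate the kind of streaming analysis such an iterator makes
possible: collect a per-frame quantity while holding only one timestep
in memory, then compute a time correlation function from the full series
afterwards. A sketch of the pattern (file names are hypothetical, and
the names select_atoms, trajectory, and radius_of_gyration are from a
later MDAnalysis API than the universe.dcd idiom quoted above):

    import numpy as np
    from MDAnalysis import Universe

    u = Universe("system.psf", "system.dcd")   # hypothetical files
    protein = u.select_atoms("protein")

    rg = []
    for ts in u.trajectory:        # one Timestep in memory at a time
        rg.append(protein.radius_of_gyration())
    rg = np.asarray(rg)

    # autocorrelation of the fluctuations: this couples frames across any
    # chunk boundary, which is why per-chunk averaging is not enough
    d = rg - rg.mean()
    acf = np.correlate(d, d, mode="full")[len(d) - 1:]
    acf /= acf[0]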
(Note: I don't think that increasing swap space is a solution, because
that leads to the computer almost grinding to a halt when the trajectory
is loaded.)
Thanks,
Oliver
--
Oliver Beckstein * orbeckst_at_jhmi.edu
Johns Hopkins University, School of Medicine
Dept. of Physiology, Biophysics 206
725 N. Wolfe St, Baltimore, MD 21205, USA
Tel.: +1 (410) 614-4435