PC GAMESS/Firefly-related discussion club





Re: xdlb=.t. and mixed=.t. keywords cooperation

Alex Granovsky
gran@classic.chem.msu.su


Hi,

There are three disk-intensive stages in an MP2 gradient calculation using Firefly's new code.

The first one is the first half-transformation.
It uses dynamic load balancing (dlb) as well as p2p communications and disk I/O.
However, as the result of this stage, the half-transformed integrals
are evenly distributed across all the processes (nodes). This means
that equal I/O and computation are performed by every process
during the second and third stages.
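The first-stage behavior can be sketched in miniature. This is a hypothetical illustration of dynamic load balancing (not Firefly's actual implementation): idle workers pull the next task from a shared queue, so a faster node automatically processes more tasks instead of waiting on a fixed assignment.

```python
import queue
import threading

def run_dlb(tasks, n_workers):
    """Dynamic load balancing sketch: each worker pulls the next task
    as soon as it becomes idle, so faster workers do more of the work."""
    work = queue.Queue()
    for t in tasks:
        work.put(t)
    done = [[] for _ in range(n_workers)]

    def worker(rank):
        while True:
            try:
                t = work.get_nowait()
            except queue.Empty:
                return              # no tasks left; this worker is finished
            done[rank].append(t)    # stand-in for the compute + I/O on task t

    threads = [threading.Thread(target=worker, args=(r,)) for r in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return done

# Every task is processed exactly once, but the per-worker split is
# decided at run time rather than fixed in advance.
parts = run_dlb(list(range(100)), 4)
assert sorted(t for p in parts for t in p) == list(range(100))
```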

The second stage (the second half-transformation) uses disk I/O and p2p
communication but does not use dlb.

Finally, the last stage (the two-electron gradient code) uses neither p2p
nor dlb, only disk I/O.
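By contrast, the second and third stages start from a fixed, even split of the half-transformed data. A minimal sketch of such a static distribution (hypothetical function name, round-robin over pair indices) might be:

```python
def static_pairs(n_pairs, n_procs):
    """Static even distribution sketch: process `rank` owns a fixed
    slice of the pair list, regardless of its CPU or disk speed."""
    return [list(range(rank, n_pairs, n_procs)) for rank in range(n_procs)]

slices = static_pairs(10, 4)
# every pair is owned by exactly one process
assert sorted(p for s in slices for p in s) == list(range(10))
# the slices are as even as possible (sizes differ by at most one)
sizes = [len(s) for s in slices]
assert max(sizes) - min(sizes) <= 1
```

With a static split like this, each process's share is fixed in advance, so these stages finish only when the slowest process (for example, the one with slower disks) has completed its slice.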

Note that no part of Firefly is explicitly load-balanced for equal
disk I/O load; only the overall CPU load is balanced.
Finally, the mixed=.t. keyword does not affect the details of any
parallelization scheme.

regards,
Alex Granovsky


On Mon Nov 16 '09 2:40pm, Sergey wrote
--------------------------------------
>Dear all,

>can anyone give me a hint as to how PCG/FF will deal with the following situation:

>Workstation A: 4 cores, 15 GFlops/core, 4 disks at 130 MB/s
>Workstation B: 4 cores, 10 GFlops/core, 4 disks at 60 MB/s

>when working over p2p with xdlb=.t. and mixed=.t. enabled on MP2 optimization jobs?

>Specifically, this results in ~60% efficiency on the second half-transformation of MP2 for workstation A, and ~30% for workstation B.

>Question: will xdlb=.t. distribute pairs so that neither workstation A nor B waits for the other, i.e. scale according to disk throughput capacity? Or will the pairs be distributed according to the ratio of processor performance, so that the better disk performer always waits? Or will it adapt to the disk performance automatically, based on averages calculated from the previous optimization cycle?

>Thank you!


Sat Nov 21 '09 1:04pm