Firefly and PC GAMESS-related discussion club


 
 



Re: Job does not stop after TIMLIM is reached

Alex Granovsky
gran@classic.chem.msu.su


Dear Bernhard,

certain parts of Firefly, including the MCSCF code, do not correctly
check for TIMLIM when running in parallel, and this may result in
hang-ups. Moreover, it is not always possible for Firefly to perform
a graceful shutdown, especially when running multi-threaded code.

I'll try to improve TIMLIM handling in the next minor version of Firefly.

Note that many parts of Firefly simply ignore TIMLIM completely,
so the use of TIMLIM is not a universal solution.
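To illustrate why, here is a generic sketch of cooperative time-limit checking, the pattern TIMLIM implies. Firefly's internals are not public, so this is not its actual code; the names and the toy limit are hypothetical. The point is that the limit is only honored at checkpoints between work units, so a thread stuck inside a long uninterruptible kernel (or a part of the code with no checkpoint at all) keeps running past the limit:

```python
import time

TIME_LIMIT = 2.0  # seconds; stands in for TIMLIM (hypothetical value)

def run_iterations(n_iters, work_per_iter):
    """Cooperative time-limit pattern: the limit is honored only at
    the checkpoints between iterations, never mid-iteration."""
    start = time.monotonic()
    done = 0
    for _ in range(n_iters):
        work_per_iter()  # an uninterruptible chunk of work
        done += 1
        if time.monotonic() - start > TIME_LIMIT:
            # graceful shutdown point: save restart data, then stop
            return done, "TIME LIMIT REACHED"
    return done, "CONVERGED"
```

If `work_per_iter` spawns threads that never reach the checkpoint, or if a code path omits the check entirely, the process keeps computing even after the "RUNNING OUT OF CPU TIME" message has been printed.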

All the best,
Alex





On Tue Feb 21 '17 11:35pm, Bernhard Dick wrote
----------------------------------------------
>Dear Alex,
>I usually set TIMLIM to a very large value and run Firefly on my desktop. Now I have it installed on a Linux cluster where I have to specify the runtime. We see that Firefly writes a message in the output file that time is up, but it nevertheless runs into the time limit set by the operator and is killed.

>I checked this behavior on my desktop (under Windows), and it appears to be the same: I set TIMLIM=2400, and Firefly writes into the output file:
>
>
> ITER    TOTAL ENERGY          DEL(E)    LAG.ASYMM.  SQCDF  MICIT   DAMP
>     ----------START APPROXIMATE SECOND ORDER MCSCF----------
>   1    -262.902475823    -262.902475823  0.010114 5.018E-04  1   0.0000
>   2    -262.904007205      -0.001531381  0.002691 1.113E-04  1   0.0000
> RUNNING OUT OF CPU TIME...
> MCSCF IS NOT CONVERGED!
> A $VEC GROUP OF CURRENT MO-S IS IN THE PUNCH FILE
> USE THIS WITH GUESS=MOREAD TO RESTART THIS RUN

> FINAL MCSCF ENERGY IS        0.0000000000 AFTER   2 ITERATIONS

>but the processors continue to run at full speed (100% in Task Manager), until I stop the run manually.

>What do I need to do to make Firefly stop on its own?
>best regards,
>Bernhard
>


Fri Feb 24 '17 0:10am