It has been reported that, in some cases, the following error occurs when TextWrangler is used during a job run to view the resulting output file (irrespective of how the job was actually launched):
"TextWrangler sometimes produces error messages which I imagine have to do with the file becoming too large."
"There is insufficient memory to complete this operation (MacOS Error code: -108)"
According to Apple (http://support.apple.com/kb/HT1618), this error (-108) is called memFullErr, meaning that the program "Ran out of memory [not enough room in heap zone]". The error occurs because each time the file changes on disk and is reloaded, TextWrangler writes the new revision to heap memory. This is done so that the program can support undo and other revert functions for a document that has changed since it was initially opened. While useful for developers (the primary users of TextWrangler), this behavior is neither useful nor logical when the sole purpose is viewing the output file of a Firefly job run. Unfortunately, it seems that the ability to open a document in a read-only mode is not supported at this time in TextWrangler. While a pure 64-bit version of TextWrangler (also not available at this time, as it is distributed as a 32-bit Universal binary) would make this error less likely to occur, the process might still fill up heap memory needlessly and cause many other problems in the process (such as squeezing out valuable memory needed for the Firefly jobs themselves!).
This error is believed to affect all Mac OS versions on which Firefly for Mac can run (Tiger, Leopard, and Snow Leopard), and most likely also affects systems running either the 32-bit hybrid (default) kernel or the pure 64-bit kernel (a boot-time option on most newer machines). In general, this issue is most likely to appear for larger output files (such as geometry optimizations or transition state searches) or for very verbose output options on bigger job runs. Lastly, this memory error is also more likely to appear if insufficient additional memory is available for TextWrangler during a job run (for example, if almost all of the available memory is used by the Firefly job itself).
Find below a list of possible workarounds for this known issue. These suggestions will also be included in the next Beta and official documentation updates for release 7.1.H when it becomes available.
1) When this memory error occurs, go to Edit -> Undo History to free up memory for a while (the heap memory buffer is cleared by doing this). The program will then continue to automatically refresh the Firefly output file as normal. Note that you may need to manually re-open the output file after experiencing the reported memory error.
2) Turn off the option "Automatically refresh documents as they change on disk". This checkbox is available under TextWrangler -> Preferences -> Application. With this option disabled (unchecked), you can instead refresh the file contents when desired via File -> Reload from Disk. Disabling auto-refresh in favor of this manual-refresh style is a complete solution to the issue, although the convenience of automatic refreshing of the output file is lost.
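As a side note, if a quick look at the most recent output is all that is needed, the standard command-line tools avoid the heap issue entirely, since they never buffer the whole document in memory. A minimal sketch follows; "job.out" is a hypothetical file name standing in for your actual Firefly output file:

```shell
# Create a stand-in for a growing Firefly output file ("job.out" is a
# hypothetical name; in practice this file is written by Firefly itself).
printf 'step 1\nstep 2\nstep 3\n' > job.out

# Show only the most recent lines; tail reads from the end of the file
# rather than loading the whole document.
tail -n 2 job.out

# To follow the file live as Firefly appends to it, use instead:
#   tail -f job.out     (Ctrl-C to stop)
#   less +F job.out     (Ctrl-C, then q, to exit)
```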
3) If possible, add the NPRINT=-5 option to the $CONTRL section of your input file to minimize the size of the resulting Firefly output file. This naturally reduces the size of the heap memory buffer used by TextWrangler (possibly by a factor of five or more) and might make the issue go away without any other changes. You will need to read the relevant Firefly documentation to determine whether this smaller output option is suitable for your particular needs (or else just try it and see for yourself).
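As a rough illustration, a $CONTRL group with this option added might look like the fragment below; the SCFTYP and RUNTYP keywords shown are placeholders standing in for whatever settings your job already uses, with only NPRINT=-5 being the relevant addition:

```
 $CONTRL SCFTYP=RHF RUNTYP=OPTIMIZE NPRINT=-5 $END
```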
I would like to thank Nick Greeves for pointing out this bug, and the software developers at Bare Bones (who release TextWrangler) for their various tips and comments.
If any users have additional related tips, comments, or problems, please add them to this thread.