Maybe you should check how the maximum number of AIX data segments that a process can use is set (the LDR_CNTRL environment variable). See the IBM Performance Tuning Guide. On AIX, ulimit lies if it tells you the data size is unlimited.
Question
In our production environment, when we execute an .so file as part of a batch job, we always encounter a fatal error like the one below:
calloc failed for 9088 bytes Date 12-07-2013 01:55:05
Could you please let me know the possible reasons for this calloc memory issue?
Solution
There are a number of possible reasons for this:
- The process ran out of memory. Relatively unlikely, and you've discounted this.
- The control information that calloc() uses has been corrupted by an overwrite, leading to an incorrect report that the program is out of memory. This could happen because your program wrote outside the bounds of memory it was allocated.
- The line of code that reports the error is written incorrectly.
Since the size is small (9088 bytes), it is unlikely that you've run into arithmetic overflow issues.
Often, you'd be advised to use valgrind to detect memory problems. However, that isn't available on AIX. Maybe you have Purify; that is an excellent tool for the job (but it is not free software).
There may be two reasons behind this:
1) You do not have sufficient memory.
2) calloc() is not getting a contiguous memory block of the requested size.
In the first case you do not have any option, but in the second case you can break your one calloc() call into two, three, or more calls to get smaller blocks of contiguous memory.