Question

When I give the Keil compiler the "--callgraph" option, it statically calculates the exact "Maximum Stack Usage" for me.

Alas, today it is giving me a "Maximum Stack Usage = 284 bytes + Unknown(Functions without stacksize...)" message, along with a list of "Functions with no stack information".

Nigel Jones says that recursion is a really bad idea in embedded systems ("Computing your stack size" 2009), so I've been careful not to make any mutually recursive functions in this code.

Also, I make sure that none of my interrupt handlers ever re-enable interrupts until their final return-from-interrupt instruction, so I don't need to worry about re-entrant interrupt handlers.

Without recursion or re-entrant interrupt handlers, it should be able to statically determine the maximum stack usage (and so most of the answers to How to determine maximum stack usage? do not apply). My understanding is that the software that handles the "--callgraph" option first finds the maximum stack depth of each interrupt handler when it is not interrupted by a higher-priority interrupt, and the maximum stack depth of the main() function when it is not interrupted at all. It then adds these together to find the total worst-case maximum stack depth. That occurs when the main() background task is at its maximum depth at the moment it is interrupted by the lowest-priority interrupt, that interrupt is at its maximum depth when it is interrupted by the next-lowest-priority interrupt, and so on.
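
For example (with made-up numbers), I'd expect it to report the sum of the individual worst cases, something like:

    main() at its deepest call chain:             120 bytes
    + lowest-priority ISR at its deepest chain:    64 bytes
    + highest-priority ISR at its deepest chain:   48 bytes
    --------------------------------------------------------
    worst-case maximum stack usage:               232 bytes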

I suspect the software that handles --callgraph is getting confused by the small assembly-language functions in the "Functions with no stack information" list. The --callgraph documentation seems to imply that I need to manually calculate (or make a conservative estimate of) how much stack they use -- they're very short, so that should be simple -- and then "Use frame directives in assembly language code to describe how your code uses the stack." One of them is the initial startup code that resets the stack to zero before jumping to main() -- so, in effect, it consumes zero stack. Another is the "Fault" interrupt handler that locks up in an infinite loop until I cycle the power -- it's safe to assume it consumes zero stack.
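
Based on my reading of the documentation, I'm guessing the frame directives are meant to be used something like this (an untested sketch -- the handler names, register list, and sizes are just placeholders for my actual code):

    ; Wrap each assembly function in PROC/ENDP so that FRAME directives
    ; are accepted, and place FRAME PUSH immediately after the
    ; instruction that pushes registers, so the tools can see that this
    ; handler uses 8 bytes of stack.
            AREA    |.text|, CODE, READONLY
            THUMB

    MyAsmHandler PROC
            EXPORT  MyAsmHandler
            PUSH    {r4, lr}           ; uses 8 bytes of stack
            FRAME   PUSH {r4, lr}      ; describe that push to the tools
            ; ... short handler body ...
            POP     {r4, pc}
            ENDP

    ; The "Fault" handler never returns and never touches the stack, so
    ; I assume wrapping it in PROC/ENDP with no stack-modifying
    ; instructions is enough to tell the tools it needs zero bytes.
    MyFaultHandler PROC
            EXPORT  MyFaultHandler
            B       MyFaultHandler     ; lock up until power is cycled
            ENDP

            END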

I'm using the Keil uVision V4.20.03.0 to compile code for the LM3S1968 ARM Cortex-M3.

So how do I use "frame directives" to tell the software that handles "--callgraph" how much stack these functions use? Or is there some better approach to determine maximum stack usage?

(See How to determine maximum stack usage in embedded system with gcc? for almost the same question targeted to the gcc compiler.)

Solution

Use the --info=stack linker option. The map file will then include stack usage information for all functions with external linkage.

In a single tasking environment, the stack usage for main() will give you the total requirement. If you are using an RTOS such as RTX where each task has its own stack, then you need to look at the stack usage for all task entry points, and then add some more (64 bytes in the case of RTX) for the task context storage.
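
For example (hypothetical figures): if the map file reports a worst-case usage of 200 bytes for task_a() and 120 bytes for task_b(), the stacks you allocate for those tasks need at least:

    task_a():  200 + 64 (task context) = 264 bytes
    task_b():  120 + 64 (task context) = 184 bytes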

This and other techniques, applicable to Keil and more generally, are described here.

OTHER TIPS

John Regehr of the University of Utah has a good discussion of measuring stack usage in embedded systems at http://www.embedded.com/design/prototyping-and-development/4025013/Say-no-to-stack-overflow, though note that the link to ftp.embedded.com is stale, and one occurrence of “without interrupts disabled” should have either the first or last word negated. In the commercial world, Coverity has a configurable stack overflow checker, and some versions of CodeWarrior have a semi-documented warn_stack_usage pragma. (It’s not mentioned in my version of the compiler documentation, but is in MetroWerks’ “Targeting Palm OS” document.)

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow