Question

I have a question regarding stepper motor control while using the Microchip TCP/IP stack.

In the past I have used a timer for my stepper motor control. I set the period of a timer to the required time between pulses and then I change the motor phase outputs as needed in the timer tick ISR. In the cases where I have done this, my stepper was moving at a maximum rate of around 400 pulses per second, which means the interrupt was occurring every 2.5 milliseconds, and I was using USB for communication to the host.
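For reference, here is a minimal sketch of that timer/ISR scheme as it might look on a PIC24 with XC16; the Timer1 choice, the phase-pin mapping on RB0..RB3, and the variable names are assumptions for illustration, not from my actual code:

```c
#include <xc.h>
#include <stdint.h>

// Full-step phase patterns for the two windings (pin mapping is illustrative)
static const uint8_t phase_table[4] = { 0b1010, 0b0110, 0b0101, 0b1001 };

volatile uint16_t phase_index     = 0;
volatile uint32_t steps_remaining = 0;   // loaded by the command handler

// Timer1 ISR, with PR1 set so it fires every 2.5 ms (400 pulses per second)
void __attribute__((interrupt, no_auto_psv)) _T1Interrupt(void)
{
    IFS0bits.T1IF = 0;                                       // clear the interrupt flag
    if (steps_remaining > 0) {
        phase_index = (phase_index + 1) & 3;
        LATB = (LATB & ~0x000Fu) | phase_table[phase_index]; // drive phase pins RB0..RB3
        steps_remaining--;
    }
}
```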

I am now working on a new product which will use the TCP/IP stack to communicate with a PC over Ethernet. It will also be communicating with other devices via SPI and UART modules. This new device must be capable of operating a stepper at up to 2000 pulses per second, which means the interrupt may be firing every 0.5 milliseconds if I use the same timer/ISR approach to drive the stepper. The stepper is turned on and off based on commands received from the host, so communication with the host and operation of the motor need to occur harmoniously and simultaneously. If the stepper speed varies slightly, that would not be a problem, though it is not ideal. However, if the stepper were to pause for, say, 30 ms in the middle of its move, that would NOT be acceptable.

I am considering using a PIC24F with an instruction clock speed of 16 MHz (32 MHz/2, using the internal FRC+PLL) for this project. Do you think that the interrupts for the stepper will disrupt the Ethernet communication, or vice versa? Is there a better way of doing this?

I have considered using a separate PIC for the stepper control. I could then send that PIC target-position or halt commands to start and stop the movement, but that would add another firmware image to the mix and complicate things all around.

Solution

It depends on the hardware, and the best answer is to give it a shot and see.

Your other options are to either use a separate PIC for stepper control, as you mention, or use pseudo-threading (userland threading, which is usually not available in compilers for the PIC platform).

But perhaps what will work best for you is to have the main loop of your software control the stepper motor (move, wait for the next step time, continue) and then use interrupts to handle the incoming TCP/IP requests, which update the state registers/variables; see the sketch below.
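A rough sketch of that structure on a PIC24, assuming a free-running Timer2 and a 16 MIPS instruction clock; the variable names and the STEP pin are invented for illustration, and the comment about the Microchip stack's cooperative tasks describes the usual polled usage rather than anything specific to your project:

```c
#include <xc.h>
#include <stdint.h>

// Shared state: written by the command handler, read by the stepper loop.
volatile uint32_t steps_remaining = 0;
volatile uint16_t step_period     = 8000;  // Timer2 counts per step (0.5 ms at 16 MIPS)

int main(void)
{
    // ... clock, I/O, TCP/IP stack init; Timer2 free-running, 1:1 prescale ...
    uint16_t next_step = TMR2 + step_period;

    while (1) {
        // Issue a step exactly when the free-running timer passes the deadline
        if (steps_remaining > 0 && (int16_t)(TMR2 - next_step) >= 0) {
            LATBbits.LATB0 = 1;        // STEP pulse on an illustrative pin
            Nop(); Nop();              // real drivers need a minimum pulse width
            LATBbits.LATB0 = 0;
            next_step += step_period;
            steps_remaining--;
        }

        // Service communication between steps.  With the Microchip stack this
        // is typically StackTask() plus the application tasks; when a host
        // command arrives, the handler only updates steps_remaining/step_period.
    }
}
```

The key point is that the stepper timing comes from comparing against a hardware timer in the loop, so the communication code can take a variable amount of time without the step rate drifting, as long as no single pass through it exceeds one step period.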

Using interrupts is a good idea, but when you have something this high-priority, polling and looping are better options. To make sure everything stays smooth, you'll need to be able to guarantee that your TCP/IP interrupts won't exceed xxx cycles (or milliseconds, same thing on a PIC, really), or else add stepper-control code to the TCP/IP interrupt handler.

Now none of this would be necessary if your PIC controller has prioritized interrupts. In that case, just put the stepper interrupt on a higher priority than the TCP interrupt and you'll be good to go. However, I don't believe PIC has that for user functions, but I could be mistaken. It may also be a good idea to migrate to another platform that does support prioritized interrupts, because that'll make for much cleaner code and make your life easier overall.
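For what it's worth, the PIC24F family does expose user-assignable interrupt priorities (seven levels, set per source via the IPCx registers), so this approach may be available to you. A hedged sketch, assuming the stepper runs off Timer1 and the Ethernet controller (e.g. an ENC28J60) interrupts on INT0; both sources are illustrative choices, not from the question:

```c
#include <xc.h>

void setup_interrupt_priorities(void)
{
    IPC0bits.T1IP      = 6;   // stepper timer: near the top of the 1..7 range
    IPC0bits.INT0IP    = 2;   // Ethernet interrupt: lower priority
    INTCON1bits.NSTDIS = 0;   // leave interrupt nesting enabled (the default)
}
```

With nesting enabled, the higher-priority stepper ISR can preempt a long-running Ethernet ISR, which is exactly the behavior you want here.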

OTHER TIPS

I had the same problem: managing the microcontroller CPU's time between controlling a stepper motor and receiving data from the PC and from sensors. I solved it by dividing the control program (the pulse sequence) into 1 KB pieces and sending them to the microcontroller's memory, which freed up its CPU time.
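One way to read that suggestion is as a buffer of pre-computed step intervals that the host streams down in blocks, so the CPU only has to drain it at step time. A rough sketch of such a buffer, with all names and sizes invented for illustration:

```c
#include <stdint.h>

#define STEP_BUF_SIZE 512          // ~1 KB of 16-bit step intervals

static volatile uint16_t step_buf[STEP_BUF_SIZE];
static volatile uint16_t head = 0, tail = 0;   // comms code fills, ISR drains

// Called by the communication code when a block of step intervals arrives
int step_buf_push(uint16_t interval)
{
    uint16_t next = (head + 1) % STEP_BUF_SIZE;
    if (next == tail) return 0;    // buffer full, tell the host to wait
    step_buf[head] = interval;
    head = next;
    return 1;
}

// Called from the stepper timer ISR: take the next interval, or stop if empty
int step_buf_pop(uint16_t *interval)
{
    if (head == tail) return 0;
    *interval = step_buf[tail];
    tail = (tail + 1) % STEP_BUF_SIZE;
    return 1;
}
```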

Does the host need to start and stop the stepper on a 0.5 ms boundary? I don't have any numbers to back this up, but my feeling is that the variable latency of the Ethernet link might dominate your responsiveness to the commands, especially if there are other devices on the same network.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow