5 million records doesn't sound like a lot to throw on Hadoop. What's the size of your data in GB?
I don't know of any Hadoop monitoring tools specific to Windows, but you should start with the basics. Is your data splittable? Check the ResourceManager's view: how many containers did your MapReduce application get, and were they distributed across all machines? (The Capacity Scheduler tends not to spread the load over several machines if it can fit all of it on one.) Also look at CPU usage and I/O per task attempt.
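As a starting point, assuming the standard `yarn` CLI is on your path, you can check node utilization and container placement from the command line (the application id below is a made-up example; use your own):

```shell
# List all NodeManagers with their running-container counts and memory/CPU usage,
# to see whether work is actually spread across the cluster
yarn node -list -all

# Show the status of your MapReduce application (replace with your real id)
yarn application -status application_1400000000000_0001
```

If all containers land on one node, that points at the scheduler configuration rather than your job.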
You should also record, compare, and analyze Windows performance counters (CPU, I/O, network) to see whether you have any bottlenecks.
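For collecting those counters, a minimal sketch using Windows' built-in `typeperf` tool (counter paths assume an English-language Windows; the interval and counts are arbitrary choices):

```shell
# Sample CPU, disk, and network counters every 15 s, 240 samples (~1 hour),
# writing to a CSV you can compare across runs and across machines
typeperf "\Processor(_Total)\% Processor Time" "\PhysicalDisk(_Total)\Disk Bytes/sec" "\Network Interface(*)\Bytes Total/sec" -si 15 -sc 240 -o counters.csv -f CSV
```

Run it on each node during a job, then line the CSVs up side by side; a node pegged at 100% CPU or disk while the others idle is your bottleneck.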