As much as I wish this were a simple task, I don't think it will be.
Essentially, Unity has only one native iOS UI element: the main view, which takes up the entire screen.
The only way I can think of putting an automation system together is to ask Unity itself where things are within the screen, so you can simulate the events at those x, y coordinates with Instruments' UIAutomation:
UIATarget.localTarget().tap({x:100, y:200});
But to do this you will likely need to build a native plugin for Unity.
Native plugins are not horribly complicated, but they do require careful design and a platform-specific setup for each operating system.
Here is a link to a playlist with a few iPhone native-plugin videos that will walk you through the process of building a native plugin that interfaces with Objective-C code: SuperLazyCoder Youtube Playlist. Watch that video and the next two to get the full picture of iPhone plugins.
The rest is a matter of putting together a Unity script that converts your objects' positions into screen coordinates and pipes those back to your test application.
Hope that helps :) I would be eager to see the outcome.