Question

To start, I should say that both devices have "Set Automatically" enabled in the Date & Time settings and use the same time zone. I use [NSDate date] to get a timestamp in milliseconds, encode it into NSData and send it to another device. On the receiver the data is decoded and the embedded timestamp is subtracted from a new [NSDate date], which should give me the total time needed to send and receive the message. At least that is what I thought, because when the sender is an iPhone 4 on iOS 6 and the receiver is an iPhone 5 on iOS 7, the receiver ends up with an earlier timestamp than the sender. I don't know why. Maybe [NSDate date] isn't the most reliable class for this kind of operation? I use GCDAsyncUdpSocket for sending/receiving UDP.

Code sender

// Build a packet: 4-byte tag, 8-byte millisecond timestamp, 8 KB random payload
NSData *data2 = [self createRandomNSData:8192];
NSMutableData *dataToSend = [NSMutableData data];
[dataToSend appendBytes:&tag length:sizeof(int)];
long long currentTimeStamp = (long long)([[NSDate date] timeIntervalSince1970] * 1000.0);
[dataToSend appendBytes:&currentTimeStamp length:sizeof(long long)];
[dataToSend appendData:data2];
NSLog(@"%i || %lld || %lu", tag, currentTimeStamp, (unsigned long)[dataToSend length]);
// Multicast the packet over UDP
[_udpSocket sendData:dataToSend toHost:@"230.0.0.1" port:_port withTimeout:-1 tag:tag];
tag++;

Code receiver

const char *dataBytes = [data bytes]; // raw pointer to the packet (unused below)
int inTag;
long long inCurrentTimeStamp;
// Unpack the 4-byte tag and the 8-byte millisecond timestamp
[data getBytes:&inTag length:sizeof(int)];
[data getBytes:&inCurrentTimeStamp range:NSMakeRange(sizeof(int), sizeof(long long))];
// Subtract the sender's timestamp from the local clock to estimate the delay
long long currentTimeStamp = (long long)([[NSDate date] timeIntervalSince1970] * 1000.0);
long long timeStampDiff = currentTimeStamp - inCurrentTimeStamp;
self.delay = timeStampDiff;
NSLog(@"%i || %lld || %lu", inTag, timeStampDiff, (unsigned long)[data length]);

Solution

NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
[dateFormatter setDateFormat:@"MM/dd/yyyy hh:mm:ss"];
[dateFormatter setLocale:[[NSLocale alloc] initWithLocaleIdentifier:@"en_US_POSIX"]];
NSString *strSystemTime = [dateFormatter stringFromDate:[NSDate date]];

I faced the same issue and resolved it by setting NSLocale. I hope this solution works for you too.
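For reference, a minimal sketch of using such a formatter on both ends might look like the following; the 24-hour HH pattern, the millisecond field, and the pinned UTC time zone are my additions, not part of the answer above:

NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
formatter.locale = [[NSLocale alloc] initWithLocaleIdentifier:@"en_US_POSIX"];
formatter.dateFormat = @"MM/dd/yyyy HH:mm:ss.SSS";
formatter.timeZone = [NSTimeZone timeZoneWithName:@"UTC"];

// Sender: turn the current time into a string and put it in the packet
NSString *sentString = [formatter stringFromDate:[NSDate date]];

// Receiver: rebuild the NSDate with an identically configured formatter
NSDate *receivedDate = [formatter dateFromString:sentString];
NSLog(@"sent %@ -> parsed %@", sentString, receivedDate);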

OTHER TIPS

Don't send the NSDate's timeIntervalSince1970 multiplied by 1000 as a long long. Send the timeIntervalSince1970 directly, as a double; that preserves the full resolution of the date.

Simply append sizeof(double) bytes to your data.

Log the double value and its byte stream before sending, then the double value and its byte stream on receipt on the remote device, and compare them.
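A minimal sketch of that idea, assuming a payload that starts with the raw 8-byte double (the variable names below are illustrative, not taken from the question's code):

// Sender side: append the interval as a raw double (8 bytes)
NSTimeInterval interval = [[NSDate date] timeIntervalSince1970];
NSMutableData *payload = [NSMutableData data];
[payload appendBytes:&interval length:sizeof(double)];
NSLog(@"sending %f, bytes: %@", interval, payload);   // NSData logs as hex

// Receiver side: copy the same 8 bytes back into a double
NSTimeInterval received = 0;
[payload getBytes:&received range:NSMakeRange(0, sizeof(double))];
NSLog(@"received %f, bytes: %@", received, payload);

// Note: this assumes both devices share the same endianness and double
// layout, which holds for the ARM iPhones mentioned in the question.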

If both devices are phones on the same network, and you have set them to update their clocks automatically (Settings > General > Date & Time), then their clocks should be synchronized to within a fraction of a second.

On the receiver the data is decoded and the embedded timestamp is subtracted from a new [NSDate date]

That's the problem, and it has nothing to do with the iOS version installed on the device. In general, if your timestamps are produced by different computers, you cannot subtract one from the other and expect any precision from the result: because of clock skew, the device clocks simply aren't synchronized tightly enough to measure network latency.

Consider this simplistic example: let's say the clocks on computers Alice and Bob are 10 seconds apart: when Alice's clock shows 12:00:00, Bob's clock shows 12:00:10.

Alice sends Bob her timestamp, which says 14:23:06. The packet takes one second to reach Bob, so Bob's clock shows 14:23:17 when it arrives. If Bob simply subtracts Alice's timestamp from his own, he concludes that the packet took 11 seconds to reach him.

If Bob now sends Alice his timestamp, say 14:23:18, Alice receives it one second later, which by her clock is 14:23:09. Alice would conclude that the packet took -9 (yes, negative nine!) seconds to reach her, which makes no sense at all.

Fortunately, if it is fair to assume that the latency is the same on both legs of the round trip, you can measure it by factoring out the clock skew. The idea is to obtain two pairs of timestamps constructed so that the skew enters both differences, but with opposite sign.

Consider the timestamps from the above example:

A1=14:23:06 B1=14:23:17
B2=14:23:18 A2=14:23:09

Each pair, A1-B1 and B2-A2, contains the skew, but in the first pair it enters with a positive sign and in the second with a negative sign. Hence, if you average the two time differences, the skew cancels and you are left with the one-way delay.

((B1-A1)+(A2-B2)) / 2 =
(11 + -9) / 2         =
2 / 2                 = 1 second

This should be enough for you to implement a simple program that measures the latency in your system from a single round trip.
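As a rough sketch of that calculation (the helper names and NSTimeInterval signatures are mine, not part of the answer), assuming A1/A2 were read from Alice's clock and B1/B2 from Bob's:

// A1 = time Alice sent the request,   by Alice's clock
// B1 = time Bob received it,          by Bob's clock
// B2 = time Bob sent the reply,       by Bob's clock
// A2 = time Alice received the reply, by Alice's clock
static NSTimeInterval OneWayDelay(NSTimeInterval A1, NSTimeInterval B1,
                                  NSTimeInterval B2, NSTimeInterval A2)
{
    // The skew cancels when the two differences are averaged,
    // provided the latency is the same in both directions.
    return ((B1 - A1) + (A2 - B2)) / 2.0;
}

static NSTimeInterval ClockSkew(NSTimeInterval A1, NSTimeInterval B1,
                                NSTimeInterval B2, NSTimeInterval A2)
{
    // Half the difference of the two differences is how far
    // Bob's clock runs ahead of Alice's.
    return ((B1 - A1) - (A2 - B2)) / 2.0;
}

Plugging in the timestamps from the example gives a delay of 1 second and a skew of 10 seconds.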

// Whole seconds since 1970
long intervalValue = (long)[[NSDate date] timeIntervalSince1970];
NSString *intervalString = [NSString stringWithFormat:@"%ld", intervalValue];

// Pad the string with trailing zeros until it is 13 digits long,
// i.e. roughly a millisecond-resolution timestamp
int dif = 13 - (int)[intervalString length];
for (int k = 0; k < dif; k++) {
    intervalString = [NSString stringWithFormat:@"%@0", intervalString];
}

// Note: 0530 is an octal literal in C, so this adds 344, not 530
unsigned long long convertedValue = [intervalString longLongValue] + 0530;