Question

I am trying to send some strings and image data from a Python script to an Objective-C application running on OS X.

I am collecting the transmitted data using GCDAsyncSocket and appending it to an NSMutableData until the server disconnects. I am then processing that NSData and splitting it into its original parts.

The transmitted data consists of the following:

ID string, filled out to 16 bytes.

Image number string, filled out to 16 bytes.

Raw image data.

Termination string, filled out to 16 bytes.
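
For reference, here is a small Python sketch of that framing (the function names are mine, purely illustrative): two 16-byte padded string fields, the raw JPEG bytes, then a 16-byte padded terminator.

```python
# Sketch of the wire format described above (field names are illustrative).
FIELD = 16  # each string field is padded to 16 bytes


def pack_message(camera_id, image_number, image_bytes, terminator="end"):
    """Build one message: two 16-byte padded strings, raw image data, 16-byte terminator."""
    return (str(camera_id).ljust(FIELD).encode("utf-8")
            + str(image_number).ljust(FIELD).encode("utf-8")
            + image_bytes
            + terminator.ljust(FIELD).encode("utf-8"))


def unpack_message(buf):
    """Split a complete message back into its original parts."""
    camera_id = buf[:FIELD].decode("utf-8").strip()
    image_number = buf[FIELD:2 * FIELD].decode("utf-8").strip()
    image_bytes = buf[2 * FIELD:-FIELD]
    terminator = buf[-FIELD:].decode("utf-8").strip()
    return camera_id, image_number, image_bytes, terminator
```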

The problem is that I am not receiving the last chunk of data: I end up missing the end of the JPEG image, resulting in a corrupt (though mostly displayed) image and a missing termination string.

Here is the code I am using with GCDAsyncSocket to receive the data and process it:

Socket connection:

- (void)socket:(GCDAsyncSocket *)sock didAcceptNewSocket:(GCDAsyncSocket *)newSocket
{
// This method is executed on the socketQueue (not the main thread)

@synchronized(connectedSockets)
{
    [connectedSockets addObject:newSocket];
}

NSString *host = [newSocket connectedHost];
UInt16 port = [newSocket connectedPort];

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {

        [self logInfo:FORMAT(@"Accepted client %@:%hu", host, port)];

    }
});

[newSocket readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];

}

Socket Data Received

- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
// This method is executed on the socketQueue (not the main thread)

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {

        NSLog(@"Thread Data Length is %lu", (unsigned long)[data length]);
        if (!imageBuffer){
            imageBuffer = [[NSMutableData alloc]init];
        }

        [imageBuffer appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
        NSLog(@"Total Data Length is %lu", (unsigned long)[imageBuffer length]);

    }
});

// Echo message back to client
[sock writeData:data withTimeout:-1 tag:ECHO_MSG];
[sock readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];
}

Socket Disconnected

- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
if (sock != listenSocket)
{
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {

            [self logInfo:FORMAT(@"Client Disconnected")];
            NSData *cameraNumberData;
            NSData *imageNumberData;
            NSData *imageData;
            NSData *endCommandData;
            //if ([data length] > 40){
            cameraNumberData = [imageBuffer subdataWithRange:NSMakeRange(0, 16)];
            imageNumberData = [imageBuffer subdataWithRange:NSMakeRange(16, 16)];
            imageData = [imageBuffer subdataWithRange:NSMakeRange(32, [imageBuffer length]-34)];
            endCommandData = [imageBuffer subdataWithRange:NSMakeRange([imageBuffer length]-16, 16)];
            //}
            NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
            NSString *imageNumberString = [[NSString alloc] initWithData:imageNumberData encoding:NSUTF8StringEncoding];
            NSString *endCommandString = [[NSString alloc] initWithData:endCommandData encoding:NSUTF8StringEncoding];
            NSImage* image = [[NSImage alloc]initWithData:imageData];
            if (cameraNumberString)
            {
                NSLog(@"Image received from Camera no %@", cameraNumberString);
                [self logMessage:cameraNumberString];
            }
            else
            {
                [self logError:@"Error converting received data into UTF-8 String"];
            }

            if (imageNumberString)
            {
                NSLog(@"Image is number %@", imageNumberString);
                [self logMessage:imageNumberString];
            }
            else
            {
                [self logError:@"Error converting received data into UTF-8 String"];
            }

            if (image)
            {
                NSLog(@"We have an image");
                [self.imageView setImage:image];
            }
            else
            {
                [self logError:@"Error converting received data into image"];
            }

            if (endCommandString)
            {
                NSLog(@"Command String is %@", endCommandString);
                [self logMessage:endCommandString];
            }
            else
            {
                [self logError:@"No command string"];
            }

            //self.imageBuffer = nil;

        }
    });

    @synchronized(connectedSockets)
    {
        [connectedSockets removeObject:sock];
    }
}
}

I have used Wireshark, and the data is being transmitted; it's just not getting through GCDAsyncSocket.

So, I'm obviously missing something. Socket programming and encoding/decoding of data like this is relatively new to me, so I am probably being an idiot.

Help greatly appreciated!

Thanks

Gareth

Solution

Ok, so I finally got this working. It involved modifying the transmitting Python code to send a completion string at the end of the data, and watching for that on the receiving side. The biggest takeaway was that I needed to re-issue the readDataToData: call each time the socket read some data; otherwise it would just sit there and wait, and the transmitting socket would also just sit there.

I also had to re-issue that second read with a tag, so I could store the received data in the correct NSMutableData object in an NSMutableArray; otherwise, after the first read, I had no way of knowing which transmitting socket the data was coming from, as the ID was only at the beginning of the first message.
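
That "keep reading until the delimiter arrives" loop, sketched in plain Python rather than GCDAsyncSocket (the function and its callable argument are illustrative, not part of the app):

```python
def read_until(sock_recv, delimiter=b"end", chunk_size=4096):
    """Keep reading until the delimiter shows up in the buffer.

    Mirrors re-issuing readDataToData: after every didReadData: callback;
    sock_recv is any callable like socket.recv (takes a size, returns bytes).
    """
    buf = bytearray()
    while delimiter not in buf:
        chunk = sock_recv(chunk_size)
        if not chunk:  # peer closed before the delimiter arrived
            break
        buf.extend(chunk)
    return bytes(buf)
```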

Here is the didReadData code:

- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{

dispatch_async(dispatch_get_main_queue(), ^{
    @autoreleasepool {

        NSInteger cameraNumberNumber = 0;
        NSString *cameraNumberString = [[NSString alloc]init];

        if (tag > 10){

            cameraNumberNumber = tag-11;
            DDLogVerbose(@"Second data loop, tag is %ld", tag);
        } else {

        NSData *cameraNumberData = [data subdataWithRange:NSMakeRange(0, 16)];
        // Assign the outer cameraNumberString; re-declaring it here would shadow
        // the variable used below to set cameraImage.cameraNumber
        cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
        cameraNumberString = [cameraNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
        cameraNumberNumber = [cameraNumberString intValue]-1;

        }

        if (cameraNumberNumber+1 <= self.images.count){

                if ([self.images objectAtIndex:cameraNumberNumber] == [NSNull null]){
                        image* cameraImage = [[image alloc]init];
                        [self.images replaceObjectAtIndex: cameraNumberNumber withObject:cameraImage];
                    }

                image* cameraImage = [self.images objectAtIndex:cameraNumberNumber];
                [cameraImage.imageData appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
                cameraImage.cameraNumber = cameraNumberString;

                if (!imageBuffer){
                        imageBuffer = [[NSMutableData alloc]init];
                    }


                [imageBuffer appendData:[data subdataWithRange:NSMakeRange(0, [data length])]];
                DDLogVerbose(@"Total Data Length is %lu", (unsigned long)[imageBuffer length]);
        } else {

            DDLogInfo(@"Wrong camera quantity!");
            NSAlert *testAlert = [NSAlert alertWithMessageText:@"Wrong camera quantity!"
                                                 defaultButton:@"Ok"
                                               alternateButton:nil
                                                   otherButton:nil
                                     informativeTextWithFormat:@"We have received more images than cameras, please set No.Cameras correctly!"];

            [testAlert beginSheetModalForWindow:[self window]
                                  modalDelegate:self
                                 didEndSelector:@selector(stop)
                                    contextInfo:nil];

        }

                [sock readDataToData:[@"end" dataUsingEncoding:NSUTF8StringEncoding] withTimeout:-1 tag:cameraNumberNumber + 11];

    }

});
}

And here is the socketDidDisconnect code. A lot of things in here won't make sense out of context, but it shows how I handled the received data:

- (void)socketDidDisconnect:(GCDAsyncSocket *)sock withError:(NSError *)err
{
if (sock != listenSocket)
{
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            totalCamerasFetched = [NSNumber numberWithInt:1+[totalCamerasFetched intValue]];
            if ([totalCamerasFetched integerValue] >= [numberOfCameras integerValue]){

                for (image* cameraImage in self.images){

                        NSData *cameraNumberData;
                        NSData *imageNumberData;
                        NSData *imageData;
                        NSData *endCommandData;
                        NSInteger cameraNumberNumber = 0;
                        cameraNumberData = [cameraImage.imageData subdataWithRange:NSMakeRange(0, 16)];
                        imageNumberData = [cameraImage.imageData subdataWithRange:NSMakeRange(16, 16)];
                        imageData = [cameraImage.imageData subdataWithRange:NSMakeRange(32, [cameraImage.imageData length]-32)];
                        endCommandData = [cameraImage.imageData subdataWithRange:NSMakeRange([cameraImage.imageData length]-16, 16)];
                        NSString *cameraNumberString = [[NSString alloc] initWithData:cameraNumberData encoding:NSUTF8StringEncoding];
                        cameraNumberString = [cameraNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                        NSString *imageNumberString = [[NSString alloc] initWithData:imageNumberData encoding:NSUTF8StringEncoding];
                        imageNumberString = [imageNumberString stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
                        NSString *endCommandString = [[NSString alloc] initWithData:endCommandData encoding:NSUTF8StringEncoding];
                        NSImage* image = [[NSImage alloc]initWithData:imageData];
                        cameraNumberNumber = [cameraNumberString intValue]-1;

                        if (cameraNumberString)
                            {
                                    DDLogInfo(@"Image received from Camera no %@", cameraNumberString);
                            }
                        else
                        {
                                    DDLogError(@"No Camera number in data");
                        }

                        if (imageNumberString)
                        {
                                    DDLogInfo(@"Image is number %@", imageNumberString);
                        }
                        else
                        {
                                    DDLogError(@"No Image number in data");
                        }

                        if (image)
                        {

                        DDLogVerbose(@"We have an image");


                        NSString* dataPath = [[NSString alloc]initWithFormat:@"%@/image%@/",self.exportLocation, imageNumberString];

                        if (![[NSFileManager defaultManager] fileExistsAtPath:dataPath]){

                                NSError* error;
                                [[NSFileManager defaultManager] createDirectoryAtPath:dataPath withIntermediateDirectories:NO attributes:nil error:&error];

                                if (error)
                                    {
                                            DDLogError(@"[%@] ERROR: attempting to write directory for images", [self class]);
                                            NSAssert( FALSE, @"Failed to create directory maybe out of disk space?");
                                        }
                            }

                        NSString* dataPathVideo = [[NSString alloc]initWithFormat:@"%@/video%@/",self.exportLocation, imageNumberString];

                        if (![[NSFileManager defaultManager] fileExistsAtPath:dataPathVideo]){

                                NSError* error;
                                [[NSFileManager defaultManager] createDirectoryAtPath:dataPathVideo withIntermediateDirectories:NO attributes:nil error:&error];

                                if (error)
                                {
                                    DDLogError(@"[%@] ERROR: attempting to write directory for images", [self class]);
                                    NSAssert( FALSE, @"Failed to create directory maybe out of disk space?");
                                }
                            }

                        NSString * exportLocationFull = [[NSString alloc]initWithFormat:@"%@/image%@/camera_%@.jpg",self.exportLocation, imageNumberString, cameraNumberString];
                            DDLogInfo(@"Full export URL = %@", exportLocationFull);
                        [imageData writeToFile:exportLocationFull atomically:YES];
                        self.currentSet = [NSNumber numberWithInt:[imageNumberString intValue]];

                        NSImage* imageToStore = [[NSImage alloc]initWithData:imageData];


                        [self.imagesToMakeVideo replaceObjectAtIndex: cameraNumberNumber withObject:imageToStore];


                        } else {
                            DDLogError(@"No image located in data");
                        }

                        if (endCommandString)
                        {
                            DDLogVerbose(@"Command String is %@", endCommandString);
                            //[self logMessage:endCommandString];
                        }
                        else
                        {
                            //[self logError:@"No command string"];
                        }

                        self.imageBuffer = nil;

                    }

                self.totalCamerasFetched = [NSNumber numberWithInt:0];
                [self loadandDisplayLatestImages];
                [self createVideowithImages:imagesToMakeVideo toLocation:[[NSString alloc]initWithFormat:@"%@/video%@/image_sequence_%@.mov",self.exportLocation, self.currentSet, self.currentSet]];
                processing = false;
            } // end of "all cameras fetched" check (the for loop closes above)
        }
    });

    @synchronized(connectedSockets)
    {
        [connectedSockets removeObject:sock];
    }
}

}

Also, here is how I modified the Python code to add the extra "end" tag:

def send_media_to(self, ip, port, media_name, media_number, media_dir):
    camera_number = self.camera.current_mode['option'].number
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.connect((ip, port))
    try:
        # Header: camera number and media number, each padded to 16 bytes
        sock.sendall(bytes(str(camera_number).ljust(16), 'utf-8'))
        sock.sendall(bytes(str(media_number).ljust(16), 'utf-8'))
        with open(media_dir + media_name, 'rb') as media:
            sock.sendall(media.read())
    finally:
        # Completion tag the receiver watches for, also padded to 16 bytes
        # (sendall, unlike send, retries until the whole buffer is written)
        sock.sendall(bytes(str("end").ljust(16), 'utf-8'))
        sock.close()
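
One detail worth noting: the sender pads "end" to 16 bytes, while the receiver's readDataToData: looks for the bare b"end". That still works because ljust only appends spaces, so the padded field begins with the exact delimiter bytes. A quick check:

```python
# The padded terminator field still starts with the bare delimiter,
# so a search for b"end" fires as soon as the terminator field arrives.
terminator_field = "end".ljust(16).encode("utf-8")
assert terminator_field.startswith(b"end")
assert terminator_field == b"end" + b" " * 13
print("delimiter match holds")
```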

Hopefully this helps someone else stuck in the same situation!

Licensed under: CC-BY-SA with attribution