Question

Server code

                if(success){
                    out.write("true".getBytes().length);
                    out.write("true".getBytes());
                    out.flush();
                }
                else{
                    out.write("false".getBytes().length);
                    out.write("false".getBytes());
                    out.flush();
                }

Client Code

        int size = inputStream.read();
        byte[] buf = new byte[size];
        inputStream.read(buf);
        ns = new String(buf);
        Boolean.valueOf(ns);

Although the server sends the result, the client reads it wrong. What is the problem here, and how can I solve it? For example, the server sends the value true but the client receives it as false.


Solution

You need to step through what you are doing exactly. Obviously the simplest way to send a boolean is as a single byte, like this:

out.write(success ? 1 : 0);

and to read this you would do

boolean success = in.read() != 0;
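
As a rough end-to-end sketch of that single-byte approach (the class and method names below are made up for illustration; the streams are assumed to be the socket's output and input streams):

    import java.io.EOFException;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;

    class BooleanWire {
        // Server side: encode the flag as a single byte and flush it.
        static void writeResult(OutputStream out, boolean success) throws IOException {
            out.write(success ? 1 : 0);
            out.flush();
        }

        // Client side: read the byte back; read() returns -1 if the peer closed the connection.
        static boolean readResult(InputStream in) throws IOException {
            int b = in.read();
            if (b == -1) {
                throw new EOFException("connection closed before the result arrived");
            }
            return b != 0;
        }
    }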

However, if you need to send a string, I would check what string you are reading and what the correct length is, because there is any number of reasons a binary protocol can fail, e.g. because the previous thing you read/wrote was incorrect.
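
One common failure in that kind of protocol is assuming that InputStream.read(byte[]) fills the whole buffer; it may return after reading fewer bytes than requested. Below is a minimal sketch of a length-prefixed string exchange that avoids this, assuming DataInputStream/DataOutputStream wrappers around the socket streams and UTF-8 on both ends (the class and method names are made up for illustration):

    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.nio.charset.StandardCharsets;

    class StringWire {
        // Server side: write the length first, then the encoded bytes.
        static void writeString(OutputStream rawOut, String value) throws IOException {
            DataOutputStream out = new DataOutputStream(rawOut);
            byte[] bytes = value.getBytes(StandardCharsets.UTF_8);
            out.writeInt(bytes.length);
            out.write(bytes);
            out.flush();
        }

        // Client side: read the length, then block until exactly that many bytes have arrived.
        static String readString(InputStream rawIn) throws IOException {
            DataInputStream in = new DataInputStream(rawIn);
            int length = in.readInt();
            byte[] bytes = new byte[length];
            in.readFully(bytes); // unlike read(bytes), this does not return early
            return new String(bytes, StandardCharsets.UTF_8);
        }
    }

The client can then turn the received string into a flag with Boolean.parseBoolean(readString(inputStream)).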

Other tips

The server and the client are probably using different charsets.

Use an explicit one (and the same one) on both sides.

see http://docs.oracle.com/javase/6/docs/api/java/lang/String.html

public byte[] getBytes(String charsetName)
            throws UnsupportedEncodingException

and

public String(byte[] bytes, String charsetName)
            throws UnsupportedEncodingException
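
Applied to the code in the question, that means picking one charset name and passing it explicitly on both ends. The snippet below is only a sketch: it reuses the out and inputStream variables from the question and picks "UTF-8" as the shared charset, which is an assumption, not something the question specifies.

    // Server side: encode with an explicit, agreed-upon charset.
    byte[] bytes = (success ? "true" : "false").getBytes("UTF-8");
    out.write(bytes.length);
    out.write(bytes);
    out.flush();

    // Client side: decode with the same charset.
    int size = inputStream.read();
    byte[] buf = new byte[size];
    inputStream.read(buf); // note: read(buf) may still return fewer bytes than requested; see the sketch above
    String ns = new String(buf, "UTF-8");
    boolean result = Boolean.valueOf(ns);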

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow