Question

I have a server and a client that communicate with each other over a UDP socket. The server opens port 10002 and listens for incoming datagrams.

For the client to discover the server's IP address, it sends a single broadcast datagram to which the server responds. The client code responsible for finding the server's address looks like this:

    private IPEndPoint GetServerEP(TimeSpan timeout, UdpClient udpclient)
    {
        IPEndPoint server = new IPEndPoint(IPAddress.Broadcast, 10002);
        byte[] data = GetDiscoverDatagram();

        udpclient.EnableBroadcast = true;
        udpclient.Send(data, data.Length, server);

        try
        {
            udpclient.Client.ReceiveTimeout = (int)timeout.TotalMilliseconds;
            udpclient.Receive(ref server);
        }
        catch (SocketException e)
        {
            string msg = string.Format("Server did not respond within {0} ms", timeout.TotalMilliseconds);
            throw new TimeoutException(msg, e);
        }

        return server;
    }

Upon running this, I can see that the server actually receives the broadcast datagram and responds with a packet addressed to the same port the client sent from. However, the client never receives anything and the `Receive` call times out.

What am I missing?
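For context, a minimal server-side responder matching this discovery scheme might look like the following. This is a hypothetical sketch, not my actual server code; the `DISCOVER_ACK` payload is a made-up placeholder:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

class DiscoveryServer
{
    static void Main()
    {
        using var server = new UdpClient(10002);
        var remote = new IPEndPoint(IPAddress.Any, 0);

        while (true)
        {
            // Receive fills 'remote' with the sender's address and port.
            byte[] request = server.Receive(ref remote);

            // Reply to the exact endpoint the datagram came from, so the
            // client's Receive on the same socket should see the answer.
            byte[] response = Encoding.ASCII.GetBytes("DISCOVER_ACK");
            server.Send(response, response.Length, remote);
        }
    }
}
```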


Solution

Stupid me (or rather: stupid firewall). The code worked, but the firewall blocked the response packet from the server. After disabling it, everything works like a charm.
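Rather than disabling the firewall outright, a narrower fix on Windows is to allow the client executable to receive inbound UDP. The rule name and program path below are placeholders; adjust them for your setup:

```shell
netsh advfirewall firewall add rule name="UdpDiscoveryClient" dir=in action=allow program="C:\path\to\client.exe" protocol=udp
```

A program-scoped rule is preferable to a port-based one here, because the client receives the response on an ephemeral source port that changes between runs.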

Licensed under: CC-BY-SA with attribution