You can place the file in the reach of a webserver and then use curl
from the clients
curl --range 10000-20000 http://the.server.ip/file.dat > result
would fetch bytes 10000 through 20000 inclusive, i.e. 10001 bytes (HTTP byte ranges include both endpoints).
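The inclusive range semantics are easy to check locally, since curl also honours --range on file:// URLs (no webserver needed; file.dat is a throwaway test file):

```shell
# Create a test file, then fetch a byte range from it with curl.
head -c 100000 /dev/urandom > file.dat

# Same --range syntax as over HTTP; both endpoints are included.
curl -s --range 10000-20000 "file://$PWD/file.dat" > result

wc -c < result   # 10001 bytes, not 10000
```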
If the file is highly redundant and the network is slow, compressing on the fly can speed up the transfer considerably. For example, running
nc -l -p 12345 | gunzip > chunk
on the client (BSD netcat takes the port number directly: nc -l 12345) and then running
dd skip=10000 count=10000 if=bigfile bs=1 | gzip | nc client.ip.address 12345
on the server transfers the section with gzip compression applied on the fly, without creating any intermediate files.
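One caveat: bs=1 makes dd issue one read per byte, which is very slow for large sections. A sketch of a faster alternative, assuming GNU dd (its iflag=skip_bytes,count_bytes lets skip and count stay in bytes while using a large block size; bigfile here is a locally created test file):

```shell
# Create a test file and extract bytes 10000..19999 two ways.
head -c 1048576 /dev/urandom > bigfile

# Portable but slow: one read() syscall per byte.
dd skip=10000 count=10000 bs=1 if=bigfile of=chunk1 2>/dev/null

# GNU dd: skip/count interpreted as bytes, but reads in 64 KiB blocks.
dd iflag=skip_bytes,count_bytes skip=10000 count=10000 bs=64k \
   if=bigfile of=chunk2 2>/dev/null

cmp chunk1 chunk2   # identical output, much less time on big chunks
```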
EDIT
A single command to fetch a section of a remote file, compressed over the network, is
ssh server 'dd skip=10000 count=10000 bs=1 if=bigfile | gzip' | gunzip > chunk