Question

I have a C# WCF service that holds about 120 GB of file contents in a Dictionary<File, byte[]> for very fast access, and this has worked well for me. On access, a file's contents are wrapped in a MemoryStream and read.

This service needs to be restarted every day to load some static data from the database, which can change on a daily basis. The restart takes a long time because of the huge amount of data that has to be loaded into memory again.

So I decided to host this memory in a separate process on the same machine and access it through sockets. The data process would always be up and running. TcpListener/TcpClient and NetworkStream were used in a fashion similar to the following:

memoryStream.Read(position.PositionData, 0, position.SizeOfData);

position.NetworkStream.Write(position.PositionData, 0, position.SizeOfData);
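One thing to note about the snippet above: unlike MemoryStream, NetworkStream.Read is not guaranteed to fill the buffer in one call, so the receiving side should loop until the full payload has arrived. A minimal sketch of such a helper (the names are mine, not from the original code):

```csharp
using System;
using System.IO;

static class StreamHelpers
{
    // NetworkStream.Read may return fewer bytes than requested,
    // so loop until `count` bytes have been received.
    public static void ReadExactly(Stream stream, byte[] buffer, int count)
    {
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read == 0)
                throw new EndOfStreamException(
                    "Connection closed before the full payload was received.");
            offset += read;
        }
    }
}
```

Many small Read/Write calls are also a common cause of exactly this kind of slowdown; wrapping the NetworkStream in a BufferedStream, or writing larger chunks, typically helps.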

The problem is that this was about 10 times slower than hosting the memory in the same process. Some slowdown is expected, but a factor of 10 is too much.

I thought of MemoryMappedFiles, but those seem more useful for random access to a specific view of a file. My file access is sequential, from the beginning all the way to the end.
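For what it's worth, a named memory-mapped file handles sequential access perfectly well: CreateViewStream returns an ordinary Stream that you read from start to end, and the data is shared between processes on the same machine without any socket copies. A minimal sketch, assuming a map name of my own choosing ("SharedFileCache"):

```csharp
using System;
using System.IO.MemoryMappedFiles;

class MmfSketch
{
    static void Main()
    {
        const string mapName = "SharedFileCache"; // assumed name, visible to both processes
        var payload = new byte[] { 1, 2, 3, 4 };

        // The data process would create the named map once and keep it alive.
        using (var mmf = MemoryMappedFile.CreateOrOpen(mapName, payload.Length))
        {
            using (var writer = mmf.CreateViewStream())
            {
                writer.Write(payload, 0, payload.Length);
            }

            // The WCF service opens a view over the same map and reads it
            // sequentially, just like any other Stream.
            using (var reader = mmf.CreateViewStream())
            {
                var copy = new byte[payload.Length];
                int read = reader.Read(copy, 0, copy.Length);
                Console.WriteLine(read);
            }
        }
    }
}
```

Named maps like this are a Windows feature, which fits a WCF deployment; the second process would use MemoryMappedFile.OpenExisting(mapName) instead of CreateOrOpen.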

Is there a different technology or library that could be used in my case, or is this slowdown simply to be expected?


Solution

I assume you are using SQL Server. If so, Service Broker and Query Notifications (SqlDependency) may be your friends here. It sounds like you need a push messaging model that automatically propagates changes back to the service whenever something changes in the database. That way you avoid restarting the memory/resource-intensive process, and there is no need to rebuild your heavyweight dictionary.
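A sketch of what that subscription might look like, assuming Service Broker is enabled on the database and using a connection string and table name of my own invention. Note that a SqlDependency fires only once, so the handler must re-subscribe after reloading:

```csharp
using System;
using System.Data.SqlClient;

class StaticDataWatcher
{
    // Hypothetical connection string; replace with your own.
    const string ConnectionString =
        "Data Source=.;Initial Catalog=MyDb;Integrated Security=true";

    public static void Subscribe()
    {
        SqlDependency.Start(ConnectionString);

        using (var connection = new SqlConnection(ConnectionString))
        // Notification queries need an explicit column list and a
        // two-part table name; dbo.StaticData is a placeholder.
        using (var command = new SqlCommand(
            "SELECT Id, Payload FROM dbo.StaticData", connection))
        {
            var dependency = new SqlDependency(command);
            dependency.OnChange += (sender, e) =>
            {
                // Reload just the static data here, instead of restarting
                // the whole 120 GB process, then re-subscribe because each
                // SqlDependency notification fires only once.
                Subscribe();
            };

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Cache the rows into the in-memory structures.
                }
            }
        }
    }
}
```

This keeps the dictionary process running continuously; only the small, changeable static data is refreshed when the database signals a change.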

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow