Question

I will be deploying my first WCF-based application and would like to know the best way to deploy it. Here is my architecture; please see the attached image.

We have a WCF service written against the .NET Framework 4.0 that has 3 methods. A front-end ASP.NET website (www.site.com) calls the WCF service to save data as well as read it. In the figure, Method1 saves data, while Method2 and Method3 read data from a SQL Server 2008 R2 database.

In my ASP.NET website...

  1. I am calling Method1 and closing the connection like this: ServiceClient client = new ServiceClient(); client.Method1(data to be saved); client.Close();

  2. I am calling Method2 and Method3 as follows: ServiceClient client = new ServiceClient(); dropDown1list.DataSource = client.Method2(); dropDown2list.DataSource = client.Method3(); client.Close(); (a combined sketch of both call patterns follows this list)
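
For reference, here is a minimal sketch of those two call patterns combined, assuming the generated proxy class is named ServiceClient and that Method1 accepts the data being saved (the argument here is a placeholder). Close() throws if the channel has faulted, so Abort() is the usual fallback:

ServiceClient client = new ServiceClient();
try
{
    client.Method1(dataToBeSaved);               // save (placeholder argument)
    dropDown1list.DataSource = client.Method2(); // read
    dropDown2list.DataSource = client.Method3(); // read
    client.Close();
}
catch
{
    client.Abort(); // don't call Close() on a faulted channel
    throw;
}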

Multiple users could be using the website at the same time to submit data. Considering this architecture, what would be the best way to deploy the WCF service so that it can handle multiple users at the same time? I have read the articles http://www.codeproject.com/Articles/89858/WCF-Concurrency-Single-Multiple-and-Reentrant-and and http://www.codeproject.com/Articles/86007/ways-to-do-WCF-instance-management-Per-call-Per.

I now believe I need to define my WCF service as follows:

using System.Collections.Generic;
using System.ServiceModel;

// PerCall: a new service instance per request; Multiple: requests are
// processed concurrently rather than queued.
[ServiceBehavior(ConcurrencyMode = ConcurrencyMode.Multiple,
                 InstanceContextMode = InstanceContextMode.PerCall)]
public class Service : IService
{
    public bool Method1(string data) // parameter type is a placeholder for the data being saved
    {
        // save the data
    }

    public List<string> Method2()
    {
        // read data
    }

    public List<string> Method3()
    {
        // read data
    }
}
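
For completeness, the matching service contract would look roughly like this (a sketch; the string-based signatures mirror the snippet above and are assumptions about the actual data contracts):

using System.Collections.Generic;
using System.ServiceModel;

[ServiceContract]
public interface IService
{
    [OperationContract]
    bool Method1(string data);   // save

    [OperationContract]
    List<string> Method2();      // read

    [OperationContract]
    List<string> Method3();      // read
}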

Am I right? Any suggestions?


Solution

Just answered a similar question yesterday. Based on your description and the picture, I don't see a need to change your architecture. If you're using one of the main WCF bindings (webHttpBinding, wsHttpBinding, or basicHttpBinding), the service you deploy should easily be able to handle dozens of concurrent users, all saving and reading at the same time.
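
If you do want explicit control over how many requests are served concurrently, WCF exposes throttling settings. Here is a sketch assuming a self-hosted service with basicHttpBinding; the address and limits are illustrative, and under IIS the same limits can be set via a serviceThrottling element in web.config:

using System;
using System.ServiceModel;
using System.ServiceModel.Description;

class Program
{
    static void Main()
    {
        var host = new ServiceHost(typeof(Service),
            new Uri("http://localhost:8080/Service")); // illustrative address

        host.AddServiceEndpoint(typeof(IService), new BasicHttpBinding(), "");

        // Raise the defaults if you expect many simultaneous callers.
        host.Description.Behaviors.Add(new ServiceThrottlingBehavior
        {
            MaxConcurrentCalls = 64,     // requests processed at once
            MaxConcurrentInstances = 64, // with PerCall, roughly one per request
            MaxConcurrentSessions = 100
        });

        host.Open();
        Console.WriteLine("Service running. Press Enter to stop.");
        Console.ReadLine();
        host.Close();
    }
}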

Each client request will generate its own connection and web service objects, each of which can communicate concurrently with your database, whether that request is to read data or write data. When the response is sent back to the client, your WCF service will destroy the objects and clean up the memory for you as long as you're not doing something strange.
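
To illustrate why PerCall instancing keeps concurrent requests isolated, one of the read methods could open and dispose its own connection on every call, along the lines of this sketch (the connection string name and query are placeholders):

// Inside the Service class from the question; requires System.Data.SqlClient
// and System.Configuration.
public List<string> Method2()
{
    var results = new List<string>();

    // Each request gets its own service instance and its own connection,
    // so concurrent callers never share state.
    var connectionString =
        ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString; // "MyDb" is a placeholder

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("SELECT Name FROM SomeTable", connection)) // placeholder query
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                results.Add(reader.GetString(0));
            }
        }
    }

    return results;
}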

I've spent the last two years working on WCF web services on an industrial scale. Lately I've been working on a load testing / benchmarking project that spins up hundreds of concurrent users, each of which slams our WCF test server with XML artifacts that get loaded into the database. We've managed to load up to 160 packages (about 110 KB each, per client) per second. WCF is not perfect, but it's quick, clean, and scales really well.

My experience has been that your database will be your bottleneck, not your WCF web service. If your client wants to scale this architecture up to an Amazon-sized web service, then you bring in an F5 load balancer and scale it up that way.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow