Question

Designing an Azure application where remote clients will be 'streaming' data/images at a frequency of ~1 write/sec. Data will go to table storage, and images will go to blob storage.

I may want to run logic before these writes are accepted. For instance, limiting the write frequency, validating the data to guard against bugs or tampering, or performing supporting operations such as thumbnail generation or Service Bus messaging.

One option is to pipe all operations through a REST service running on a worker role. This service would push data out to storage and perform the needed operations. However, given that clients can access the storage services directly (with shared access signatures securing access), this seems like an unnecessary bottleneck, even though more instances could be spun up. Further, keeping a role running increases costs if there is an opportunity to push this logic somewhere else.
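To make the facade idea concrete, here is a minimal sketch of the kind of gatekeeping logic such a service could run before each write. All names (`handle_write`, `store_entity`, `RATE_LIMIT_SECONDS`) are hypothetical, and `store_entity` is a placeholder for the real table-storage call:

```python
import time

# Hypothetical facade logic: rate-limit and validate each write before
# forwarding it to storage. `store_entity` stands in for the actual
# table-storage write; the limits chosen here are illustrative only.
RATE_LIMIT_SECONDS = 1.0          # matches the ~1 write/sec expectation
_last_write: dict[str, float] = {}  # client_id -> time of last accepted write

def store_entity(entity: dict) -> None:
    pass  # placeholder for the real table-storage write

def handle_write(client_id: str, entity: dict) -> bool:
    """Return True if the write was accepted and stored, False if rejected."""
    now = time.monotonic()
    last = _last_write.get(client_id)
    if last is not None and now - last < RATE_LIMIT_SECONDS:
        return False  # client is writing faster than allowed
    if "payload" not in entity or len(str(entity["payload"])) > 64_000:
        return False  # reject malformed or oversized data (bug or tampering)
    store_entity(entity)
    _last_write[client_id] = now
    return True
```

The same chokepoint is where thumbnail generation or a Service Bus message could be triggered after a write is accepted.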

Thanks

Solution

If your remote clients have a valid SAS, they can do whatever they want with the resource according to the permissions defined in the SAS (Read, Write, Delete, ...). There is no way to add any extra logic at the storage level.

What you can do is use a Windows Azure Web Site:

  1. You can use a Web Site as a facade for the Blob, Table, and Queue services. That way you control how these services are accessed; your remote clients never touch the storage services directly.
  2. You can use a Web Site to hand out SAS tokens. This allows your clients to access the storage services directly, while your Web Site controls whether and for how long they get access. But keep in mind: even if you hand out a SAS valid for only 1 minute, during that minute clients can do whatever they want within the permissions defined in the SAS. If you want to do things like per-write validation, this won't be the best option for you.

And compared to Worker Roles, Web Sites are really cheap (even free).

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow