What kind of caching model do content distribution networks (Akamai, EdgeCast, BitGravity, Cotendo, etc.) use? Specifically, when they have a cache miss, do they come back to the source, and do they then distribute the content internally?

Solution

I would assume that each CDN supports a slightly different architecture. Akamai supports two levels of its own servers: the edge nodes, which make up the majority of its servers, and a smaller, inner ring of replicated web servers.

If an item cannot be found on the edge node, it requests the information from an inner web server; if that fails, it eventually falls back to the origin, your server.
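As a rough illustration of that tiered lookup, here is a minimal sketch; the caches and the `fetch` function are hypothetical stand-ins, not Akamai's actual API:

    import urllib.request

    # Hypothetical in-memory caches standing in for the edge node
    # and the inner ring of replicated web servers.
    edge_cache = {}
    inner_cache = {}

    def fetch(url):
        """Tiered lookup: edge cache -> inner ring -> origin server."""
        if url in edge_cache:                 # 1. edge hit: cheapest path
            return edge_cache[url]
        if url in inner_cache:                # 2. inner-ring hit
            edge_cache[url] = inner_cache[url]
            return edge_cache[url]
        # 3. both tiers missed: fall back to the origin (your server)
        with urllib.request.urlopen(url) as resp:
            body = resp.read()
        inner_cache[url] = body               # populate both tiers on the way back
        edge_cache[url] = body
        return body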

So yes, requests do fall back to the source if the content cannot be found in the CDN.

They do some replication amongst each other, but you can't guarantee how many servers the information is replicated to, and you have no idea how long each one will cache it for.

On an Akamai server, the more an item is requested, the longer it stays in the cache. But this is not per customer; it applies to all requests hitting that machine. So if your content sits on a server that is also serving a site more popular than yours, it may not stay cached for very long. When I spoke to them, they couldn't give me that level of detail.
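That behaviour resembles a plain LRU cache shared by every site on the box: each request refreshes an item's recency, so a busier neighbour's objects can push yours out. A minimal sketch, with made-up capacity and keys:

    from collections import OrderedDict

    class SharedLRUCache:
        """One cache per machine, shared across all customers."""
        def __init__(self, capacity):
            self.capacity = capacity
            self.items = OrderedDict()

        def get(self, key):
            if key not in self.items:
                return None
            self.items.move_to_end(key)         # each hit refreshes recency
            return self.items[key]

        def put(self, key, value):
            self.items[key] = value
            self.items.move_to_end(key)
            if len(self.items) > self.capacity:
                self.items.popitem(last=False)  # evict least-recently-used item

    # A popular neighbouring site's traffic can evict your objects:
    cache = SharedLRUCache(capacity=2)
    cache.put("yoursite.com/logo.png", b"...")
    cache.put("popular.com/a.jpg", b"...")
    cache.put("popular.com/b.jpg", b"...")  # yoursite.com/logo.png is evicted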

discovery.com Akamai CDN Article
