Question

Judging by a conversation I had on #appengine at irc.freenode.net, I'm clearly not the only person baffled by GAE pricing, so I figured I'd throw this up on StackOverflow and ask for clarity. Essentially: given an app with the figures below, what should its "CPU time" bill be per year?

Suppose:
h = Google App Engine's charge per hour for CPU time. Currently, h = $0.10
f = Google App Engine's daily free quota of CPU hours. Currently, I think* f = 2493.5
t = total registered users
s = simultaneous users. Assume = t * 0.2
e = (requests/second)/simultaneous user. Assume = 0.5
r = requests/sec = s * e
R = requests/day = r * 3600 * 24
p = CPU hours/request. Assume 150ms/request. I.e. assume p = 0.15/3600
c = CPU hours/sec = r * p
C = CPU hours/day = c * 3600 * 24
y = average number of days in a year = 365.25
B = CPU time bill per year = (C - f) * h * y

Therefore, C = t * 0.2 * 0.5 * (0.15/3600) * 3600 * 24
So if I get 10,000 registered users, C = 3600.

In that case:
B = (3600 - f) * h * y = 1106.5 * $0.10 * 365.25 = $40,415 to the nearest dollar
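
To make the arithmetic easy to check, here is a minimal Python sketch of the same back-of-the-envelope calculation, using the figures and assumptions above (the variable names mirror the definitions in the question; this is just my model, not any official calculator):

```python
# Back-of-the-envelope GAE CPU-time bill, using the question's assumptions.
h = 0.10                # $ per CPU hour
f = 2493.5              # free CPU hours per day (6.5 general + 2,487 datastore)
t = 10000               # registered users
s = t * 0.2             # simultaneous users (assumed 20% of registered)
e = 0.5                 # requests/second per simultaneous user (assumed)
r = s * e               # requests/second
p = 0.15 / 3600         # CPU hours per request (assumed 150 ms per request)
C = r * p * 3600 * 24   # CPU hours per day
y = 365.25              # average days per year

B = (C - f) * h * y     # CPU-time bill per year
print(C)                # 3600.0
print(round(B))         # 40415
```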

Is that right, or have I misunderstood what CPU time is, how it is priced, or how the quotas work?

*The free daily quota is not clearly expressed, but I think it's 6.5 hours for general use plus 2,487 hours for datastore manipulation: 2,493.5 hours/day in total. That assumes my app mostly spends its time handling requests by using a controller to generate views on the models in the datastore, and allowing CRUD operations on those models.

NB. For a transcript of the IRC discussion, please see the edit history of this question.


Solution 2

Nick Johnson essentially answered this question here. (Thank you, Nick!) The answer appears to be: "For the most part, yes, that is right."

OTHER TIPS

I think some of your estimates are too high.

20% of a site's registered users are using the service at any time. This is extremely high: it would mean that the average person is registered at only 5 websites and spends 24 hours per day browsing those 5 sites. I think it would be closer to reality to estimate that the average person is registered at 50 websites and spends 2.4 hours per day browsing all of them combined, which would put your estimate off by a factor of 100.

0.5 requests per second per simultaneous user. This depends on the site, but I would say the normal pattern is one dynamic request to render the page template, plus a series of static handlers to serve images, CSS, and JavaScript. Static requests don't incur CPU charges. If there's one dynamic request per page, your estimate assumes the average user navigates to a new page twice every second. I'd say once every 5 seconds is more reasonable.
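
To give a sense of how much these two corrections matter, here is a rough sketch that reruns the question's calculation with the lower figures suggested above (roughly 0.2% of registered users online at once, one dynamic request every 5 seconds); these are illustrative assumptions, not measurements:

```python
# Same cost model as the question, but with this answer's lower traffic estimates
# (illustrative assumptions only).
t = 10000               # registered users
s = t * 0.2 / 100       # ~0.2% online at any moment (100x lower than the question)
e = 0.2                 # one dynamic request every 5 seconds per active user
r = s * e               # requests/second
p = 0.15 / 3600         # CPU hours per request (still assuming 150 ms)
C = r * p * 3600 * 24   # CPU hours per day

print(C)                # 14.4 CPU hours/day, versus 3600 in the question's estimate
```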

I'm not sure this kind of estimation is particularly useful to begin with. Whether your site has 10,000 users or 10 million users, you're either monetizing traffic or you're losing money. If you're averaging 150ms of CPU time per request @ $0.10 per hour, one dollar buys you 240,000 requests. If you can't earn back $1 in ad revenue from 240,000 page views, you're doing something wrong.
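
The 240,000 figure follows directly from the quoted rates; a one-line sanity check, assuming $0.10 per CPU hour and 150 ms of CPU per request:

```python
# Requests per dollar at $0.10 per CPU hour and 150 ms of CPU per request.
cpu_seconds_per_dollar = (1 / 0.10) * 3600      # 10 CPU hours = 36,000 CPU seconds
requests_per_dollar = cpu_seconds_per_dollar / 0.15
print(requests_per_dollar)                      # 240000.0
```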

These estimates don't factor in what you're paying for bandwidth or disk storage, or what you lose every time Google decides to put the datastore in read-only mode in the middle of a weekday afternoon. Or the development costs of learning the datastore, which imposes many constraints that you wouldn't have with a traditional relational database. Nor do they factor in what you gain in scalability; if your site turns out to be only mildly popular (like the overwhelming majority of the internet), you'll probably fit within the free quotas, and pay nothing. If you become extremely popular, your app will scale automatically, assuming you designed it well to begin with. This is as opposed to EC2 or Azure, where you're paying ~$77 per instance whether anyone's hitting it or not.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow