Question

I'm really asking this by proxy: another team at work has had a change request from our customer.

The problem is that our customer doesn't want their employees logged in with the same user account more than once at the same time; employees have been sharing logins and getting locked out.

Since this is on a web farm, what would be the best way to tackle this issue?

Wouldn't caching to the database cause performance issues?


Solution

You could look at using a distributed cache system like memcached.

It would solve this problem pretty well (it's MUCH faster than a database), and it's also excellent for caching pretty much anything else.
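
For illustration, here is a minimal sketch of that idea in Python using the pymemcache client: a session token is stored under a per-user key with a TTL, and a login is refused while that key already exists. The key scheme, the 30-minute timeout, and the function names are assumptions for the example, not part of the original answer.

```python
# Minimal sketch: at most one live session per user, tracked in memcached.
# Assumptions (not from the original answer): a memcached instance on
# localhost:11211, the pymemcache client, and a 30-minute inactivity timeout.
import uuid
from typing import Optional

from pymemcache.client.base import Client

SESSION_TTL = 30 * 60  # seconds of inactivity before the lock expires on its own

cache = Client(("localhost", 11211))

def try_login(user_id: str) -> Optional[str]:
    """Return a new session token, or None if the user is already logged in."""
    token = uuid.uuid4().hex
    # add() is atomic and only stores the key if it doesn't exist yet, so a
    # second concurrent login for the same user fails instead of overwriting.
    created = cache.add(f"session:{user_id}", token.encode(),
                        expire=SESSION_TTL, noreply=False)
    return token if created else None

def touch(user_id: str, token: str) -> bool:
    """Call on each request: keep the session alive if the token still matches."""
    current = cache.get(f"session:{user_id}")
    if current != token.encode():
        return False  # expired, or the user logged in from somewhere else
    cache.set(f"session:{user_id}", token.encode(), expire=SESSION_TTL)
    return True

def logout(user_id: str) -> None:
    cache.delete(f"session:{user_id}")
```

Because memcached's `add` is atomic, two web servers racing to log the same user in can't both succeed, and the TTL means a session that was never logged out cleanly eventually frees itself.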

OTHER TIPS

It's just a cost of doing business.

Yes, caching to a database is slower than caching on your web server. But you've got to store that state information in a centralized location, otherwise one web server won't know which users are logged in on another.

Assumption: You're trying to prevent multiple concurrent log-ins by a single user.

A database operation at login and logout won't cause a performance problem.

  • If you are using a caching proxy, that will cause a problem: a user will log out, but won't be able to log back in until the logout reaches the cache.

Your biggest potential problem might be:

  • if the app/box crashes without a chance for the user to log out, the user's state in the database will remain "logged in"; a common mitigation (sketched below) is to also record a last-activity timestamp and treat stale records as logged out.
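
A rough sketch of that database approach, including the last-activity timestamp, is below. SQLite is used only for brevity; the table and column names and the 30-minute timeout are assumptions for the example.

```python
# Sketch of the database-backed approach, with a last-activity timestamp so a
# crashed session eventually stops blocking the user. The table/column names
# (active_logins, last_seen) and the 30-minute timeout are assumptions.
import sqlite3
import time

TIMEOUT = 30 * 60  # seconds of inactivity before a login record is ignored

conn = sqlite3.connect("sessions.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS active_logins ("
    "  user_id TEXT PRIMARY KEY,"
    "  last_seen REAL NOT NULL)"
)

def try_login(user_id: str) -> bool:
    """Record a login; refuse if the user already has a non-stale login."""
    now = time.time()
    with conn:  # one small transaction per login, negligible load
        row = conn.execute(
            "SELECT last_seen FROM active_logins WHERE user_id = ?", (user_id,)
        ).fetchone()
        if row is not None and now - row[0] < TIMEOUT:
            return False  # already logged in somewhere else
        conn.execute(
            "INSERT OR REPLACE INTO active_logins (user_id, last_seen) VALUES (?, ?)",
            (user_id, now),
        )
        return True

def heartbeat(user_id: str) -> None:
    """Call on each authenticated request so a live session never looks stale."""
    with conn:
        conn.execute(
            "UPDATE active_logins SET last_seen = ? WHERE user_id = ?",
            (time.time(), user_id),
        )

def logout(user_id: str) -> None:
    with conn:
        conn.execute("DELETE FROM active_logins WHERE user_id = ?", (user_id,))
```

On a shared database behind a web farm, the check-and-insert in `try_login` should run in a single transaction (or as one conditional statement) so two servers can't both succeed for the same user.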

It depends on how the authentication is done. If you already store the last successful login datetime (whatever the backend), you could extend that schema with a "logged_in" flag, which wouldn't add any extra performance cost. (OK, it's not clean at all.)
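
As a sketch of that flag idea (again with SQLite and made-up names), a conditional UPDATE can set the flag and reject a second login in a single statement:

```python
# Sketch of the "logged_in" flag: extend the table that already records the
# last successful login, then flip the flag at login/logout. Names are made up.
import sqlite3

conn = sqlite3.connect("users.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS users ("
    "  user_id TEXT PRIMARY KEY,"
    "  last_login TEXT,"
    "  logged_in INTEGER NOT NULL DEFAULT 0)"  # the extra flag column
)

def mark_login(user_id: str) -> bool:
    """Set the flag only if it isn't already set; returns False for a second login."""
    with conn:
        cur = conn.execute(
            "UPDATE users SET logged_in = 1, last_login = datetime('now') "
            "WHERE user_id = ? AND logged_in = 0",
            (user_id,),
        )
        return cur.rowcount == 1

def mark_logout(user_id: str) -> None:
    with conn:
        conn.execute("UPDATE users SET logged_in = 0 WHERE user_id = ?", (user_id,))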

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow