Distributed Caching

    Let’s say you are showing the last 10 news items on your home page, and on average a thousand users visit this page every minute. For every page view, you might be querying your database to display this information:
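
        SELECT TOP 10 Title, NewsDate, Subject, Body
        FROM News
        ORDER BY NewsDate DESC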

    Even if we assume that our home page contains nothing but this information, a site that gets 10,000 visits a minute would run over 150 SQL queries per second.

    Since the result of this query doesn’t differ from user to user (it is always the last 10 news items), SQL Server may be able to cache it automatically on its side.

    Also, the number of SQL connections that can be kept open simultaneously has an upper limit (the connection pool limit). Once you reach that limit, new requests start waiting in a queue and blocking each other.
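
    As a rough illustration (the connection string and pool size below are assumptions for this sketch, not values from this guide), the pool limit is set per connection string, and exhausting it makes further Open() calls wait and eventually time out:

        // Hypothetical connection string; "Max Pool Size" caps the number of pooled
        // connections for this exact connection string (the default is 100).
        var connectionString =
            "Data Source=.;Initial Catalog=MyDb;Integrated Security=true;Max Pool Size=100";

        using (var connection = new SqlConnection(connectionString))
        {
            // When all pooled connections are busy, this call blocks until one is
            // freed, and throws an InvalidOperationException after the timeout.
            connection.Open();
        }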

    Taking into account that the news don’t change every second, we could cache them in our WEB server memory for 5 minutes.

    Thus, as soon as we fetch the news list from the SQL database, we store it in the local cache. For the next 5 minutes, every user that visits the home page gets the news list instantly from the local cache, without even hitting SQL:

        // Sketch of a method that returns the last 10 news items, caching them in
        // the local ASP.NET cache for 5 minutes (Query<T> is Dapper's extension method).
        public List<News> GetNews()
        {
            var news = HttpRuntime.Cache["News"] as List<News>;

            if (news == null)
            {
                using (var connection = new SqlConnection("......"))
                {
                    news = connection.Query<News>(@"
                        SELECT TOP 10 Title, NewsDate, Subject, Body
                        FROM News
                        ORDER BY NewsDate DESC")
                        .ToList();
                }

                // keep the list in local cache, expiring 5 minutes from now
                HttpRuntime.Cache.Insert("News", news, null,
                    DateTime.Now.Add(TimeSpan.FromMinutes(5)),
                    Cache.NoSlidingExpiration);
            }

            return news;
        }
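
    Where such a method is called from is up to you; as a minimal sketch (the controller and action names below are placeholders, not from this guide, and GetNews is assumed to be reachable from the controller), a home page action could simply pass the cached list to its view:

        public class HomeController : Controller
        {
            public ActionResult Index()
            {
                // GetNews() is the method sketched above; most requests are served
                // from the cache, so SQL is hit at most once every 5 minutes
                return View(GetNews());
            }
        }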

    All this cached information is stored in the WEB server’s memory, which is the fastest place to access it from.

    Note that caching something doesn’t always mean your application will run faster. How effectively you use the cache matters more than caching alone; used improperly, caching can even slow your application down.
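
    For example (a hypothetical sketch, not code from this guide), caching the same result under a different key for every user mostly produces cache misses, so you still pay the query cost on almost every request while also holding many stale copies in memory:

        // Anti-pattern sketch: the key varies per user, so with thousands of users
        // almost every lookup misses, and the cache just accumulates duplicate lists.
        public List<News> GetNewsForUser(int userId)
        {
            var key = "News_" + userId;
            var news = HttpRuntime.Cache[key] as List<News>;

            if (news == null)
            {
                // LoadNewsFromSql() is a placeholder for the SQL query shown above;
                // it returns the same last 10 news items for every user anyway
                news = LoadNewsFromSql();

                HttpRuntime.Cache.Insert(key, news, null,
                    DateTime.Now.Add(TimeSpan.FromMinutes(5)),
                    Cache.NoSlidingExpiration);
            }

            return news;
        }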