Question

So, this may seem like an edge case, but I'm simply wondering if this is possible. What I have is a combination of a static set and a deferred set wrapped in an IEnumerable - for example:

public IEnumerable<T> MakeMyQuery<T>()
{
    // returns a List<T>
    var someStaticData = LoadMyStaticDataFromDatabase();

    // Returns IEnumerable<T>, but from another resource
    var deferredQuery = CreateADeferredQueryUsingYieldReturn(); 

    return someStaticData.Concat(deferredQuery);
}

So what happens here is that when I call .Take(someNumber) on my enumerable, it returns elements from my static data first before attempting to evaluate the deferred component. Effectively, I have "hidden" some potentially time-consuming producer work behind the enumerable, so that if I never need those elements, they never get evaluated, thanks to the deferred nature of LINQ.

However, I don't think it's possible to cache this query for later use (I don't believe the state of the iterator will be kept in the cache, right?). Or is there an easy way to do this without enumerating the results in order to save them?

Ideally, my flow would be like so:

public List<T> SomeMethod<T>(int numberOfGuys)
{
    IEnumerable<T> query = null;

    if (/* "MyQuery" is in the cache */)
        query = (IEnumerable<T>)Cache["MyQuery"];
    else
    {
        query = MakeMyQuery<T>();
        Cache["MyQuery"] = query;
    }

    return query.Take(numberOfGuys).ToList();
}

So I can re-use the same query over and over to request data, but potentially never have to requery the DB. Is there a way to do this?


Solution

I think you want to cache the result of query.Take(numberOfGuys).ToList() in a new List. Before calling MakeMyQuery(), you could look at the number of elements in your cached list (if it exists): if that count is greater than or equal to numberOfGuys, return the first numberOfGuys elements from your cached list; otherwise, replace your cached list with the new result of query.Take(numberOfGuys).ToList().

As default.krammer pointed out, what you probably really want to cache is the result of LoadMyStaticDataFromDatabase(): if numberOfGuys keeps coming in below the number of elements that call returns, you will still end up hitting the DB repeatedly until numberOfGuys exceeds that count. So you could combine caching LoadMyStaticDataFromDatabase() inside MakeMyQuery<T>() with caching query.Take(numberOfGuys).ToList() in SomeMethod<T>(int numberOfGuys), which would let you hit the DB only once while still taking advantage of the deferred execution of your IEnumerable<T>.
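To make that concrete, here is a minimal sketch of the combined approach; the int element type, the method bodies, and the two static fields standing in for the cache are all illustrative stand-ins, not the original code:

using System;
using System.Collections.Generic;
using System.Linq;

public static class CachedQuerySketch
{
    // Simple in-memory caches for illustration; a real app might use MemoryCache instead.
    private static List<int> _staticDataCache;
    private static List<int> _resultCache;

    // Hypothetical stand-ins for LoadMyStaticDataFromDatabase / CreateADeferredQueryUsingYieldReturn.
    private static List<int> LoadMyStaticDataFromDatabase()
    {
        Console.WriteLine("DB hit");
        return new List<int> { 1, 2, 3 };
    }

    private static IEnumerable<int> CreateADeferredQueryUsingYieldReturn()
    {
        Console.WriteLine("expensive deferred work");
        yield return 100;
        yield return 200;
    }

    public static IEnumerable<int> MakeMyQuery()
    {
        // Cache only the static portion, so the DB is hit at most once.
        if (_staticDataCache == null)
            _staticDataCache = LoadMyStaticDataFromDatabase();

        // The deferred portion stays lazy and only runs when enumeration reaches it.
        return _staticDataCache.Concat(CreateADeferredQueryUsingYieldReturn());
    }

    public static List<int> SomeMethod(int numberOfGuys)
    {
        // Reuse the previously materialized result when it already has enough items.
        if (_resultCache != null && _resultCache.Count >= numberOfGuys)
            return _resultCache.Take(numberOfGuys).ToList();

        _resultCache = MakeMyQuery().Take(numberOfGuys).ToList();
        return _resultCache;
    }
}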

OTHER TIPS

I know this may be a bit old fashioned, but you could potentially fill an ADO.NET DataSet from the database; DataSet and DataTable are ADO.NET's disconnected layer, so the DataSet can be held in memory by the application for a determined amount of time and then posted back to the database. Create your DataSet and fill it from Entity Framework, the connected ADO.NET layer, or a LINQ to SQL layer; once filled it lives in memory, you can keep adding new data to it as needed, and then compare it in a final query back to the database to merge the changes.
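For illustration only, a minimal sketch of the fill-and-hold part, assuming SQL Server; the connection string parameter, the SELECT text, and the Customers table name are placeholders, and expiry/merge logic is left out:

using System.Data;
using System.Data.SqlClient;

public static class DataSetCacheSketch
{
    // Disconnected DataSet held in memory; expire or refresh it however the app requires.
    private static DataSet _cache;

    public static DataSet GetCustomers(string connectionString)
    {
        if (_cache != null)
            return _cache; // served from memory, no round trip to the database

        var ds = new DataSet();
        using (var connection = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter("SELECT Id, Name FROM Customers", connection))
        {
            adapter.Fill(ds, "Customers"); // Fill opens and closes the connection itself
        }

        _cache = ds;
        return _cache;
    }
}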

I did a project a while back where I mixed LINQ, ADO.NET, and XML serialization to basically serialize data from ADO.NET to an XML file using ADO.NET's built-in XML serialization, then read it back with LINQ to XML. It was similar to what you are describing in that the XML file was essentially the cache, just in file form, and I updated it with changes by counting its distinct elements representing a key value in the database. If the counts were off it got updated; otherwise it stayed the same. This wasn't applicable for large sets of millions of rows, but for small things I wanted to ALWAYS have access to, it was nice and pretty fast.
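Something along those lines might look like the rough sketch below; the Customer/Id element names and the count-based staleness check are placeholders for whatever the real schema and key are:

using System.Data;
using System.Linq;
using System.Xml.Linq;

public static class XmlFileCacheSketch
{
    // Persist a DataSet to disk using ADO.NET's built-in XML serialization.
    public static void SaveCache(DataSet ds, string path)
    {
        ds.WriteXml(path, XmlWriteMode.WriteSchema);
    }

    // Decide whether the file cache is stale by comparing distinct key counts.
    public static bool IsCacheCurrent(string path, int keyCountInDatabase)
    {
        var doc = XDocument.Load(path);
        int distinctKeys = doc.Descendants("Customer")
                              .Select(x => (string)x.Element("Id"))
                              .Distinct()
                              .Count();
        return distinctKeys == keyCountInDatabase;
    }

    // Rehydrate the cached DataSet from the XML file.
    public static DataSet LoadCache(string path)
    {
        var ds = new DataSet();
        ds.ReadXml(path);
        return ds;
    }
}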

I know that in the 70-516 MS Press book on .NET 4.0 Data Access there is a lab near the end of the book on caching, if you can find it online. It basically targets a database, collects the changes since the last run, works off of those, and then merges them back at the end. That way you are constantly working with a differential that is smaller in memory while still tracking your working changes.

Maybe I'm not understanding your question entirely. If so, let me know and I'll reformulate my answer.

I believe what you have written will already behave as you want. Consider the following toy example (similar to the code you show). I haven't tested it, but what you should see is that if you Take fewer than 4 items, you never enter SuperExpensiveQuery.

static IEnumerable<int> SuperExpensiveQuery()
{
    Console.WriteLine("super expensive query (#1)");
    yield return 100;
    Console.WriteLine("super expensive query (#2)");
    yield return 200;
    Console.WriteLine("super expensive query (#3)");
    yield return 300;
    Console.WriteLine("super expensive query (#4)");
}

static IEnumerable<int> MakeMyQuery()
{
    var someStaticData = new int[] { 1, 2, 3 };
    var deferredQuery = SuperExpensiveQuery();
    return someStaticData.Concat(deferredQuery);
}

static void Test()
{
    var query = MakeMyQuery();
    for (int i = 0; i <= 7; i++)
    {
        Console.WriteLine("BEGIN Take({0})", i);
        foreach (var n in query.Take(i))
            Console.WriteLine("    {0}", n);
        Console.WriteLine("END Take({0})", i);
    }
    Console.ReadLine();
}

I had a similar requirement in one of my projects. What I ended up doing is creating a DataAccessLayer (DAL) cache base class that I inherit in the DAL for each of my components. I have a separate caching class that holds the cache. Note that all my objects had ID and Name properties; you can tailor the base class however you need.

DAL Base Class:

public abstract class DALBaseCache<T>
{
    public List<T> ItemList
    {
        get
        {
            List<T> itemList = DALCache.GetItem<List<T>>(typeof(T).Name + "Cache");

            if (itemList != null)
                return itemList;
            else
            {
                itemList = GetItemList();
                DALCache.SetItem(typeof(T).Name + "Cache", itemList);
                return itemList;
            }
        }
    }

    /// <summary>
    /// Get a list of all the Items
    /// </summary>
    /// <returns></returns>
    protected abstract List<T> GetItemList();

    /// <summary>
    /// Get the Item based on the ID
    /// </summary>
    /// <param name="id">ID of the Item to retrieve</param>
    /// <returns>The Item with the given ID</returns>
    public T GetItem(int id)
    {
        return (from item in ItemList
                where (int)item.GetType().GetProperty("ID").GetValue(item, null) == id
                select item).SingleOrDefault();
    }

    /// <summary>
    /// Get the Item based on the Name
    /// </summary>
    /// <param name="name">Name of the Item to retrieve</param>
    /// <returns>The Item with the given Name</returns>
    public T GetItem(string name)
    {
        return (from item in ItemList
                where (string)item.GetType().GetProperty("Name").GetValue(item, null) == name
                select item).SingleOrDefault();
    }
}

Then my caching class, which basically holds a dictionary of my queries:

public static class DALCache
{
    static Dictionary<string, object> _AppCache = new Dictionary<string, object>();

    public static T GetItem<T>(string key)
    {
        if(_AppCache.ContainsKey(key))
        {
            return (T) _AppCache[key];
        }
        else
        {
            return default(T);
        }
    }

    public static void SetItem(string key, object obj)
    {
        _AppCache.Add(key, obj);
    }
}

And finally, an implementation with a cached list. I use EF to get my CustomerType list and cache it for the remainder of the application's lifetime. You can change this as you need.

public class CustomerTypeDAL: DALBaseCache<CustomerType>
{
    protected override List<CustomerType> GetItemList()
    {
        DBEntities entities = new DBEntities();
        return Mapper.Map<List<CustomerType>>(entities.GetAllCustomerTypes().ToList());
    }
}

Anywhere in your code you can use it as:

CustomerTypeDAL customerTypeDAL = new CustomerTypeDAL();
List<CustomerType> custTypes = customerTypeDAL.ItemList;

The first time you call it, it will grab the list from the DB. After that, it will come from the cache.

Yes, it is possible if you cache values while you are iterating.

It will look like this:

var lazyList = MakeMyQuery<int>().ToLazyList();
var sum1 = lazyList.Take(2).Sum();
var sum2 = lazyList.Take(3).Sum();
var sum3 = lazyList.Take(1).Sum();

In this case:

  • The first 3 items are only yielded from MakeMyQuery once in total.
  • The fourth item has never been yielded.

An example of an implementation of a lazy list is here.
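Since the link may not survive, here is one possible sketch of such a ToLazyList extension (my own illustration, not the linked implementation): it memoizes items as the underlying enumerator yields them, so repeated Take calls never re-run the source beyond what has already been produced. Note it is not thread-safe, and enumerator disposal is omitted for brevity.

using System.Collections;
using System.Collections.Generic;

public static class LazyListExtensions
{
    public static IEnumerable<T> ToLazyList<T>(this IEnumerable<T> source)
    {
        return new LazyList<T>(source);
    }

    private sealed class LazyList<T> : IEnumerable<T>
    {
        private readonly List<T> _cache = new List<T>();
        private readonly IEnumerator<T> _source;
        private bool _exhausted;

        public LazyList(IEnumerable<T> source)
        {
            _source = source.GetEnumerator();
        }

        public IEnumerator<T> GetEnumerator()
        {
            // Serve items already pulled from the source, then pull more on demand.
            int index = 0;
            while (true)
            {
                if (index < _cache.Count)
                {
                    yield return _cache[index++];
                }
                else if (!_exhausted && _source.MoveNext())
                {
                    _cache.Add(_source.Current);
                }
                else
                {
                    _exhausted = true;
                    yield break;
                }
            }
        }

        IEnumerator IEnumerable.GetEnumerator()
        {
            return GetEnumerator();
        }
    }
}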

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow