Run function when Dictionary contents change - C#

I have a few ListViews that depend on the contents of Dictionary<string,string>s. I also have one function per dictionary for updating its associated ListView. However, these have to be called manually and this can cause all kinds of problems if forgotten. Is it possible to call a function whenever any value within a dictionary is changed?
My initial thoughts were to use data binding (but ListView doesn't support this in WinForms) or to raise some custom events, but I wouldn't even know where to begin with that.
Is there a simple way to achieve this?
Just for clarification I'm talking about something that has this effect:
Dictionary[key] = "this"; //Automatically runs UpdateDictionaryBox();
Dictionary.Add(key,value); //The same
Dictionary.Remove(key); //And again

All credit to Tim Schmelter as his comment led me to this solution!
Essentially, the easiest method is to create a wrapper class around the Dictionary<string,string> and the ListView, then use this to implement all of the dictionary methods that are needed, calling the update at the end of .Add, .Remove, the [key] = value indexer, etc.
This way, once the class is implemented correctly, it emulates data binding.
My resulting class is:
class DictionaryListViewPseudoBinder : IEnumerable
{
private ListView ListView { get; }
private Dictionary<string,string> Dictionary { get; set; }
public DictionaryListViewPseudoBinder(ListView listView)
{
ListView = listView;
Dictionary = new Dictionary<string, string>();
}
public string this[string key]
{
get
{
return Dictionary.ContainsKey(key) ? Dictionary[key] : "";
}
set
{
if (Dictionary.ContainsKey(key))
{
Dictionary[key] = value;
RepopulateListView();
}
else
{
MessageBox.Show("Dictionary does not contain key " + key + " aborting...");
}
}
}
public void Add(string key, string value)
{
if (!Dictionary.ContainsKey(key))
{
Dictionary.Add(key, value);
RepopulateListView();
}
else
{
MessageBox.Show(string.Format("The Entry \"{0}\" already exists in {1}",key,ListView.Name));
}
}
public void Remove(string key)
{
if (Dictionary.ContainsKey(key))
{
Dictionary.Remove(key);
RepopulateListView(); // keep the ListView in sync after a removal
}
}
public bool ContainsKey(string key)
{
return Dictionary.ContainsKey(key);
}
public bool ContainsKVP(KeyValuePair<string, string> kvp)
{
if (!Dictionary.ContainsKey(kvp.Key))
{
return false;
}
else
{
return Dictionary[kvp.Key] == kvp.Value;
}
}
private void RepopulateListView()
{
ListView.Items.Clear();
foreach (KeyValuePair<string, string> kvp in Dictionary)
{
ListView.Items.Add(kvp.Key).SubItems.Add(kvp.Value);
}
}
public IEnumerator GetEnumerator()
{
return Dictionary.GetEnumerator();
}
}
NB: The above is not fully tested yet and not all methods are fully implemented at this time, but it shows the general framework necessary for this functionality.
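A rough usage sketch (the jobListView control name and the key/value strings here are made up for illustration):
// Construct the binder once, handing it the ListView it should keep in sync
var statusBinder = new DictionaryListViewPseudoBinder(jobListView);

// Each mutation below repopulates the ListView automatically
statusBinder.Add("Status", "Idle");
statusBinder["Status"] = "Running";
statusBinder.Remove("Status");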

Related

Looking for a way to do less locking while caching

I am using the code below to cache items. It's pretty basic.
The issue I have is that every time it caches an item, that section of the code locks. So with roughly a million items arriving every hour or so, this is a problem.
I've tried creating a dictionary of static lock objects per cacheKey, so that locking is granular, but that in itself becomes an issue with managing expiration of them, etc...
Is there a better way to implement minimal locking?
private static readonly object cacheLock = new object();
public static T GetFromCache<T>(string cacheKey, Func<T> GetData) where T : class {
// Returns null if the string does not exist, prevents a race condition
// where the cache invalidates between the contains check and the retrieval.
T cachedData = MemoryCache.Default.Get(cacheKey) as T;
if (cachedData != null) {
return cachedData;
}
lock (cacheLock) {
// Check to see if anyone wrote to the cache while we were
// waiting our turn to write the new value.
cachedData = MemoryCache.Default.Get(cacheKey) as T;
if (cachedData != null) {
return cachedData;
}
// The value still did not exist so we now write it in to the cache.
cachedData = GetData();
MemoryCache.Default.Set(cacheKey, cachedData, new CacheItemPolicy(...));
return cachedData;
}
}
You may want to consider using ReaderWriterLockSlim, with which you can take a write lock only when needed.
Using cacheLock.EnterReadLock(); and cacheLock.EnterWriteLock(); should greatly improve the performance.
That link even has an example of a cache, which is exactly what you need; I copy it here:
public class SynchronizedCache
{
private ReaderWriterLockSlim cacheLock = new ReaderWriterLockSlim();
private Dictionary<int, string> innerCache = new Dictionary<int, string>();
public int Count
{ get { return innerCache.Count; } }
public string Read(int key)
{
cacheLock.EnterReadLock();
try
{
return innerCache[key];
}
finally
{
cacheLock.ExitReadLock();
}
}
public void Add(int key, string value)
{
cacheLock.EnterWriteLock();
try
{
innerCache.Add(key, value);
}
finally
{
cacheLock.ExitWriteLock();
}
}
public bool AddWithTimeout(int key, string value, int timeout)
{
if (cacheLock.TryEnterWriteLock(timeout))
{
try
{
innerCache.Add(key, value);
}
finally
{
cacheLock.ExitWriteLock();
}
return true;
}
else
{
return false;
}
}
public AddOrUpdateStatus AddOrUpdate(int key, string value)
{
cacheLock.EnterUpgradeableReadLock();
try
{
string result = null;
if (innerCache.TryGetValue(key, out result))
{
if (result == value)
{
return AddOrUpdateStatus.Unchanged;
}
else
{
cacheLock.EnterWriteLock();
try
{
innerCache[key] = value;
}
finally
{
cacheLock.ExitWriteLock();
}
return AddOrUpdateStatus.Updated;
}
}
else
{
cacheLock.EnterWriteLock();
try
{
innerCache.Add(key, value);
}
finally
{
cacheLock.ExitWriteLock();
}
return AddOrUpdateStatus.Added;
}
}
finally
{
cacheLock.ExitUpgradeableReadLock();
}
}
public void Delete(int key)
{
cacheLock.EnterWriteLock();
try
{
innerCache.Remove(key);
}
finally
{
cacheLock.ExitWriteLock();
}
}
public enum AddOrUpdateStatus
{
Added,
Updated,
Unchanged
};
~SynchronizedCache()
{
if (cacheLock != null) cacheLock.Dispose();
}
}
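For the original GetFromCache, the same idea might look roughly like the sketch below. This is not a drop-in replacement: the CacheItemPolicy is left as a default placeholder because the original constructor arguments were elided, and MemoryCache.Default is itself thread-safe, so the locks here only coordinate the "fetch on miss" step.
private static readonly ReaderWriterLockSlim cacheLock = new ReaderWriterLockSlim();
public static T GetFromCache<T>(string cacheKey, Func<T> GetData) where T : class {
    cacheLock.EnterReadLock();
    try {
        T cachedData = MemoryCache.Default.Get(cacheKey) as T;
        if (cachedData != null) {
            return cachedData;
        }
    }
    finally {
        cacheLock.ExitReadLock();
    }
    cacheLock.EnterWriteLock();
    try {
        // Re-check in case another thread populated the entry while we waited for the write lock
        T cachedData = MemoryCache.Default.Get(cacheKey) as T;
        if (cachedData == null) {
            cachedData = GetData();
            MemoryCache.Default.Set(cacheKey, cachedData, new CacheItemPolicy());
        }
        return cachedData;
    }
    finally {
        cacheLock.ExitWriteLock();
    }
}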
I don't know how MemoryCache.Default is implemented, or whether or not you have control over it.
But in general, prefer using a ConcurrentDictionary over a Dictionary with a lock in a multithreaded environment.
GetFromCache would just become
ConcurrentDictionary<string, T> cache = new ConcurrentDictionary<string, T>();
...
cache.GetOrAdd("someKey", (key) =>
{
var data = PullDataFromDatabase(key);
return data;
});
There are two more things to take care of.
Expiry
Instead of saving T as the value of the dictionary, you can define a type
struct CacheItem<T>
{
public T Item { get; set; }
public DateTime Expiry { get; set; }
}
And store the cache as a CacheItem with a defined expiry.
cache.GetOrAdd("someKey", (key) =>
{
var data = PullDataFromDatabase(key);
return new CacheItem<T>() { Item = data, Expiry = DateTime.UtcNow.Add(TimeSpan.FromHours(1)) };
});
Now you can implement expiration on a background timer.
Timer expirationTimer = new Timer(ExpireCache, null, 60000, 60000);
...
void ExpireCache(object state)
{
var needToExpire = cache.Where(c => DateTime.UtcNow >= c.Value.Expiry).Select(c => c.Key);
foreach (var key in needToExpire)
{
cache.TryRemove(key, out CacheItem<T> _);
}
}
Once a minute, you search for all cache entries that need to be expired, and remove them.
"Locking"
Using ConcurrentDictionary guarantees that simultaneous read/writes won't corrupt the dictionary or throw an exception.
But you can still end up with a situation where two simultaneous requests for a missing key cause you to fetch the data from the database twice.
One neat trick to solve this is to wrap the value of the dictionary in Lazy<T>:
ConcurrentDictionary<string, Lazy<CacheItem<T>>> cache = new ConcurrentDictionary<string, Lazy<CacheItem<T>>>();
...
var data = cache.GetOrAdd("someKey", key => new Lazy<CacheItem<T>>(() =>
{
var data = PullDataFromDatabase(key);
return new CacheItem<T>() { Item = data, Expiry = DateTime.UtcNow.Add(TimeSpan.FromHours(1)) };
})).Value;
Explanation
With GetOrAdd you might end up invoking the "get from database if not in cache" delegate multiple times in the case of simultaneous requests.
However, GetOrAdd will only ever keep one of the values that the delegates returned, and because that value is a Lazy, you guarantee that only one Lazy value factory is actually invoked.
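Putting those pieces together, a consolidated sketch might look like this (the LazyCache<T> class name and the fixed one-hour expiry are illustrative choices, not from the answer above):
public static class LazyCache<T> where T : class
{
    private static readonly ConcurrentDictionary<string, Lazy<CacheItem<T>>> cache =
        new ConcurrentDictionary<string, Lazy<CacheItem<T>>>();

    public static T GetFromCache(string cacheKey, Func<T> getData)
    {
        // GetOrAdd may construct more than one Lazy under contention,
        // but only the stored one ever has its value factory run
        var lazy = cache.GetOrAdd(cacheKey, key => new Lazy<CacheItem<T>>(() =>
            new CacheItem<T> { Item = getData(), Expiry = DateTime.UtcNow.AddHours(1) }));
        return lazy.Value.Item;
    }
}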

Call code once when controller is first accessed [duplicate]

I have read lots of information about page caching and partial page caching in an MVC application. However, I would like to know how you would cache data.
In my scenario I will be using LINQ to Entities (entity framework). On the first call to GetNames (or whatever the method is) I want to grab the data from the database. I want to save the results in cache and on the second call to use the cached version if it exists.
Can anyone show an example of how this would work, where it should be implemented (the model?), and whether it would work?
I have seen this done in traditional ASP.NET apps, typically for very static data.
Here's a nice and simple cache helper class/service I use:
using System.Runtime.Caching;
public class InMemoryCache: ICacheService
{
public T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class
{
T item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(10));
}
return item;
}
}
interface ICacheService
{
T GetOrSet<T>(string cacheKey, Func<T> getItemCallback) where T : class;
}
Usage:
cacheProvider.GetOrSet("cache key", (delegate method if cache is empty));
The cache provider will check if there's anything by the name of "cache key" in the cache, and if there isn't, it will call the delegate method to fetch the data and store it in the cache.
Example:
var products=cacheService.GetOrSet("catalog.products", ()=>productRepository.GetAll())
Reference the System.Web dll in your model and use System.Web.Caching.Cache
public string[] GetNames()
{
string[] names = Cache["names"] as string[];
if(names == null) //not in cache
{
names = DB.GetNames();
Cache["names"] = names;
}
return names;
}
A bit simplified but I guess that would work. This is not MVC specific and I have always used this method for caching data.
I'm referring to TT's post and suggest the following approach:
Reference the System.Web dll in your model and use System.Web.Caching.Cache
public string[] GetNames()
{
var noms = Cache["names"];
if(noms == null)
{
noms = DB.GetNames();
Cache["names"] = noms;
}
return ((string[])noms);
}
You should not return a value re-read from the cache, since you'll never know if at that specific moment it is still in the cache. Even if you inserted it in the statement before, it might already be gone or might never have been added to the cache - you just don't know.
So you add the data read from the database and return it directly, not re-reading from the cache.
For .NET 4.5+ framework
add reference: System.Runtime.Caching
add using statement:
using System.Runtime.Caching;
public string[] GetNames()
{
var noms = System.Runtime.Caching.MemoryCache.Default["names"];
if(noms == null)
{
noms = DB.GetNames();
System.Runtime.Caching.MemoryCache.Default["names"] = noms;
}
return ((string[])noms);
}
In the .NET Framework 3.5 and earlier versions, ASP.NET provided an in-memory cache implementation in the System.Web.Caching namespace. In previous versions of the .NET Framework, caching was available only in the System.Web namespace and therefore required a dependency on ASP.NET classes. In the .NET Framework 4, the System.Runtime.Caching namespace contains APIs that are designed for both Web and non-Web applications.
More info:
https://msdn.microsoft.com/en-us/library/dd997357(v=vs.110).aspx
https://learn.microsoft.com/en-us/dotnet/framework/performance/caching-in-net-framework-applications
Steve Smith did two great blog posts which demonstrate how to use his CachedRepository pattern in ASP.NET MVC. It uses the repository pattern effectively and allows you to get caching without having to change your existing code.
http://ardalis.com/Introducing-the-CachedRepository-Pattern
http://ardalis.com/building-a-cachedrepository-via-strategy-pattern
In these two posts he shows you how to set up this pattern and also explains why it is useful. By using this pattern you get caching without your existing code seeing any of the caching logic. Essentially you use the cached repository as if it were any other repository.
I have used it in this way and it works for me.
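For illustration, here is a minimal sketch of the pattern under the assumption of a simple IProductRepository (the interface, the Product type, the cache key and the 10-minute expiry are made up here, not taken from the posts): a caching decorator wraps the real repository, and callers keep consuming the same interface.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Caching;

public interface IProductRepository
{
    IEnumerable<Product> GetAll();
}

public class CachedProductRepository : IProductRepository
{
    private readonly IProductRepository inner;

    public CachedProductRepository(IProductRepository inner)
    {
        this.inner = inner;
    }

    public IEnumerable<Product> GetAll()
    {
        // Hit the real repository only on a cache miss
        var products = MemoryCache.Default.Get("products.all") as List<Product>;
        if (products == null)
        {
            products = inner.GetAll().ToList();
            MemoryCache.Default.Add("products.all", products, DateTimeOffset.Now.AddMinutes(10));
        }
        return products;
    }
}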
https://msdn.microsoft.com/en-us/library/system.web.caching.cache.add(v=vs.110).aspx
Parameter info for System.Web.Caching.Cache.Add.
public string GetInfo()
{
string name = string.Empty;
if(System.Web.HttpContext.Current.Cache["KeyName"] == null)
{
name = GetNameMethod();
System.Web.HttpContext.Current.Cache.Add("KeyName", name, null, DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration, CacheItemPriority.AboveNormal, null);
}
else
{
name = System.Web.HttpContext.Current.Cache["KeyName"] as string;
}
return name;
}
AppFabric Caching is a distributed, in-memory caching technology that stores data in key-value pairs using physical memory across multiple servers. AppFabric provides performance and scalability improvements for .NET Framework applications. See Concepts and Architecture.
Extending #Hrvoje Hudo's answer...
Code:
using System;
using System.Runtime.Caching;
public class InMemoryCache : ICacheService
{
public TValue Get<TValue>(string cacheKey, int durationInMinutes, Func<TValue> getItemCallback) where TValue : class
{
TValue item = MemoryCache.Default.Get(cacheKey) as TValue;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(durationInMinutes));
}
return item;
}
public TValue Get<TValue, TId>(string cacheKeyFormat, TId id, int durationInMinutes, Func<TId, TValue> getItemCallback) where TValue : class
{
string cacheKey = string.Format(cacheKeyFormat, id);
TValue item = MemoryCache.Default.Get(cacheKey) as TValue;
if (item == null)
{
item = getItemCallback(id);
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.AddMinutes(durationInMinutes));
}
return item;
}
}
interface ICacheService
{
TValue Get<TValue>(string cacheKey, int durationInMinutes, Func<TValue> getItemCallback) where TValue : class;
TValue Get<TValue, TId>(string cacheKeyFormat, TId id, int durationInMinutes, Func<TId, TValue> getItemCallback) where TValue : class;
}
Examples
Single item caching (when each item is cached based on its ID because caching the entire catalog for the item type would be too intensive).
Product product = cache.Get("product_{0}", productId, 10, productData.getProductById);
Caching all of something
IEnumerable<Categories> categories = cache.Get("categories", 20, categoryData.getCategories);
Why TId
The second helper is especially nice because most data keys are not composite. Additional methods could be added if you use composite keys often. In this way you avoid doing all sorts of string concatenation or string.Format calls to get the key to pass to the cache helper. It also makes passing the data access method easier because you don't have to pass the ID into the wrapper method... the whole thing becomes very terse and consistent for the majority of use cases.
Here's an improvement to Hrvoje Hudo's answer. This implementation has a couple of key improvements:
Cache keys are created automatically based on the function to update data and the object passed in that specifies dependencies
Pass in time span for any cache duration
Uses a lock for thread safety
Note that this has a dependency on Newtonsoft.Json to serialize the dependsOn object, but that can be easily swapped out for any other serialization method.
ICache.cs
public interface ICache
{
T GetOrSet<T>(Func<T> getItemCallback, object dependsOn, TimeSpan duration) where T : class;
}
InMemoryCache.cs
using System;
using System.Reflection;
using System.Runtime.Caching;
using Newtonsoft.Json;
public class InMemoryCache : ICache
{
private static readonly object CacheLockObject = new object();
public T GetOrSet<T>(Func<T> getItemCallback, object dependsOn, TimeSpan duration) where T : class
{
string cacheKey = GetCacheKey(getItemCallback, dependsOn);
T item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
lock (CacheLockObject)
{
// Re-check inside the lock so only one thread ends up calling getItemCallback
item = MemoryCache.Default.Get(cacheKey) as T;
if (item == null)
{
item = getItemCallback();
MemoryCache.Default.Add(cacheKey, item, DateTime.Now.Add(duration));
}
}
}
return item;
}
private string GetCacheKey<T>(Func<T> itemCallback, object dependsOn) where T: class
{
var serializedDependants = JsonConvert.SerializeObject(dependsOn);
// Use the callback's target method rather than the delegate type, so different callbacks produce different keys
var method = itemCallback.Method;
return method.DeclaringType.FullName + "." + method.Name + serializedDependants;
}
}
Usage:
var order = _cache.GetOrSet(
() => _session.Set<Order>().SingleOrDefault(o => o.Id == orderId)
, new { id = orderId }
, new TimeSpan(0, 10, 0)
);
public sealed class CacheManager
{
private static volatile CacheManager instance;
private static object syncRoot = new Object();
private ObjectCache cache = null;
private CacheItemPolicy defaultCacheItemPolicy = null;
private CacheEntryRemovedCallback callback = null;
private bool allowCache = true;
private CacheManager()
{
cache = MemoryCache.Default;
callback = new CacheEntryRemovedCallback(this.CachedItemRemovedCallback);
defaultCacheItemPolicy = new CacheItemPolicy();
defaultCacheItemPolicy.AbsoluteExpiration = DateTime.Now.AddHours(1.0);
defaultCacheItemPolicy.RemovedCallback = callback;
allowCache = StringUtils.Str2Bool(ConfigurationManager.AppSettings["AllowCache"]);
}
public static CacheManager Instance
{
get
{
if (instance == null)
{
lock (syncRoot)
{
if (instance == null)
{
instance = new CacheManager();
}
}
}
return instance;
}
}
public IEnumerable GetCache(String Key)
{
if (Key == null || !allowCache)
{
return null;
}
try
{
String Key_ = Key;
if (cache.Contains(Key_))
{
return (IEnumerable)cache.Get(Key_);
}
else
{
return null;
}
}
catch (Exception)
{
return null;
}
}
public void ClearCache(string key)
{
// Remove the entry; MemoryCache does not accept null values, so caching null would just throw and be swallowed
cache.Remove(key);
}
public bool AddCache(String Key, IEnumerable data, CacheItemPolicy cacheItemPolicy = null)
{
if (!allowCache) return true;
try
{
if (Key == null)
{
return false;
}
if (cacheItemPolicy == null)
{
cacheItemPolicy = defaultCacheItemPolicy;
}
String Key_ = Key;
lock (Key_)
{
return cache.Add(Key_, data, cacheItemPolicy);
}
}
catch (Exception)
{
return false;
}
}
private void CachedItemRemovedCallback(CacheEntryRemovedArguments arguments)
{
String strLog = String.Concat("Reason: ", arguments.RemovedReason.ToString(), " | Key-Name: ", arguments.CacheItem.Key, " | Value-Object: ", arguments.CacheItem.Value.ToString());
LogManager.Instance.Info(strLog);
}
}
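A hypothetical usage sketch of the cache manager above (the "products" key, the Product type and the LoadProductsFromDatabase call are made up for illustration):
// Try the cache first, fall back to the database and cache the result
var products = CacheManager.Instance.GetCache("products") as List<Product>;
if (products == null)
{
    products = LoadProductsFromDatabase();
    CacheManager.Instance.AddCache("products", products);
}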
I use two classes. First one the cache core object:
public class Cacher<TValue>
where TValue : class
{
#region Properties
private Func<TValue> _init;
public string Key { get; private set; }
public TValue Value
{
get
{
var item = HttpRuntime.Cache.Get(Key) as TValue;
if (item == null)
{
item = _init();
HttpContext.Current.Cache.Insert(Key, item);
}
return item;
}
}
#endregion
#region Constructor
public Cacher(string key, Func<TValue> init)
{
Key = key;
_init = init;
}
#endregion
#region Methods
public void Refresh()
{
HttpRuntime.Cache.Remove(Key);
}
#endregion
}
Second one is list of cache objects:
public static class Caches
{
static Caches()
{
Languages = new Cacher<IEnumerable<Language>>("Languages", () =>
{
using (var context = new WordsContext())
{
return context.Languages.ToList();
}
});
}
public static Cacher<IEnumerable<Language>> Languages { get; private set; }
}
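Usage is then just a matter of reading the cached value and refreshing it when needed, for example:
// Loaded from the database on first access, then served from cache
var languages = Caches.Languages.Value;

// Force a reload on the next access
Caches.Languages.Refresh();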
I would say implementing a singleton to hold the persisted data can also be a solution here, in case you find the previous solutions too complicated.
public class GPDataDictionary
{
private Dictionary<string, object> configDictionary = new Dictionary<string, object>();
/// <summary>
/// Configuration values dictionary
/// </summary>
public Dictionary<string, object> ConfigDictionary
{
get { return configDictionary; }
}
private static GPDataDictionary instance;
public static GPDataDictionary Instance
{
get
{
if (instance == null)
{
instance = new GPDataDictionary();
}
return instance;
}
}
// private constructor
private GPDataDictionary() { }
} // singleton
HttpContext.Current.Cache.Insert("subjectlist", subjectlist);
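A hypothetical usage sketch of the singleton above (the "SubjectList" key is illustrative):
// Store a value once...
GPDataDictionary.Instance.ConfigDictionary["SubjectList"] = subjectlist;

// ...and read it back anywhere else in the application
var cached = GPDataDictionary.Instance.ConfigDictionary["SubjectList"];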
You can also try and use the caching built into ASP MVC:
Add the following attribute to the controller method you'd like to cache:
[OutputCache(Duration=10)]
In this case the ActionResult of this will be cached for 10 seconds.
More on this here

Creating a custom property class for multiple re-use within a class

Suppose I have a C# class that has multiple properties that all look like this:
private bool _var1Dirty = true;
private Double? _var1;
public Double? Var1
{
get
{
if (_var1Dirty)
{
_var1 = Method_Var1();
_var1Dirty = false;
}
return _var1;
}
}
And the only differences between each of these properties would be:
The type of return var (in this case Double?, but could just as easily be int, string, etc)
The method call - Method_Var1() (Each property would have a different one)
Is there any way I could write this as a custom class?
Something along the lines of:
public class Prop
{
public delegate T Func();
private bool _dirty = true;
private T _val;
public T Val
{
get
{
if (_dirty)
{
_val = Func;
_dirty = false;
}
return _val;
}
}
}
And then I could pass into it the:
Return type T
Method Func
(PS - I know this won't compile / is dead wrong, but I wanted to give an idea of what I'm looking for)
Any help / guidance would be really appreciated.
Thanks!!!
You're close. You can do something along the lines of this:
public class Dirty<T>
{
public Dirty(Func<T> valueFactory)
{
this.valueFactory = valueFactory;
dirty = true;
}
private Func<T> valueFactory;
private bool dirty;
private T value;
public T Value
{
get
{
if (dirty)
{
value = valueFactory();
dirty = false;
}
return value;
}
}
}
And you consume it like this:
Dirty<double?> dirtyDouble = new Dirty<double?>(() => SomethingThatReturnsADouble());
double? value = dirtyDouble.Value;
I'm not sure what the dirty checking actually does, but if you need something more complicated than a bool you can always turn it into some Func<bool> that checks for dirtiness.
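For example, a variation of Dirty<T> that takes a caller-supplied staleness check might look like this sketch (the isDirty delegate is an assumed design, not part of the answer above):
public class Dirty<T>
{
    private readonly Func<T> valueFactory;
    private readonly Func<bool> isDirty;   // caller decides when the cached value is stale
    private T value;
    private bool loaded;

    public Dirty(Func<T> valueFactory, Func<bool> isDirty)
    {
        this.valueFactory = valueFactory;
        this.isDirty = isDirty;
    }

    public T Value
    {
        get
        {
            // Recompute on first access or whenever the caller's check reports staleness
            if (!loaded || isDirty())
            {
                value = valueFactory();
                loaded = true;
            }
            return value;
        }
    }
}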
Edit:
Given @mikez's comment and your answer, you can save yourself the creation of the Dirty<T> class by using the built-in Lazy<T>, which also guarantees thread safety:
public class F
{
private Lazy<double?> lazyDouble = new Lazy<double?>(() =>
MethodThatReturnsNullableDouble(), true);
public double? Value
{
get
{
return lazyDouble.Value;
}
}
}

How to sort a DataGridView that is bound to a collection of custom objects?

So I have been following this guide for data binding on Windows Forms controls (MAD props to the author, this guide is great), and I have used this to create a custom class and bind a DataGridView to a collection of this class:
class CompleteJobListEntry
{
private string _jobName;
private Image _jobStatus;
private DateTime _jobAddedDate;
private string _jobAddedScanner;
private string _jobAddedUser;
private string _jobLastActivity;
private DateTime _jobActivityDate;
private string _jobActivityUser;
public string JobName { get { return _jobName; } set { this._jobName = value; } }
public Image JobStatus { get { return _jobStatus; } set { this._jobStatus = value; } }
public DateTime JobAddedDate { get { return _jobAddedDate; } set { this._jobAddedDate = value; } }
public string JobAddedScanner { get { return _jobAddedScanner; } set { this._jobAddedScanner = value; } }
public string JobAddedUser { get { return _jobAddedUser; } set { this._jobAddedUser = value; } }
public string JobLastActivity { get { return _jobLastActivity; } set { this._jobLastActivity = value; } }
public DateTime JobActivityDate { get { return _jobActivityDate; } set { this._jobActivityDate = value; } }
public string JobActivityUser { get { return _jobActivityUser; } set { this._jobActivityUser = value; } }
}
At this point, I import a bunch of data from various SQL databases to populate the table, and it turns out great. The guide even provides an excellent starting point for adding filters, which I intend to follow a bit later. For now, though, I am stuck on the sorting of my newly generated DataGridView. Looking around, I've discovered that the DataGridView has its own Sort method, usable like:
completeJobListGridView.Sort(completeJobListGridView.Columns["JobName"], ListSortDirection.Ascending);
However, when I try to do this, I get an InvalidOperationException that tells me "DataGridView control cannot be sorted if it is bound to an IBindingList that does not support sorting." I've found both the IBindingList and IBindingListView interfaces, but making my class inherit either of these interfaces doesn't solve the problem.
How do I do this? I am completely stuck here...
If your data is in a collection, you should be able to use the BindingListView library to easily add sorting capabilities to your DGV. See How do I implement automatic sorting of DataGridView? and my answer to How to Sort WinForms DataGridView bound to EF EntityCollection<T> for more information and code snippets.
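For example, assuming the BindingListView library (Equin.ApplicationFramework) is referenced, the binding might look roughly like this sketch:
using Equin.ApplicationFramework;

// Wrap the plain list in a BindingListView so the grid gets a sortable binding list
var jobs = new List<CompleteJobListEntry>();   // populated from the SQL queries
var view = new BindingListView<CompleteJobListEntry>(jobs);
completeJobListGridView.DataSource = view;

// Sorting now works, either by clicking a column header or in code:
completeJobListGridView.Sort(
    completeJobListGridView.Columns["JobName"], ListSortDirection.Ascending);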

Composite Pattern Simplified

What do I lose by not implementing the Component and treating everything as a Composite?
I have given up the implementation for the Leaf node, i.e.:
class Component : IComponent
{
/*...*/
}
Now please take a look at my code.
public interface IComponent
{
int ID { get;set; }
string Name { get;set;}
void Add(IComponent item);
void Remove(IComponent item);
List<IComponent> Items { get; }
void Show();
}
public class Composite : IComponent
{
private int _id;
public int ID
{
get { return _id; }
set { _id = value; }
}
private string _name;
public string Name
{
get { return _name; }
set { _name = value; }
}
public Composite(int id, string name)
{
_id = id;
_name = name;
}
private List<IComponent> _items = new List<IComponent>();
public void Add(IComponent item)
{
_items.Add(item);
}
public void Remove(IComponent item)
{
_items.Remove(item);
}
public List<IComponent> Items
{
get
{
return new List<IComponent>(_items);
}
}
public void Show()
{
Console.WriteLine("ID=" + _id + "; Name=" + _name);
}
}
class Program
{
static void Main(string[] args)
{
IComponent root = new Composite(1, "World");
IComponent asia = new Composite(2, "Asia");
IComponent europe = new Composite(3, "Europe");
root.Add(asia);
root.Add(europe);
asia.Add(new Composite(4, "China"));
asia.Add(new Composite(5, "Japan"));
europe.Add(new Composite(6, "Germany"));
europe.Add(new Composite(7, "Russia"));
root.Show();
Program.Traverse(root.Items);
Console.ReadLine();
}
static void Traverse(List<IComponent> items)
{
foreach (IComponent c in items)
{
c.Show();
Traverse(c.Items);
}
}
}
What is wrong with this approach of Composite Pattern? What kind of problem can I face with this type of design?
You're giving up any chance to subclass the "Leaf"; if it turns out you have different types of "nodes", you'll probably end up polluting the structure in one way or another. And you're violating the single responsibility principle too. It is very easy to get pollution of all sorts with the composite pattern, and I think it always pays off to do it cleanly.
If I understand correctly, there is no concept of a leaf node in the composite pattern.
Any node which doesn't have a child is automatically a leaf node.
Looking at your code, this is not needed
private List<IComponent> _items = new List<IComponent>();
public void Add(IComponent item)
{
_items.Add(item);
}
public void Remove(IComponent item)
{
_items.Remove(item);
}
public List<IComponent> Items
{
get
{
return new List<IComponent>(_items);
}
}
I am looking at the ControlCollection class, which is exposed through the Controls property of the Control class.
Though not exactly the composite pattern, every Control knows which control it is a child of, which is missing from your code.
My understanding could be totally off. Experts can correct me :)
EDIT: I looked at the dofactory reference, which does seem to have a concept of a leaf class in the composite pattern. My mistake for not understanding it.
But I would suggest you look at the way .NET implements a kind of composite pattern with the Control, ControlCollection and related classes.
EDIT2: If the code above is to be removed, you would have another class which is a collection of IComponent, exposed through an IList<IComponent> property, which in turn has methods to add/remove.
EDIT3: .NET doesn't restrict the user from adding child controls in the above class hierarchy. You could use the dofactory way of designing it if you want to restrict someone from adding children to a leaf node (one that doesn't have any child nodes).
EDIT4: As the dofactory code shows, you will have to define a leaf node, which throws NotImplementedException for Add/Remove.
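For reference, a dofactory-style leaf for the IComponent interface above would look something like this sketch (NotSupportedException is used here, though the dofactory sample throws NotImplementedException):
public class Leaf : IComponent
{
    public int ID { get; set; }
    public string Name { get; set; }

    public Leaf(int id, string name)
    {
        ID = id;
        Name = name;
    }

    // A leaf has no children, so the child-management members are rejected or empty
    public void Add(IComponent item)
    {
        throw new NotSupportedException("A leaf cannot contain children.");
    }

    public void Remove(IComponent item)
    {
        throw new NotSupportedException("A leaf cannot contain children.");
    }

    public List<IComponent> Items
    {
        get { return new List<IComponent>(); }
    }

    public void Show()
    {
        Console.WriteLine("ID=" + ID + "; Name=" + Name);
    }
}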
