Threading with a List property - C#

public static class People
{
    public static List<string> names { get; set; }
}

public class Threading
{
    public static async Task DoSomething()
    {
        var t1 = Task1("bob");
        var t2 = Task1("erin");
        await Task.WhenAll(t1, t2);
    }

    private static async Task Task1(string name)
    {
        await Task.Run(() =>
        {
            if (People.names == null) People.names = new List<string>();
            People.names.Add(name);
        });
    }
}
Is it dangerous to initialize the list from within a thread? Is it possible that both threads could initialize the list and one of the names would be lost?
So I was thinking of three options:
1. Leave it like this since it is simple - but only if it is safe.
2. Keep the same code but use a ConcurrentBag - I know it is thread-safe for adds, but is initializing it inside the thread safe? (see the sketch after this list)
3. Use something like [DataMember(EmitDefaultValue = new List())] (i.e. have the list initialized up front) and then just call .Add in Task1 without worrying about initialization. The only con is that sometimes the list won't be needed at all, and it seems like a waste to initialize it every time.
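For reference, here is a minimal sketch of option 2, assuming the bag is created eagerly in a field initializer (the runtime runs static initializers exactly once and in a thread-safe way, so no thread ever races to create the collection):
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class People
{
    // Eagerly initialized; no thread ever has to check for null or create it.
    public static ConcurrentBag<string> Names { get; } = new ConcurrentBag<string>();
}

public class Threading
{
    public static async Task DoSomething()
    {
        var t1 = Task.Run(() => People.Names.Add("bob"));
        var t2 = Task.Run(() => People.Names.Add("erin"));
        await Task.WhenAll(t1, t2);
    }
}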

Okay, so what I figured worked best for my case was to use a lock statement.
public class Class1
{
    private static readonly object thisLock = new object();

    private static async Task Task1(string name)
    {
        await Task.Run(() =>
        {
            AddToList(name);
        });
    }

    private static void AddToList(string name)
    {
        lock (thisLock)
        {
            if (People.names == null) People.names = new List<string>();
            People.names.Add(name);
        }
    }
}

public static class People
{
    public static List<string> names { get; set; }
}

For a simple case like this, the easiest way to get thread-safety is to use the lock statement:
public static class People
{
    static List<string> _names = new List<string>();

    public static void AddName(string name)
    {
        lock (_names)
        {
            _names.Add(name);
        }
    }

    public static IEnumerable<string> GetNames()
    {
        lock (_names)
        {
            return _names.ToArray();
        }
    }
}

public class Threading
{
    public static async Task DoSomething()
    {
        var t1 = Task1("bob");
        var t2 = Task1("erin");
        await Task.WhenAll(t1, t2);
    }

    private static async Task Task1(string name)
    {
        await Task.Run(() => People.AddName(name));
    }
}
Of course it's not very useful (why not just add without the threads) - but I hope you get the idea.
If you don't use some kind of lock and you concurrently read and write to a List, you will most likely get an InvalidOperationException saying the collection was modified during enumeration.
Because you don't really know when a caller will use the collection you return, the easiest way to get thread-safety is to copy the collection into an array and return that.
If this is not practical (the collection is too large, ...), you have to use the classes in System.Collections.Concurrent, for example BlockingCollection, but those are a bit more involved.
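For completeness, a minimal sketch of the concurrent-collection route, assuming a ConcurrentQueue fits this use case (the type choice and names here are illustrative, not part of the original answer):
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class People
{
    // Thread-safe without explicit locking; ToArray() takes a consistent snapshot.
    private static readonly ConcurrentQueue<string> _names = new ConcurrentQueue<string>();

    public static void AddName(string name) => _names.Enqueue(name);

    public static string[] GetNames() => _names.ToArray();
}

public class Threading
{
    public static async Task DoSomething()
    {
        await Task.WhenAll(
            Task.Run(() => People.AddName("bob")),
            Task.Run(() => People.AddName("erin")));
    }
}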

Related

Signal and Wait in C#

In the application below, two parties call ChannelReservationCache to fetch or add information.
I want to use "signal and wait" in my ChannelReservationCache class so that, if ChannelReservationCache.AddChannelState() is adding to the cache while a Web API call hits ChannelReservationCache.GetChannel() in parallel, GetChannel() waits for AddChannelState() to finish, and vice versa.
How can this be done in the ChannelReservationCache class?
Will there be any deadlock?
public class ChannelReservationCache
{
    private readonly IDictionary<int, Channel> channelStates = new Dictionary<int, Channel>();
    private readonly object lockObject = new object();
    private static readonly object instanceLock = new object();
    private static ChannelReservationCache instance = null;

    private ChannelReservationCache() { }

    public static ChannelReservationCache Instance
    {
        get
        {
            lock (instanceLock)
            {
                if (instance == null)
                {
                    instance = new ChannelReservationCache();
                }
                return instance;
            }
        }
    }

    public void AddChannelState(int level, string channel)
    {
        lock (this.lockObject)
        {
            // other code that makes the method take a long time.
            this.AddChannel(level, channel);
        }
    }

    public Channel GetChannel(int level)
    {
        // other code that makes the method take a long time.
        channelStates.TryGetValue(level, out Channel c);
        return c;
    }

    private void AddChannel(int level, string channel)
    {
        Channel c = new Channel();
        c.ChannelName = channel;
        c.IsActive = true;
        channelStates.Add(level, c);
    }
}
public class Channel
{
    public string ChannelName { get; set; }
    public bool IsActive { get; set; }
}
public class RMQRequestHandler
{
    public Task HandleChannelRequest(int level, Channel message)
    {
        ChannelReservationCache.Instance.AddChannelState(level, message.ChannelName);
        return Task.CompletedTask;
    }
}
[Route("api/v1")]
public class ChannnelController: ControllerBase
{
[HttpGet]
[Route("ChannelResource")]
public IActionResult GetChannelResource([FromQuery] int id)
{
ChannelReservationCache crc = ChannelReservationCache.Instance.GetChannel(id);
return this.Ok(crc);
}
}
First off, there is a far simpler solution for you:
Simply change
private readonly IDictionary<int, Channel> channelStates = new Dictionary<int, Channel>();
To:
private readonly IDictionary<int, Channel> channelStates = new System.Collections.Concurrent.ConcurrentDictionary<int, Channel>();
// using ConcurrentDictionary instead of Dictionary
And forget about the thread-concurrency locking, etc.
In reality it is pretty hard to beat the performance of ConcurrentDictionary by writing our own locking structure around a normal Dictionary. It is possible, using ReaderWriterLockSlim to lock the dictionary and Interlocked to maintain a custom count property, but that is a micro-optimization that would only pay off over millions of iterations.
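A minimal sketch of what the cache could look like after that change (only the dictionary-touching members are shown; TryAdd and TryGetValue are the lock-free counterparts of the original Add and TryGetValue calls):
using System.Collections.Concurrent;

public class ChannelReservationCache
{
    private readonly ConcurrentDictionary<int, Channel> channelStates =
        new ConcurrentDictionary<int, Channel>();

    public void AddChannelState(int level, string channel)
    {
        // TryAdd is atomic and returns false instead of throwing if the key already exists.
        channelStates.TryAdd(level, new Channel { ChannelName = channel, IsActive = true });
    }

    public Channel GetChannel(int level)
    {
        // Safe to call concurrently with writers.
        channelStates.TryGetValue(level, out Channel c);
        return c;
    }
}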
Now to answer your question:
One issue here:
AddChannelState is thread-safe, but GetChannel is not.
Think of it this way: your writer takes a lock, but your reader does not. GetChannel also needs a lock in it.
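For illustration, the locked reader could look like this (a sketch that reuses the lockObject the writer already uses):
public Channel GetChannel(int level)
{
    // Lock on the same object as AddChannelState so reads and writes exclude each other.
    lock (this.lockObject)
    {
        channelStates.TryGetValue(level, out Channel c);
        return c;
    }
}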
Suggestion
If you are not using Lazy<T> in your singleton, then it is probably best to make the following change.
The code example below avoids the cost of thread synchronization after the instance has been initialized. Bear in mind that lock (i.e. Monitor.Enter / Exit) is one of the more expensive operations to perform. Sure, there will be minimal lock contention once the instance is initialized, but the memory barrier is enforced each and every time and the monitor is still checked.
The following reference discusses Interlocked, but the context of the memory-barrier cost is the same:
https://learn.microsoft.com/en-us/archive/msdn-magazine/2005/october/understanding-low-lock-techniques-in-multithreaded-apps
private static ChannelReservationCache instance;
private static readonly object lockInstance = new object();

// lock (lockInstance) enforces a memory barrier plus the monitor check, which is a cost
// you do not need to bear once the instance has been initialized.
public static ChannelReservationCache Instance
{
    get
    {
        if (instance == null)
        {
            lock (lockInstance)
            {
                if (instance == null)
                {
                    instance = new ChannelReservationCache();
                }
            }
        }
        return instance;
    }
}
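Alternatively, a sketch of the Lazy<T> version mentioned above, which gives the same once-only, thread-safe initialization without any hand-written locking:
public class ChannelReservationCache
{
    // The default LazyThreadSafetyMode.ExecutionAndPublication guarantees the factory
    // runs exactly once, even under concurrent first access.
    private static readonly Lazy<ChannelReservationCache> lazyInstance =
        new Lazy<ChannelReservationCache>(() => new ChannelReservationCache());

    public static ChannelReservationCache Instance => lazyInstance.Value;

    private ChannelReservationCache() { }
}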

Async method - weird behavior: my app is getting stuck

I have a weird problem with my async method. I made a second project in my VS solution to connect my app to an API (using RestSharp for it), set up the dependencies, etc.
The problem is that when I call this method from the UI by clicking a button (it only starts backend code; there is no relation to the UI), the app gets stuck. There are no errors; the only thing I can see in the output window is "The thread ****** has exited with code 0 (0x0)." repeating indefinitely.
I took that code (only the project responsible for connecting to and fetching data from the API), made a new solution and a new project with exactly the same code, and there it works fine.
This is the method I am calling in the "main" WPF app using ICommand etc.:
private void Api()
{
    _orderService = new OrderService();
}
And these are the classes in the API project:
BLContext.cs
public class BLContext
{
    private RestClient _client;
    public RestClient Client { get; set; }

    private string _token;
    public string Token { get; set; }

    public BLContext()
    {
        Client = new RestClient("https://api.baselinker.com/connector.php");
        Token = "************************";
    }
}
BaseAPIRepository.cs
public class BaseAPIRepository
{
    private BLContext _bl = new BLContext();
    RestRequest Request = new RestRequest();

    public BaseAPIRepository() { }

    public async Task<List<Order>> GetOrders()
    {
        List<Order> orders = new List<Order>();
        List<JToken> orderList = new List<JToken>();

        StartRequest("getOrders");
        Request.AddParameter("parameters", "{ \"status_id\": 13595 }");
        Request.AddParameter("parameters", "{ \"get_unconfirmed_orders\": false }");

        RestResponse restResponse = await _bl.Client.PostAsync(Request);
        JObject response = (JObject)JsonConvert.DeserializeObject(restResponse.Content);
        orderList = response["orders"].ToList();

        foreach (JToken order in orderList)
        {
            Order newOrder = new Order();
            newOrder.Id = (int)order["order_id"];
            newOrder.ProductsInOrder = GetProductsFromOrder((JArray)order["products"]);
            orders.Add(newOrder);
        }
        return orders;
    }

    public void StartRequest(string method)
    {
        Request.AddParameter("token", _bl.Token);
        Request.AddParameter("method", method);
    }

    public List<OrderedProduct> GetProductsFromOrder(JArray productsInOrder)
    {
        List<OrderedProduct> tmpListOfProducts = new List<OrderedProduct>();
        foreach (var item in productsInOrder)
        {
            OrderedProduct tmpOrderedProduct = new OrderedProduct();
            //tmpOrderedProduct.Id = (int)item["product_id"];
            tmpOrderedProduct.Signature = (string)item["sku"];
            tmpOrderedProduct.Quantity = (int)item["quantity"];
            tmpListOfProducts.Add(tmpOrderedProduct);
        }
        return tmpListOfProducts;
    }
}
OrderService.cs
public class OrderService
{
    private BaseAPIRepository _repo;
    private List<Order> _ordersList;
    public List<Order> OrdersList { get; set; }

    public OrderService()
    {
        _repo = new BaseAPIRepository();
        OrdersList = new List<Order>();
        OrdersList = _repo.GetOrders().Result;
        Console.WriteLine("Test line to see if it passed 24th line.");
    }
}
The app gets stuck on this line:
RestResponse restResponse = await _bl.Client.PostAsync(Request);
The core problem - as others have noted - is that your code is blocking on asynchronous code, which you shouldn't do (as I explain on my blog). This is particularly true for UI apps, which deliver a bad user experience when the UI thread is blocked. So, even if the code wasn't deadlocking, it wouldn't be a good idea to block on the asynchronous code anyway.
There are certain places in a UI app where the code simply cannot block if you want a good user experience. View and ViewModel construction are two of those places. When a VM is being created, the OS is asking your app to display its UI right now, and waiting for a network request before displaying data is just a bad experience.
Instead, your application should initialize and return its UI immediately (synchronously), and display that. If you have to do a network request to get some data to display, it's normal to synchronously initialize the UI into a "loading" state, start the network request, and then at that point the construction/initialization is done. Later, when the network request completes, the UI is updated into a "loaded" state.
If you want to take this approach, there's a NotifyTask<T> type in my Nito.Mvvm.Async package which may help. Its design is described in this article and usage looks something like this (assuming OrderService is actually a ViewModel):
public class OrderService
{
    private BaseAPIRepository _repo;
    public NotifyTask<List<Order>> OrdersList { get; set; }

    public OrderService()
    {
        _repo = new BaseAPIRepository();
        OrdersList = NotifyTask.Create(() => _repo.GetOrders());
    }
}
Then, instead of data-binding to OrderService.OrdersList, you can data-bind to OrderService.OrdersList.Result, OrderService.OrdersList.IsCompleted, etc.
To avoid deadlocking the application, you should never call Task.Result on an incomplete Task. Always await the Task instead.
C# doesn't allow async constructors. Constructors are meant to return fast after some brief initialization. They are not a place for long-running operations or starting background threads (even if async constructors were allowed).
There are a few solutions to work around the lack of async constructors.
A simple alternative is to use Lazy<T> or AsyncLazy<T> (the latter requires installing the Microsoft.VisualStudio.Threading package via the NuGet Package Manager). Lazy<T> lets you defer the instantiation or allocation of expensive resources.
public class OrderService
{
    public List<object> Orders => this.OrdersInitializer.GetValue();
    private AsyncLazy<List<object>> OrdersInitializer { get; }

    public OrderService()
        => this.OrdersInitializer = new AsyncLazy<List<object>>(InitializeOrdersAsync, new JoinableTaskFactory(new JoinableTaskContext()));

    private async Task<List<object>> InitializeOrdersAsync()
    {
        await Task.Delay(TimeSpan.FromSeconds(5));
        return new List<object> { 1, 2, 3 };
    }
}

public static void Main()
{
    var orderService = new OrderService();

    // Trigger async initialization
    orderService.Orders.Add(4);
}
You can expose the data using a method instead of a property
public class OrderService
{
    private List<object> Orders { get; set; }

    public async Task<List<object>> GetOrdersAsync()
    {
        if (this.Orders == null)
        {
            await Task.Delay(TimeSpan.FromSeconds(5));
            this.Orders = new List<object> { 1, 2, 3 };
        }
        return this.Orders;
    }
}

public static async Task Main()
{
    var orderService = new OrderService();

    // Trigger async initialization
    List<object> orders = await orderService.GetOrdersAsync();
}
Use an InitializeAsync method that must be called before using the instance
public class OrderService
{
    private List<object> orders;

    public List<object> Orders
    {
        get
        {
            if (!this.IsInitialized)
            {
                throw new InvalidOperationException();
            }
            return this.orders;
        }
        private set
        {
            this.orders = value;
        }
    }

    public bool IsInitialized { get; private set; }

    public async Task InitializeAsync()
    {
        if (this.IsInitialized)
        {
            return;
        }
        await Task.Delay(TimeSpan.FromSeconds(5));
        this.Orders = new List<object> { 1, 2, 3 };
        this.IsInitialized = true;
    }
}

public static async Task Main()
{
    var orderService = new OrderService();

    // Trigger async initialization
    await orderService.InitializeAsync();
}
Instantiate the instance by passing the expensive arguments to the constructor
public class OrderService
{
    public List<object> Orders { get; }

    public OrderService(List<object> orders)
        => this.Orders = orders;
}

public static async Task Main()
{
    List<object> orders = await GetOrdersAsync();

    // Instantiate with the result of the async operation
    var orderService = new OrderService(orders);
}

private static async Task<List<object>> GetOrdersAsync()
{
    await Task.Delay(TimeSpan.FromSeconds(5));
    return new List<object> { 1, 2, 3 };
}
Use a factory method and a private constructor
public class OrderService
{
    public List<object> Orders { get; set; }

    private OrderService()
        => this.Orders = new List<object>();

    public static async Task<OrderService> CreateInstanceAsync()
    {
        var instance = new OrderService();
        await Task.Delay(TimeSpan.FromSeconds(5));
        instance.Orders = new List<object> { 1, 2, 3 };
        return instance;
    }
}

public static async Task Main()
{
    // Trigger async initialization
    OrderService orderService = await OrderService.CreateInstanceAsync();
}

Is reading from a static property thread safe?

Is this code thread safe?
public class SomeType
{
    public int i { get; } = 5;
    public string s { get; } = "Asdf";
    public double d { get; } = 1.5;
}

public class SomeClass
{
    public static SomeType SomeProp { get; } = new SomeType();
}

public static async Task Main()
{
    await Task.WhenAll(
        Task.Run(() => { _ = SomeClass.SomeProp; }),
        Task.Run(() => { _ = SomeClass.SomeProp; })
    );
}
I am concerned about the concurrent read from SomeClass.SomeProp.
It seems to be thread-safe since - AFAIK - just reading is always thread-safe.
But - again AFAIK - aren't properties lazily initialized, so that the first read of SomeProp actually writes new SomeType() to it? So if SomeProp has never been read, and two threads then try to read it concurrently for the first time, do we have a concurrent write and a data race?
Is this the case? If so, how do I make it thread-safe (and do I have to protect the read with a full-blown lock)?
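For reference, if you want the initialization to be explicitly lazy and explicitly thread-safe, rather than relying on how the property initializer behaves, Lazy<T> is one common sketch (illustrative only, not taken from an answer):
public class SomeClass
{
    // Lazy<T> in its default thread-safety mode guarantees the factory runs exactly once,
    // even if several threads read .Value concurrently for the first time.
    private static readonly Lazy<SomeType> someProp =
        new Lazy<SomeType>(() => new SomeType());

    public static SomeType SomeProp => someProp.Value;
}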

How to safely write to the same List

I've got a public static List<MyDoggie> DoggieList;
DoggieList is appended to and written to from multiple threads throughout my application.
We run into this exception pretty frequently:
Collection was modified; enumeration operation may not execute
Assuming there are multiple classes writing to DoggieList, how do we get around this exception?
Please note that this design is not great, but at this point we need to quickly fix it in production.
How can we perform mutations to this list safely from multiple threads?
I understand we can do something like:
lock (lockObject)
{
    DoggieList.AddRange(...)
}
But can we do this from multiple classes against the same DoggieList?
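For illustration (the class and member names here are made up), locking from multiple classes works as long as every class locks the same object, for example a single shared lock kept next to the list:
public static class Kennel
{
    public static readonly object DoggieLock = new object();
    public static List<MyDoggie> DoggieList = new List<MyDoggie>();
}

public class FeedingService
{
    public void Register(IEnumerable<MyDoggie> doggies)
    {
        lock (Kennel.DoggieLock) // same object as every other writer and reader
        {
            Kennel.DoggieList.AddRange(doggies);
        }
    }
}

public class GroomingService
{
    public List<MyDoggie> Snapshot()
    {
        lock (Kennel.DoggieLock) // readers must take the same lock too
        {
            return new List<MyDoggie>(Kennel.DoggieList);
        }
    }
}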
You can also create your own class and encapsulate the locking there; you can try something like the code below.
You can add whatever methods you want, like AddRange, Remove, etc.
class MyList
{
    private object objLock = new object();
    private List<int> list = new List<int>();

    public void Add(int value)
    {
        lock (objLock)
        {
            list.Add(value);
        }
    }

    public int Get(int index)
    {
        int val = -1;
        lock (objLock)
        {
            val = list[index];
        }
        return val;
    }

    public List<int> GetAll()
    {
        List<int> retList;
        lock (objLock)
        {
            retList = new List<int>(list);
        }
        return retList;
    }
}
Good stuff: concurrent collections covered in much more detail: http://www.albahari.com/threading/part5.aspx#_Concurrent_Collections
Making use of a concurrent collection such as the ConcurrentBag class can also resolve the issues caused by updates from multiple threads.
Example
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class Program
{
    public static void Main()
    {
        var items = new[] { "item1", "item2", "item3" };
        var bag = new ConcurrentBag<string>();
        Parallel.ForEach(items, bag.Add);
    }
}
Using lock has the disadvantage of preventing concurrent reads.
An efficient solution which does not require changing the collection type is to use a ReaderWriterLockSlim:
private static readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();
With the following extension methods:
public static class ReaderWriterLockSlimExtensions
{
    public static void ExecuteWrite(this ReaderWriterLockSlim aLock, Action action)
    {
        aLock.EnterWriteLock();
        try
        {
            action();
        }
        finally
        {
            aLock.ExitWriteLock();
        }
    }

    public static void ExecuteRead(this ReaderWriterLockSlim aLock, Action action)
    {
        aLock.EnterReadLock();
        try
        {
            action();
        }
        finally
        {
            aLock.ExitReadLock();
        }
    }
}
which can be used the following way:
_lock.ExecuteWrite(() => DoggieList.Add(new MyDoggie()));

_lock.ExecuteRead(() =>
{
    // safe iteration
    foreach (MyDoggie item in DoggieList)
    {
        // ...
    }
});
And finally if you want to build your own collection based on this:
public class SafeList<T>
{
    private readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();
    private readonly List<T> _list = new List<T>();

    public T this[int index]
    {
        get
        {
            T result = default(T);
            _lock.ExecuteRead(() => result = _list[index]);
            return result;
        }
    }

    public List<T> GetAll()
    {
        List<T> result = null;
        _lock.ExecuteRead(() => result = _list.ToList());
        return result;
    }

    public void ForEach(Action<T> action) =>
        _lock.ExecuteRead(() => _list.ForEach(action));

    public void Add(T item) => _lock.ExecuteWrite(() => _list.Add(item));

    public void AddRange(IEnumerable<T> items) =>
        _lock.ExecuteWrite(() => _list.AddRange(items));
}
This list is safe: multiple threads can add or get items in parallel without any concurrency issue. Additionally, multiple threads can read in parallel without blocking each other; it's only when writing that a single thread can work on the collection.
Note that this collection does not implement IEnumerable<T>, because you could get an enumerator and forget to dispose it, which would leave the list locked in read mode.
Make DoggieList a ConcurrentStack<MyDoggie> and then use the PushRange method. It is thread-safe.
using System.Collections.Concurrent;

var doggieList = new ConcurrentStack<MyDoggie>();
doggieList.PushRange(yourDoggies); // where yourDoggies is a MyDoggie[]

Synchronous Task Execution in C#

I have a method in a singleton class which will be called from different threads, but I need the calls to execute one by one.
The method ImageUtil.Instance.LoadImage(imageID) will be called from multiple threads, but I want to load images one by one, so that only one image is loading at a time.
public class ImageUtil
{
    #region Singleton Implementation

    private ImageUtil()
    {
        taskList = new Queue<Task<Object>>();
    }

    public static ImageUtil Instance { get { return Nested.instance; } }

    private class Nested
    {
        // Explicit static constructor to tell the C# compiler
        // not to mark the type as beforefieldinit
        static Nested()
        {
        }

        internal static readonly ImageUtil instance = new ImageUtil();
    }

    #endregion

    Queue<Task<Object>> taskList;
    bool isProcessing;

    public async Task<Object> LoadImage(String imageID)
    {
        // What do I need to put here so that "return await LoadImageInternal(imageID);" runs
        // one call at a time? If one image is loading and meanwhile another thread calls this
        // method, the later thread has to wait until the current load finishes.
    }

    private async Task<Object> LoadImageInternal(String imageID)
    {
        // Business logic for image retrieval.
    }
}
SemaphoreSlim has a WaitAsync method that allows you to enforce critical sections asynchronously:
private readonly SemaphoreSlim loadSemaphore = new SemaphoreSlim(1, 1);

public async Task<Object> LoadImage(String imageID)
{
    await loadSemaphore.WaitAsync();
    try
    {
        return await LoadImageInternal(imageID);
    }
    finally
    {
        loadSemaphore.Release();
    }
}
This pattern is presented in Stephen Toub's article.
List<Task<Object>> taskList;
private static readonly object _syncLock = new object();

public Task<Object> LoadImage(String imageID)
{
    return Task<Object>.Factory.StartNew(() =>
    {
        lock (_syncLock)
        {
            return LoadImageInternal(imageID).Result;
        }
    });
}

private async Task<Object> LoadImageInternal(String imageID)
{
    // Business logic for image retrieval.
}
That should accomplish what you asked for, but personally I would tackle this differently, with a long-running task and a queue of some sort. The long-running task would simply loop forever, check the queue for new items, and execute them one at a time; this would prevent a lot of unnecessary thread context switching (a sketch of that idea follows).
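A minimal sketch of that long-running consumer idea, assuming a BlockingCollection<T> as the queue and TaskCompletionSource<T> to hand results back to callers (these types are my choice; the answer above does not name them):
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public sealed class ImageLoadQueue
{
    private readonly BlockingCollection<(string ImageId, TaskCompletionSource<object> Tcs)> _queue =
        new BlockingCollection<(string ImageId, TaskCompletionSource<object> Tcs)>();

    public ImageLoadQueue(Func<string, Task<object>> loadImageInternal)
    {
        // Single consumer loop: requests are dequeued and processed strictly one at a time.
        _ = Task.Run(async () =>
        {
            foreach (var (imageId, tcs) in _queue.GetConsumingEnumerable())
            {
                try { tcs.SetResult(await loadImageInternal(imageId)); }
                catch (Exception ex) { tcs.SetException(ex); }
            }
        });
    }

    public Task<object> LoadImage(string imageId)
    {
        var tcs = new TaskCompletionSource<object>(TaskCreationOptions.RunContinuationsAsynchronously);
        _queue.Add((imageId, tcs));
        return tcs.Task;
    }
}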
// This is how you can implement it using yield return to return one image at a time
public IEnumerable<Task<string>> GetPerItemAsync(IEnumerable<string> images)
{
    foreach (var image in images)
    {
        yield return LoadImage(image);
    }
}

public static Task<string> LoadImage(string image)
{
    var result = image.Trim(); // Some complex business logic on the image
    return Task.FromResult(result);
}

// Call your method in your client.
// First, get your images from the data source:
var listOfimages = Context.Images.ToList();

// Get the results:
var result = GetPerItemAsync(listOfimages).FirstOrDefault();
