How to achieve threading in C# for this scenario

I have a scenario where a Windows service runs on the server. Every hour or so it reads a log file and saves the contents to the database.
Now there are going to be three files, and these must be read and saved to three different tables (I will read the connection strings etc. from a config file). I know this can be achieved with threading, so I want to call the existing 'readfile' method on a thread.
I'm not familiar with threading, but is this the way to go?
NameValueConfigurationCollection configs = ConfigurationManager.GetSection("LogDirectoryPath") as NameValueConfigurationCollection;
foreach (DictionaryEntry keyvalue in configs)
{
    Thread t = new Thread(new ThreadStart(ExecuteProcess(keyvalue.Key.ToString())));
    t.IsBackground = true;
    t.Start();
}
private void ExecuteProcess(string path)
{
    var xPathDocument = new XPathDocument(path);
    XPathNavigator xPathNavigator = xPathDocument.CreateNavigator();
    string connectionString = GetXPathQuery(xPathNavigator, "/connectionString/@value");
    string commandText = GetXPathQuery(xPathNavigator, "/commandText/@value");
    string filePath = GetXPathQuery(xPathNavigator, "/filePath/@value");
    SqlConnection sqlConnection = new SqlConnection(connectionString);
    sqlConnection.Open();
    ProcessFiles(sqlConnection, commandText, filePath);
}
Do I have to make the method static? What about the variables used?

In .NET 4 you could leverage the Task Parallel Library for this, so you would not have to explicitly create threads, but just express what code you want to execute in parallel:
Parallel.ForEach(configs.Select( x => x.Key.ToString()), path => ExecuteProcess(path));

My personal preference is to use the ThreadPool for threads which I wish to run in parallel in a fire-and-forget scenario, e.g. something like:
foreach (DictionaryEntry keyvalue in configs)
{
    string path = keyvalue.Key.ToString(); // take a local copy so the closure doesn't capture the loop variable
    ThreadPool.QueueUserWorkItem((state) =>
    {
        ExecuteProcess(path);
    });
}
The Task Parallel Library is also useful, but it is not guaranteed to run everything in parallel; as I understand it, it adapts to the number of processors. It does, however, synchronise all your worker threads back again before continuing, so it depends on what you are trying to achieve.
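To make the difference concrete, here is a minimal sketch contrasting the two approaches; it assumes the ExecuteProcess method and the collection of config paths from the question (the placeholder body just writes to the console):

using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

class LogImportSketch
{
    // Fire-and-forget: queue each path on the thread pool and return immediately.
    static void ProcessFireAndForget(IEnumerable<string> paths)
    {
        foreach (string path in paths)
        {
            string localPath = path; // local copy so the closure doesn't capture the loop variable
            ThreadPool.QueueUserWorkItem(_ => ExecuteProcess(localPath));
        }
        // execution continues here while the pool threads are still working
    }

    // TPL: blocks until every path has been processed, scaling workers to the machine.
    static void ProcessAndWait(IEnumerable<string> paths)
    {
        Parallel.ForEach(paths, path => ExecuteProcess(path));
        // all files have been read and saved by the time we get here
    }

    static void ExecuteProcess(string path)
    {
        Console.WriteLine("Processing " + path); // placeholder for the real read-file/save-to-db work
    }
}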


Async method still blocking the UI

I have a method which uses ODBC and executes the reader asynchronously, but for some reason the method is still blocking my UI while it loads about 4,000 records. I am wondering if someone can look at my code and see where I am going wrong.
async Task<BindingList<PurchaseLinkHeaderC>> GetPurchaseOrders( IProgress<int> progress)
{
BindingList<PurchaseLinkHeaderC> _purhcaseOrderList = new BindingList<PurchaseLinkHeaderC>();
try
{
string sageDsn = ConfigurationManager.AppSettings["SageDSN"];
string sageUsername = ConfigurationManager.AppSettings["SageUsername"];
string sagePassword = ConfigurationManager.AppSettings["SagePassword"];
//using (var connection = new OdbcConnection("DSN=SageLine50v24;Uid=Manager;Pwd=;"))
using (var connection =
new OdbcConnection(String.Format("DSN={0};Uid={1};Pwd={2};", sageDsn, sageUsername, sagePassword)))
{
connection.Open();
string fromD = dtpFrom.Value.ToString("yyyy-MM-dd");
string toD = dtpTo.Value.ToString("yyyy-MM-dd");
string SQL =
"SELECT 'ORDER_NUMBER', 'ORDER_OR_QUOTE', 'ORDER_DATE', 'DELIVERY_DATE', 'ORDER_STATUS_CODE', 'ORDER_STATUS', 'DELIVERY_STATUS_CODE', 'DELIVERY_STATUS', 'ACCOUNT_REF', 'NAME', 'ADDRESS_1', 'ADDRESS_2', 'ADDRESS_3', 'ADDRESS_4', 'ADDRESS_5', 'C_ADDRESS_1', 'C_ADDRESS_2', 'C_ADDRESS_3', 'C_ADDRESS_4', 'C_ADDRESS_5', 'DEL_NAME', 'DEL_ADDRESS_1', 'DEL_ADDRESS_2', 'DEL_ADDRESS_3', 'DEL_ADDRESS_4', 'DEL_ADDRESS_5', 'VAT_REG_NUMBER', 'REFERENCE', 'CONTACT_NAME', 'TAKEN_BY', 'SUPP_ORDER_NUMBER', 'SUPP_TEL_NUMBER', 'NOTES_1', 'NOTES_2', 'NOTES_3', 'SUPP_DISC_RATE', 'FOREIGN_ITEMS_NET', 'FOREIGN_ITEMS_TAX', 'FOREIGN_ITEMS_GROSS', 'ITEMS_NET', 'ITEMS_TAX', 'ITEMS_GROSS', 'TAX_RATE_1', 'TAX_RATE_2', 'TAX_RATE_3', 'TAX_RATE_4', 'TAX_RATE_5', 'NET_AMOUNT_1', 'NET_AMOUNT_2', 'NET_AMOUNT_3', 'NET_AMOUNT_4', 'NET_AMOUNT_5', 'TAX_AMOUNT_1', 'TAX_AMOUNT_2', 'TAX_AMOUNT_3', 'TAX_AMOUNT_4', 'TAX_AMOUNT_5', 'COURIER_NUMBER', 'COURIER_NAME', 'CONSIGNMENT', 'CARR_NOM_CODE', 'CARR_TAX_CODE', 'CARR_DEPT_NUMBER', 'CARR_DEPT_NAME', 'FOREIGN_CARR_NET', 'FOREIGN_CARR_TAX', 'FOREIGN_CARR_GROSS', 'CARR_NET', 'CARR_TAX', 'CARR_GROSS', 'FOREIGN_INVOICE_NET', 'FOREIGN_INVOICE_TAX', 'FOREIGN_INVOICE_GROSS', 'INVOICE_NET', 'INVOICE_TAX', 'INVOICE_GROSS', 'CURRENCY', 'CURRENCY_TYPE', 'EURO_GROSS', 'EURO_RATE', 'FOREIGN_RATE', 'SETTLEMENT_DUE_DAYS', 'SETTLEMENT_DISC_RATE', 'FOREIGN_SETTLEMENT_DISC_AMOUNT', 'FOREIGN_SETTLEMENT_TOTAL', 'SETTLEMENT_DISC_AMOUNT', 'SETTLEMENT_TOTAL', 'PAYMENT_REF', 'PRINTED', 'PRINTED_CODE', 'POSTED', 'POSTED_CODE', 'QUOTE_STATUS_ID', 'RECURRING_REF', 'DUNS_NUMBER', 'PAYMENT_TYPE', 'BANK_REF', 'GDN_NUMBER', 'PROJECT_ID', 'ANALYSIS_1', 'ANALYSIS_2', 'ANALYSIS_3', 'INVOICE_PAYMENT_ID', 'RESUBMIT_INVOICE_PAYMENT_REQUIRED', 'RECORD_CREATE_DATE', 'RECORD_MODIFY_DATE', 'RECORD_DELETED' FROM 'PURCHASE_ORDER' WHERE ORDER_DATE >='{0}' and ORDER_DATE <='{1}'";
int counter = 0;
using (var command = new OdbcCommand(string.Format(SQL, fromD, toD), connection))
{
using (var reader = await command.ExecuteReaderAsync())
{
while (await reader.ReadAsync())
{
var purhcaseOrders = new PurchaseLinkHeaderC();
if ((reader["ORDER_NUMBER"] != ""))
{
counter++;
string orderNumber = Convert.ToString(reader["ORDER_NUMBER"]);
purhcaseOrders.Order_Number = orderNumber;
purhcaseOrders.PurchaseOrderNo = Convert.ToInt32(reader["ORDER_NUMBER"]);
purhcaseOrders.Name = reader["NAME"].ToString();
purhcaseOrders.Selected_PurchaseOrder = false;
_purhcaseOrderList.Add(purhcaseOrders);
}
}
}
}
}
}
catch (Exception ex)
{
var logger = NLog.LogManager.GetCurrentClassLogger();
logger.Info(ex, "Error at GetSalesOrders " + ex.ToString());
}
return _purhcaseOrderList;
}
I want to be able to show a progress bar while the list is being loaded, so I was attempting to call it this way, but I am not sure how to attach the progress method either.
var progressIndicator = new Progress<int>(ReportProgress);
//call async method
BindingList<PurchaseLinkHeaderC> purchaseOrders = await GetPurchaseOrders(progressIndicator);
_masterPurchaseOrders = purchaseOrders;
I am hoping that someone can help here.
When you await, the default behaviour is to come back on the sync-context, if there is one, when the async operation completes. In a UI application, the sync-context is the UI thread.
So, right now, there are a lot of things that come back to the UI thread. This is useful when context matters, but in your case it doesn't, since you are simply returning a list.
This means that you should be able to add .ConfigureAwait(false) to a lot of those await expressions - for example:
while (await reader.ReadAsync().ConfigureAwait(false))
This disconnects the sync-context behaviour, and may improve what you are seeing. You would ideally add that to all of the await calls in the utility method (GetPurchaseOrders).
You may also wish to look for any missing async operations - for example, the connection.Open(); could become await connection.OpenAsync().ConfigureAwait(false);
Note that the calling code should not use ConfigureAwait(false) - since the binding-list touches the UI, it needs the sync-context. So: don't add ConfigureAwait(false) to the await GetPurchaseOrders(...) call.
There is also one other possibility: you say that you are using ODBC and "sage". It is entirely possible that the ODBC/sage API doesn't support await, and it is being implemented as "sync over async". If this is the case, it gets tricky. You might need to use a thread instead of async/await in that case - perhaps via ThreadPool.QueueUserWorkItem. There are ways to invoke async code on worker threads, but if the "async" code is actually "sync code that pretends to be async", there's not really any point, and you might as well do it "the old way". This usually means:
start a worker (ThreadPool)
do some work on the worker (your existing code, but perhaps using the non-async implementation)
at the end of the worker, use Control.Invoke to push the work back to the UI thread for the final "update the UI" step
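For illustration, a minimal sketch of that pattern in a WinForms click handler. The synchronous LoadOrders helper, the ordersGrid control and the handler name are assumptions for the sketch, not part of the original code:

private void btnLoad_Click(object sender, EventArgs e)
{
    ThreadPool.QueueUserWorkItem(_ =>
    {
        // steps 1 and 2: do the slow, synchronous ODBC work on a pool thread
        List<PurchaseLinkHeaderC> orders = LoadOrders(); // hypothetical synchronous version of GetPurchaseOrders

        // step 3: marshal the "update the UI" step back to the UI thread
        this.Invoke((MethodInvoker)(() =>
        {
            ordersGrid.DataSource = new BindingList<PurchaseLinkHeaderC>(orders);
        }));
    });
}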

Run method with parameters in parallel

I have a method which takes an argument, runs it against the database, retrieves records, processes them, and saves the processed records to a new table. Running the method from the service with one parameter works. What I am trying to achieve now is to make the parameter dynamic. I have implemented a method to retrieve the parameters and it works fine. Now I am trying to run the method in parallel for each of the parameters provided. My current implementation is:
WorkerClass WorkerClass = new WorkerClass();
var ParametersList = WorkerClass.GetParams();
foreach (var item in ParametersList)
{
    WorkerClass WorkerClass2 = new WorkerClass();
    Parallel.Invoke(
        () => WorkerClass2.ProcessAndSaveMethod(item)
    );
}
With the above implementation, I think defining a new WorkerClass2 defeats the whole point of Parallel.Invoke, but I am having an issue with data getting mixed up when using the already defined WorkerClass. The reason for the mix-up is that the Oracle connection is opened inside the Init() method of the class and static DataTable DataCollectionList; is defined at class level, which creates the issue.
Inside the method ProcessAndSaveMethod(item) I have:
OracleCommand Command = new OracleCommand(Query, OracleConnection);
OracleDataAdapter Adapter = new OracleDataAdapter(Command);
Adapter.Fill(DataCollectionList);
Inside init():
try
{
    OracleConnection = new OracleConnection(Passengers.OracleConString);
    DataCollectionList = new DataTable();
    OracleConnection.Open();
    return true;
}
catch (Exception ex)
{
    OracleConnection.Close();
    DataCollectionList.Clear();
    return false;
}
And the method isn't run in parallel as I intended. Is there another way to implement this?
To run it in parallel you need to call Parallel.Invoke only once with all the tasks to be completed:
Parallel.Invoke(
    ParametersList.Select(item =>
        new Action(() => WorkerClass2.ProcessAndSaveMethod(item))
    ).ToArray()
);
If you have a list of somethings and want them processed in parallel, there really is no easier way than PLINQ:
var parametersList = SomeObject.SomeFunction();
var resultList = parametersList.AsParallel()
.Select(item => new WorkerClass().ProcessAndSaveMethod(item))
.ToList();
The fact that you build up a new connection and use a lot of variables local to the one item you process is fine. It's actually the preferred way to do multi-threading: keep as much local to the thread as you can.
That said, you have to measure if multi-threading is actually the fastest way to solve your problem. Maybe you can do your processing sequentially and then do all your database stuff in one go with bulk inserts, temporary tables or whatever is suited to your specific problem. Splitting a task into smaller tasks for more processors to run is not always faster. It's a tool and you need to find out if that tool is helping in your specific situation.
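For example, a rough way to check whether the parallel version actually pays off: time both variants with System.Diagnostics.Stopwatch. This is only a sketch, reusing the parametersList and WorkerClass from the code above:

var sw = System.Diagnostics.Stopwatch.StartNew();
foreach (var item in parametersList)
    new WorkerClass().ProcessAndSaveMethod(item);
sw.Stop();
Console.WriteLine("Sequential: {0} ms", sw.ElapsedMilliseconds);

sw.Restart();
parametersList.AsParallel()
    .ForAll(item => new WorkerClass().ProcessAndSaveMethod(item));
sw.Stop();
Console.WriteLine("Parallel:   {0} ms", sw.ElapsedMilliseconds);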
I achieved parallel processing using the code below, and also avoided the null reference exception from DbCon.Open() (caused by connection pooling) by using the MaxDegreeOfParallelism option:
Parallel.ForEach(ParametersList, new ParallelOptions() { MaxDegreeOfParallelism = 5 }, item =>
{
    WorkerClass Worker = new WorkerClass();
    Worker.ProcessAndSaveMethod(item);
});

Using SQL Server application locks to solve locking requirements

I have a large application based on Dynamics CRM 2011 that, in various places, has code that must query for a record based upon some criteria and create it if it doesn't exist, or update it if it does.
An example of the kind of thing I am talking about would be similar to this:
stk_balance record = context.stk_balanceSet.FirstOrDefault(x => x.stk_key == id);
if (record == null)
{
    record = new stk_balance();
    record.Id = Guid.NewGuid();
    record.stk_value = 100;
    context.AddObject(record);
}
else
{
    record.stk_value += 100;
    context.UpdateObject(record);
}
context.SaveChanges();
In terms of the CRM 2011 implementation (although not strictly relevant to this question), the code could be triggered from synchronous or asynchronous plugins. The issue is that the code is not thread safe: between checking whether the record exists and creating it if it doesn't, another thread could come in and do the same thing first, resulting in duplicate records.
Normal locking methods are not reliable due to the architecture of the system, various services using multiple threads could all be using the same code, and these multiple services are also load balanced across multiple machines.
In trying to find a solution to this problem that doesn't add massive amounts of extra complexity and doesn't compromise the idea of having no single point of failure or bottleneck, I came across the idea of using SQL Server application locks.
I came up with the following class:
public class SQLLock : IDisposable
{
//Lock constants
private const string _lockMode = "Exclusive";
private const string _lockOwner = "Transaction";
private const string _lockDbPrincipal = "public";
//Variable for storing the connection passed to the constructor
private SqlConnection _connection;
//Variable for storing the name of the Application Lock created in SQL
private string _lockName;
//Variable for storing the timeout value of the lock
private int _lockTimeout;
//Variable for storing the SQL Transaction containing the lock
private SqlTransaction _transaction;
//Variable for storing if the lock was created ok
private bool _lockCreated = false;
public SQLLock (string lockName, int lockTimeout = 180000)
{
_connection = Connection.GetMasterDbConnection();
_lockName = lockName;
_lockTimeout = lockTimeout;
//Create the Application Lock
CreateLock();
}
public void Dispose()
{
//Release the Application Lock if it was created
if (_lockCreated)
{
ReleaseLock();
}
_connection.Close();
_connection.Dispose();
}
private void CreateLock()
{
_transaction = _connection.BeginTransaction();
using (SqlCommand createCmd = _connection.CreateCommand())
{
createCmd.Transaction = _transaction;
createCmd.CommandType = System.Data.CommandType.Text;
StringBuilder sbCreateCommand = new StringBuilder();
sbCreateCommand.AppendLine("DECLARE #res INT");
sbCreateCommand.AppendLine("EXEC #res = sp_getapplock");
sbCreateCommand.Append("#Resource = '").Append(_lockName).AppendLine("',");
sbCreateCommand.Append("#LockMode = '").Append(_lockMode).AppendLine("',");
sbCreateCommand.Append("#LockOwner = '").Append(_lockOwner).AppendLine("',");
sbCreateCommand.Append("#LockTimeout = ").Append(_lockTimeout).AppendLine(",");
sbCreateCommand.Append("#DbPrincipal = '").Append(_lockDbPrincipal).AppendLine("'");
sbCreateCommand.AppendLine("IF #res NOT IN (0, 1)");
sbCreateCommand.AppendLine("BEGIN");
sbCreateCommand.AppendLine("RAISERROR ( 'Unable to acquire Lock', 16, 1 )");
sbCreateCommand.AppendLine("END");
createCmd.CommandText = sbCreateCommand.ToString();
try
{
createCmd.ExecuteNonQuery();
_lockCreated = true;
}
catch (Exception ex)
{
_transaction.Rollback();
throw new Exception(string.Format("Unable to get SQL Application Lock on '{0}'", _lockName), ex);
}
}
}
private void ReleaseLock()
{
using (SqlCommand releaseCmd = _connection.CreateCommand())
{
releaseCmd.Transaction = _transaction;
releaseCmd.CommandType = System.Data.CommandType.StoredProcedure;
releaseCmd.CommandText = "sp_releaseapplock";
releaseCmd.Parameters.AddWithValue("#Resource", _lockName);
releaseCmd.Parameters.AddWithValue("#LockOwner", _lockOwner);
releaseCmd.Parameters.AddWithValue("#DbPrincipal", _lockDbPrincipal);
try
{
releaseCmd.ExecuteNonQuery();
}
catch {}
}
_transaction.Commit();
}
}
I would use this in my code to create a SQL Server application lock using the unique key I am querying for as the lock name like this
using (var sqlLock = new SQLLock(id))
{
//Code to check for and create or update record here
}
Now this approach seems to work, however I am by no means any kind of SQL Server expert and am wary about putting this anywhere near production code.
My question really has 3 parts
1. Is this a really bad idea because of something I haven't considered?
Are SQL Server application locks completely unsuitable for this purpose?
Is there a maximum number of application locks (with different names) you can have at a time?
Are there performance considerations if a potentially large number of locks are created?
What else could be an issue with the general approach?
2. Is the solution actually implemented above any good?
If SQL Server application locks are usable like this, have I actually used them properly?
Is there a better way of using SQL Server to achieve the same result?
In the code above I am getting a connection to the Master database and creating the locks in there. Does that potentially cause other issues? Should I create the locks in a different database?
3. Is there a completely alternative approach that could be used that doesn't use SQL Server application locks?
I can't use stored procedures to create and update the record (unsupported in CRM 2011).
I don't want to add a single point of failure.
You can do this much easier.
//make sure your plugin runs within a transaction, this is the case for stage 20 and 40
//you can check this with IExecutionContext.IsInTransaction
//does not work with offline plugins, but works within CRM Online (cloud) and is fully supported
//also works on transaction rollback
var lockUpdateEntity = new dummy_lock_entity(); //simple technical entity with as many rows as different lock barriers you need
lockUpdateEntity.Id = Guid.Parse("well known guid"); //well known guid for this barrier
lockUpdateEntity.dummy_field = Guid.NewGuid(); //just update/change a field to create the lock, no matter what its content is
//--------------- this is untested by me, i use the next one
context.UpdateObject(lockUpdateEntity);
context.SaveChanges();
//---------------
//OR
//--------------- i use this one, but you need a reference to your OrganizationService
OrganizationService.Update(lockUpdateEntity);
//---------------
//threads wait here if they have no lock for dummy_lock_entity with "well known guid"
stk_balance record = context.stk_balanceSet.FirstOrDefault(x => x.stk_key == id);
if(record == null)
{
record = new stk_balance();
//record.Id = Guid.NewGuid(); //not needed
record.stk_value = 100;
context.AddObject(record);
}
else
{
record.stk_value += 100;
context.UpdateObject(record);
}
context.SaveChanges();
//let the pipeline flow and the transaction complete ...
For more background info refer to http://www.crmsoftwareblog.com/2012/01/implementing-robust-microsoft-dynamics-crm-2011-auto-numbering-using-transactions/

Entity Framework new transaction is not allowed because there are other threads running in the session, multi thread save

I'm trying to save the log of a multi-threaded process to a DB, but I'm getting the following error: "New transaction is not allowed because there are other threads running in the session."
In each thread I have this function:
internal bool WriteTrace(IResult result, string message, byte type)
{
    SPC_SENDING_TRACE trace = new SPC_SENDING_TRACE(
        message,
        Parent.currentLine.CD_LINE,
        type,
        Parent.currentUser.FULLNAME,
        Parent.guid);
    Context.SPC_SENDING_TRACE.AddObject(trace);
    if (Context.SaveChanges(result) == false)
        return false;
    return true;
}
The Context is different for each thread, but the connection to the DB is always the same.
Is there a way to solve this problem?
Thank you,
Andrea
You should create a context for each transaction and then dispose of it. You can do that like this:
using (var ctx = new MyContext())
{
    //do transaction here
}
After the closing brace the context is disposed.
For a better understanding, refer to this post where you can find a great answer by ken2k. Hope you can fix your issue :)
UPDATE:
You should also try adding .ToList() to every LINQ query you have. When you iterate over a LINQ result, you can't make any changes until the iteration has finished (see the sketch below). Check if you have something like that, or share more code, i.e. the piece of code where you call WriteTrace. Hope that this time this actually helps you.
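To illustrate the point, here is a sketch of a pattern that triggers this error and the ToList fix; the entity set and property names are made up, not taken from your model:

// Problematic: SaveChanges is called while the query is still streaming rows
// from the open data reader on the same connection/session.
foreach (var line in context.SPC_LINES.Where(l => l.ACTIVE))
{
    line.LAST_CHECKED = DateTime.Now;
    context.SaveChanges(); // throws "New transaction is not allowed..."
}

// Safe: materialise the results first, then modify and save once.
foreach (var line in context.SPC_LINES.Where(l => l.ACTIVE).ToList())
{
    line.LAST_CHECKED = DateTime.Now;
}
context.SaveChanges();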
I use entity framework in a multi-threaded environment, where any thread, ui and background (both STA and MTA), can concurrently update the same database. I resolved this problem by re-creating the entity connection from scratch at the start of usage on any new background thread. Examining the entity connection instance ConnectionString shows a reader guid which I assume is used to link common connection instances. By recreating the entity connection from scratch the guid values are different for each thread and no conflict appears to occur.
// Build the connection string.
var sqlBuilder = new SqlConnectionStringBuilder();
sqlBuilder.DataSource = serverName;
sqlBuilder.InitialCatalog = databaseName;
sqlBuilder.MultipleActiveResultSets = true;
...
var providerString = sqlBuilder.ToString();
var sqlConnection = new SqlConnection(providerString);
// Build the entity connection.
Assembly metadataAssembly = Assembly.GetExecutingAssembly();
Assembly[] metadataAssemblies = { metadataAssembly };
var metadataBase = @"res://*/{0}.csdl|res://*/{0}.ssdl|res://*/{0}.msl";
var dbModelMetadata = String.Format(metadataBase, objectContextTypeModelName);
// eg: "res://*/Models.MyDatabaseModel.csdl|res://*/Models.MyDatabaseModel.ssdl|res://*/Models.MyDatabaseModel.msl"
var modelMetadataPaths = dbModelMetadata.Split('|');
var metadataWorkspace = new MetadataWorkspace(modelMetadataPaths, metadataAssemblies);
var entityDbConnection = new EntityConnection(metadataWorkspace, sqlConnection);
return entityDbConnection;
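The returned connection can then be used to build a per-thread context. A sketch only: it assumes an ObjectContext-based generated context (here called MyEntities) with the standard constructor overload that accepts an EntityConnection, and a hypothetical CreateEntityConnection() wrapper around the code above:

using (EntityConnection threadConnection = CreateEntityConnection()) // hypothetical wrapper around the code above
using (var context = new MyEntities(threadConnection))
{
    // work that stays local to this background thread
    context.SPC_SENDING_TRACE.AddObject(trace);
    context.SaveChanges();
}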

C# Thread Spawn Parameter Passing

Scenario
I have a line of code whereby I pass a good number of parameters into a method.
CODE as described above
foreach(Asset asset in assetList)
{
asset.ContributePrice(m_frontMonthPrice, m_Vol, m_divisor, m_refPrice, m_type,
m_overrideVol, i, m_decimalPlaces, metalUSDFID, metalEURFID);
}
What I really want to do...
What I really want to do is spawn a new thread every time I call this method so that it does the work more quickly (there are a lot of assets).
Envisaged CODE
foreach(Asset asset in assetList)
{
Thread myNewThread =
new Thread(new ThreadStart(asset.ContributePrice (m_frontMonthPrice, m_Vol,
m_divisor, m_refPrice, m_type, m_overrideVol, i, m_decimalPlaces, metalUSDFID,
metalEURFID)));
myNewThread.Start();
}
ISSUES
This is something which has always bothered me: why can't I pass the parameters into the thread? What difference does it make?
I can't see a way around this that won't involve lots of refactoring.
This is an old application, built piece by piece as a result of feature creep, so the code itself is messy and hard to read and follow.
I thought I had pinpointed an area where I could save some time and increase the processing speed, but now I've hit a wall with this.
SUGGESTIONS?
Any help or suggestions would be greatly appreciated.
Cheers.
EDIT:
I'm using .NET 3.5. I could potentially update to .NET 4.0.
If you're using C# 3, the easiest way would be:
foreach(Asset asset in assetList)
{
Asset localAsset = asset;
ThreadStart ts = () => localAsset.ContributePrice (m_frontMonthPrice, m_Vol,
m_divisor, m_refPrice, m_type, m_overrideVol, i,
m_decimalPlaces, metalUSDFID, metalEURFID);
new Thread(ts).Start();
}
You need to take a "local" copy of the asset loop variable to avoid weird issues due to captured variables - Eric Lippert has a great blog entry on it.
In C# 2 you could do the same with an anonymous method:
foreach(Asset asset in assetList)
{
Asset localAsset = asset;
ThreadStart ts = delegate { localAsset.ContributePrice(m_frontMonthPrice,
m_Vol, m_divisor, m_refPrice, m_type, m_overrideVol, i,
m_decimalPlaces, metalUSDFID, metalEURFID); };
new Thread(ts).Start();
}
In .NET 4 it would probably be better to use Parallel.ForEach. Even before .NET 4, creating a new thread for each item may well not be a good idea - consider using the thread pool instead.
Spawning a new thread for each task will most likely make the task run significantly slower. Use the thread pool for that as it amortizes the cost of creating new threads. If you're on .NET 4 take a look at the new Task class.
If you need to pass parameters to a thread when starting it, you must use the ParameterizedThreadStart delegate. If you need to pass several parameters, consider encapsulating them in a type.
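On .NET 4, a sketch of the Task-based equivalent, capturing the parameters in a closure rather than going through ParameterizedThreadStart (field names are taken from the question):

var tasks = new List<Task>();
foreach (Asset asset in assetList)
{
    Asset localAsset = asset; // local copy, as in the answer above
    tasks.Add(Task.Factory.StartNew(() =>
        localAsset.ContributePrice(m_frontMonthPrice, m_Vol, m_divisor, m_refPrice, m_type,
            m_overrideVol, i, m_decimalPlaces, metalUSDFID, metalEURFID)));
}
Task.WaitAll(tasks.ToArray()); // only needed if the caller has to wait for every asset to finish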
You could use ParameterizedThreadStart. You'll need to wrap all of your parameters into a single object. (Untested code below).
struct ContributePriceParams
{
public decimal FrontMonthPrice;
public int Vol;
//etc
}
//...
foreach(Asset asset in assetList)
{
ContributePriceParams pStruct = new ContributePriceParams() { FrontMonthPrice = m_frontMonthPrice, Vol = m_vol };
// assumes ContributePrice has (or is given) an overload that takes the struct as its single object parameter
ParameterizedThreadStart pStart = new ParameterizedThreadStart(asset.ContributePrice);
Thread newThread = new Thread(pStart);
newThread.Start(pStruct);
}
