Posting multiple documents to CouchDB using myCouch - c#

I'm migrating a SQL database to CouchDB. I'm having a problem when I post multiple documents, say around 8K document ids. Code below:
MyClass cl = new MyClass();
foreach (DataRow row in dteqEvent.Rows)
{
    NewSnDocument pn = new NewSnDocument();
    pn.id = row[1].ToString(); //this is the document id
    pn.val = row[2].ToString();
    string json = JsonConvert.SerializeObject(pn);
    cl.PostToCouch(json); //method under MyClass to post documents
}
Then under MyClass I have the method below:
public async void PostToCouch(string json)
{
    using (var client = new MyCouchClient(HostServer, Database))
    {
        var resp = await client.Documents.PostAsync(json);
        Console.WriteLine(resp.StatusCode);
    }
}
The first 2K ids are POSTed successfully, then it gives me an error after that. The error says: "Unable to connect to the remote server." The InnerException states: "No connection could be made because the target machine actively refused it." Does this have something to do with my CouchDB configuration?
Is there an alternative way of POSTing multiple documents? I saw a bulk operation in MyCouch, but it is not clear to me: https://github.com/danielwertheim/mycouch/wiki/documentation#bulk-operations
Thanks in advance!
UPDATE:
Alright, I managed to solve my problem by tweaking the code a little bit:
MyClass cl = new MyClass();
List<NewSnDocument> pnList = new List<NewSnDocument>();
foreach (DataRow row in dteqEvent.Rows)
{
    NewSnDocument pn = new NewSnDocument();
    pn.id = row[1].ToString(); //this is the document id
    pn.val = row[2].ToString();
    pnList.Add(pn);
}
cl.PostToCouch(pnList);
Then method under MyClass:
public async void PostToCouch(List<NewSnDocument> obj)
{
    int r = obj.Count;
    using (var client = new MyCouchClient(HostServer, Database))
    {
        for (int i = 0; i < r; i++)
        {
            string json = JsonConvert.SerializeObject(obj[i]);
            var resp = await client.Documents.PostAsync(json);
            Console.WriteLine(resp.StatusCode);
        }
    }
}
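One further tweak worth sketching: if PostToCouch returns Task instead of void, the calling code can await the whole import, which keeps the single MyCouchClient alive until every POST has completed. A minimal sketch, reusing only the calls already shown above:

// Minimal sketch only: same logic, but returning Task so the caller can await
// the whole import (requires System.Threading.Tasks).
public async Task PostToCouchAsync(List<NewSnDocument> obj)
{
    using (var client = new MyCouchClient(HostServer, Database))
    {
        foreach (var doc in obj)
        {
            string json = JsonConvert.SerializeObject(doc);
            var resp = await client.Documents.PostAsync(json);
            Console.WriteLine(resp.StatusCode);
        }
    }
}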

I think even your updated code doesn't look right. I'm not sure; please take a look at the comments/modifications I made in your code:
MyClass cl = new MyClass();
List<NewSnDocument> pnList = new List<NewSnDocument>(); //List of documents
foreach (DataRow row in dteqEvent.Rows)
{
    NewSnDocument pn = new NewSnDocument();
    pn.id = row[1].ToString();
    pn.val = row[2].ToString();
    // cl.PostToCouch(pnList);
    pnList.Add(pn); //You need to add ("push") each document to the list of documents;
                    //I'm not used to C#, but List<T>.Add is the equivalent of "push"
}
cl.PostToCouch(pnList); //"pnList" contains the full list of documents,
                        //so it should be posted to CouchDB outside the "foreach" loop,
                        //after all documents have been added to it
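As for the bulk operation mentioned in the question: the MyCouch wiki page linked above documents a BulkRequest that is sent with Documents.BulkAsync, which posts all documents in a single _bulk_docs call instead of one HTTP request per document. A rough sketch under that assumption (double-check the exact member names against the MyCouch version you are using):

// Rough sketch of MyCouch's bulk API (per the wiki page linked in the question):
// serialize all documents first, then send them in one request.
public async Task BulkPostToCouch(List<NewSnDocument> docs)
{
    using (var client = new MyCouchClient(HostServer, Database))
    {
        var jsonDocs = docs.Select(d => JsonConvert.SerializeObject(d)).ToArray();
        var bulkRequest = new BulkRequest().Include(jsonDocs); // Include takes the raw JSON documents
        var response = await client.Documents.BulkAsync(bulkRequest);
        Console.WriteLine(response.StatusCode);
    }
}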

Related

Get ElasticSearch bulk queue status with NEST

I have a program that performs several bulk index operations on an ElasticSearch cluster. At some point, I start getting errors like this one (snipped):
RemoteTransportException[...][indices:data/write/bulk[s]]]; nested: EsRejectedExecutionException[rejected execution (queue capacity 100) ...];
Is there a way I can verify the status of the bulk upload queue, ideally using NEST, so that I can slow down the client application in case I see that the queue on the server is getting full?
The NodesInfo method looks interesting, but I don't see how to access the information I need:
using Nest;
using System;

class Program {
    static void Main(string[] args) {
        ElasticClient client = new ElasticClient(new ConnectionSettings(new Uri("http://whatever:9200/")));
        var nodesInfoResponse = client.NodesInfo();
        if (nodesInfoResponse.IsValid) {
            foreach (var n in nodesInfoResponse.Nodes) {
                Console.WriteLine($"Node: {n.Key}");
                var bulk = n.Value.ThreadPool["bulk"];
                // ???
            }
        }
    }
}
You need to use NodesStats() and not NodesInfo().
var nodesStatsResponse = client.NodesStats();
if (nodesStatsResponse.IsValid)
{
    foreach (var node in nodesStatsResponse.Nodes)
    {
        long bulkThreadPoolQueueSize = node.Value.ThreadPool["bulk"].Queue;
    }
}
UPDATE:
The above query will bring in a lot more information than required. A much leaner request for the same information uses the _cat/thread_pool API. See below:
var catThreadPoolResponse = client.CatThreadPool(d => d.H("host", "bulk.queue"));
if (catThreadPoolResponse.IsValid)
{
    foreach (var record in catThreadPoolResponse.Records)
    {
        string nodeName = record.Host;
        long bulkThreadPoolQueueSize = int.Parse(record.Bulk.Queue);
        Console.WriteLine($"Node [{nodeName}] : BulkThreadPoolQueueSize [{bulkThreadPoolQueueSize}]");
    }
}
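As a follow-up, here is a rough sketch of how that queue value could be used to throttle the client before the next bulk call. The NodesStats call and the ThreadPool["bulk"].Queue property are the ones shown above; the 80-item threshold and the 5-second delay are arbitrary illustrative choices, not recommendations (requires System.Linq and System.Threading):

// Rough sketch: before sending the next bulk request, poll the bulk thread pool
// queue and back off while it looks close to its capacity (100 per the error message).
const long QueueSoftLimit = 80; // arbitrary threshold below the default capacity

long bulkQueue;
do
{
    var stats = client.NodesStats();
    bulkQueue = stats.IsValid
        ? stats.Nodes.Max(n => n.Value.ThreadPool["bulk"].Queue)
        : 0;
    if (bulkQueue > QueueSoftLimit)
        Thread.Sleep(TimeSpan.FromSeconds(5)); // crude backoff before polling again
} while (bulkQueue > QueueSoftLimit);

// ... issue the next bulk index request here ...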

Xamarin ListView display item from SQLite Database C# Android

In my case I wanted to display items from a local SQLite database, which I created as shown below:
public string CreateDB() //create database
{
    var output = "";
    string dbPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "IsoModule.db3");
    output = "Database Created";
    return output;
}
public string CreateTable() //create table
{
    try
    {
        string dbPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "IsoModule.db3");
        var db = new SQLiteConnection(dbPath);
        db.CreateTable<UserInfo>();
        db.CreateTable<TableInfo>();
        string result = "Table(s) created";
        return result;
    }
    catch (Exception ex)
    {
        return ("Error" + ex.Message);
    }
}
And this is my code where I wish to retrieve the data:
string path = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "IsoModule.db3");
var tablelistout = new SQLiteConnection(path);
var alltables = tablelistout.Table<TableInfo>();
foreach (var listing in alltables)
{
    var from = new string[]
    {
        listing.tname + " - " + listing.status
    };
    ListView listtable = (ListView)FindViewById(Resource.Id.listtable);
    listtable.Adapter = new ArrayAdapter(this, Android.Resource.Layout.SimpleListItem1, from);
}
The code runs with no error, but it only displays the last item in the table. It is confusing me, so I would like to ask: how can I retrieve all the data from a specific table?
Or if someone has asked the same question, please share the link. Much appreciated.
var alltables = tablelistout.Table<TableInfo>();
var data = new List<string>();
foreach (var listing in alltables)
{
    data.Add(listing.tname + " - " + listing.status);
}
ListView listtable = (ListView)FindViewById(Resource.Id.listtable);
listtable.Adapter = new ArrayAdapter(this, Android.Resource.Layout.SimpleListItem1, data.ToArray());
All I did was move two things out of the loop. First, I moved out the initialization of the collection that backs the adapter. Second, I moved out the ListView lookup and the assignment of the adapter.
Your issue is that inside the loop you were overwriting everything you had done in the previous iteration (leaving you with only the last item, as you said).
Also, note that it will be important to create a custom adapter if you plan on having a decent amount of data. ArrayAdapter is a native Android class which is wrapped by a C# Xamarin object, meaning you will have both a C# and a Java object per row. That adds overhead, since both garbage collectors will have work to do, and it can cause performance issues. Xamarin developers generally avoid it except for quick prototyping (a minimal sketch of a custom adapter follows below).
On another note, I would use FindViewById<T>(Int32) instead of FindViewById(Int32) followed by a cast: FindViewById<ListView>(Resource.Id.listtable) in your case. Just taking advantage of the power of generics.
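To make the custom-adapter suggestion concrete, here is a minimal sketch of a BaseAdapter<string> that recycles row views; the class name is illustrative, not part of the original code:

// Minimal sketch of a custom adapter: one C# wrapper per adapter (not per row),
// and row views are reused via convertView instead of being recreated.
public class TableRowAdapter : BaseAdapter<string>
{
    private readonly Activity _context;
    private readonly List<string> _items;

    public TableRowAdapter(Activity context, List<string> items)
    {
        _context = context;
        _items = items;
    }

    public override string this[int position] => _items[position];
    public override int Count => _items.Count;
    public override long GetItemId(int position) => position;

    public override View GetView(int position, View convertView, ViewGroup parent)
    {
        // Reuse the recycled row when Android provides one, otherwise inflate a new one.
        var view = convertView ?? _context.LayoutInflater.Inflate(
            Android.Resource.Layout.SimpleListItem1, parent, false);
        view.FindViewById<TextView>(Android.Resource.Id.Text1).Text = _items[position];
        return view;
    }
}

// Usage with the data list built in the answer above:
// listtable.Adapter = new TableRowAdapter(this, data);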

How do I bind a datatable in a controller to a model via a repository and a view?

I have been working on a project for months and always seem to get stuck on binding the results so that I can pass them to a view where a user can bind the results to a model. I created a repository, though I am unsure whether it works; I will work on that when I figure this part out.
My problem is the repository list: I cannot figure out how to add the csvData columns to the DCResultsRepository list DCResults1, which, as I have understood it, is what I should do.
Below is the code of the first controller; the problem is located in this area:
for (int y = 0; y < csvData.Columns.Count; y++)
{
    string dc = csvData.Columns[y].ColumnName.ToString();
    var repository = new DCResultsRepository {
        SelectedDCResults = dc
    };
    List<string> DCList DCResultsRepos.DCResults1 = new List<string>();
    var DC = DCResultsRepos.DCResults1(DCResultsRepos.SelectedDCResults.ToList());
}
DCResultsRepos.DCResults1.Contains(DCResultsRepos.SelectedDCResults);
This is the whole controller so that you can see what I am trying to do a bit more easily:
[HttpPost]
public ActionResult Import(HttpPostedFileBase file, string Delimiter, string Firstrow)
{
    DataTable csvData = new DataTable();
    if (file != null && file.ContentLength > 0)
    {
        try
        {
            using (TextFieldParser csvReader = new TextFieldParser(file.InputStream))
            {
                csvReader.SetDelimiters(new string[] { Delimiter });
                csvReader.HasFieldsEnclosedInQuotes = Firstrow == "true" ? true : false;
                string[] colFields = csvReader.ReadFields();
                foreach (string column in colFields)
                {
                    DataColumn DataColumn = new DataColumn(column);
                    DataColumn.AllowDBNull = true;
                    csvData.Columns.Add(DataColumn);
                }
                while (!csvReader.EndOfData)
                {
                    string[] fieldData = csvReader.ReadFields();
                    for (int i = 0; i < fieldData.Length; i++)
                    {
                        if (fieldData[i] == "")
                        {
                            fieldData[i] = null;
                        }
                    }
                    csvData.Rows.Add(fieldData);
                }
                DataSet csvdata = new DataSet();
                csvdata.Tables.Add(csvData);
            }
            BFProj2.Models.OurColumns o = new Models.OurColumns();
            DCResultsRepository DCResultsRepos = new DCResultsRepository();
            o.DCResults = new List<string>();
            for (int y = 0; y < csvData.Columns.Count; y++)
            {
                string dc = csvData.Columns[y].ColumnName.ToString();
                var repository = new DCResultsRepository {
                    SelectedDCResults = dc
                };
                //DCResultsRepos.DCResults1 = new List<string>();
                List<string> DCList DCResultsRepos.DCResults1 = new List<string>();
                var DC = DCResultsRepos.DCResults1(DCResultsRepos.SelectedDCResults.ToList());
            }
            DCResultsRepos.DCResults1.Contains(DCResultsRepos.SelectedDCResults);
        }
        catch (Exception ex)
        {
            //return View("Error" + ex.GetType().ToString());
        }
        // Should end up with the DCResults1 list containing the csvData here.
    }
    return View();
}
I am unsure how the repository should be designed; I will work on that after I have figured out how to bind csvData to the repository so that I can pass it on to a GET ActionResult. But in case it is important for answering the question, the repository looks like this:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using BFProj2.Models;
using System.Data.Entity;

namespace BFProj2
{
    public class DCResultsRepository
    {
        public List<string> DCResults1 { get; set; }
        public string SelectedDCResults { get; set; }
    }
}
Also, here is another question that I asked a short while back, which I have been trying to use while figuring out how to let the user bind the csvData results to a model: How do I bind a datatable to model and then to a dropdownlist in mvc?
Though I am still confused. I have looked at a lot of similar questions and read up on the topic, but there are always differences that make what I am trying to do seem impossible.
How am I supposed to bind the csvData columns to a model through a repository?
I think part of the problem is that your controller is clogged with logic that shouldn't be in the controller. This makes your own code harder to read and the same basic operations harder to write, because the method contains all kinds of operations that do not belong in a controller. The single responsibility principle states that every class should have a single responsibility, and that responsibility should be entirely encapsulated by the class; all its services should be narrowly aligned with that responsibility. The controller's responsibility is to mediate between the models and the view, not to parse data from files.
I would recommend reading up on implementing the repository and unit-of-work (UoW) patterns. Once these are in place you could go even further by separating queries and commands, in which case the data is bound to models by (data/query) handlers. This is all done in separate files, so that your controller methods stay clean and do what they are supposed to do (see the sketch below).
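To make that split concrete, here is a rough sketch; the ICsvParser abstraction and the property initializer are illustrative assumptions, not code from the project (requires System.Data, System.IO and System.Linq):

// Rough sketch only: parsing lives behind its own abstraction, and the controller
// just maps the parsed columns onto the repository/model it passes to the view.
public interface ICsvParser
{
    DataTable Parse(Stream input, string delimiter, bool hasHeaderRow);
}

public class DCResultsRepository
{
    public List<string> DCResults1 { get; set; } = new List<string>();
    public string SelectedDCResults { get; set; }
}

// Inside the controller action, the import then reduces to something like:
// var csvData = _csvParser.Parse(file.InputStream, Delimiter, Firstrow == "true");
// var repository = new DCResultsRepository
// {
//     DCResults1 = csvData.Columns.Cast<DataColumn>()
//                          .Select(c => c.ColumnName)
//                          .ToList()
// };
// return View(repository);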

Dynamics CRM SDK: Execute Multiple Requests for Bulk Update of around 5000 Records

I have written a function to update the Default Price List for all the Active Products in CRM 2013 Online.
//The method takes an IOrganizationService and the total number of records to be updated as input
private void UpdateMultipleProducts(IOrganizationService service, int batchSize, EntityCollection UpdateProductsCollection, Guid PriceListGuid)
{
    //To execute the request we have to reference Microsoft.Xrm.Sdk from the latest SDK
    ExecuteMultipleRequest req = new ExecuteMultipleRequest();
    req.Requests = new OrganizationRequestCollection();
    req.Settings = new ExecuteMultipleSettings();
    req.Settings.ContinueOnError = true;
    req.Settings.ReturnResponses = true;
    try
    {
        foreach (var entity in UpdateProductsCollection.Entities)
        {
            UpdateRequest updateRequest = new UpdateRequest { Target = entity };
            entity.Attributes["pricelevelid"] = new EntityReference("pricelevel", PriceListGuid);
            req.Requests.Add(updateRequest);
        }
        var res = service.Execute(req) as ExecuteMultipleResponse; //Execute the collection of requests
    }
    //If the batch size exceeds 1000, a fault will be thrown. In the catch block, divide the records into batch-sized chunks and retry
    catch (FaultException<OrganizationServiceFault> fault)
    {
        if (fault.Detail.ErrorDetails.Contains("MaxBatchSize"))
        {
            var allowedBatchSize = Convert.ToInt32(fault.Detail.ErrorDetails["MaxBatchSize"]);
            int remainingCreates = batchSize;
            while (remainingCreates > 0)
            {
                var recordsToCreate = Math.Min(remainingCreates, allowedBatchSize);
                UpdateMultipleProducts(service, recordsToCreate, UpdateProductsCollection, PriceListGuid);
                remainingCreates -= recordsToCreate;
            }
        }
    }
}
Code description: there are around 5000 active product records in the system, so I am updating the Default Price List for all of them using the above code.
But I am missing something here, because it has updated only 438 records. It loops through the while statement correctly, but it is not updating all of them.
What should the batch size be when we run this function for the first time?
Can anyone help me here?
Thank you,
Mittal.
You pass remainingCreates as the batchSize parameter, but your code never uses batchSize to limit how many requests it adds, so you are just going to re-enter that while loop and resend the whole collection every time.
Also, I'm not sure how you are doing the rest of your error handling, but you need to update your catch block so that it doesn't just swallow FaultExceptions that don't contain a MaxBatchSize value. Right now, if you hit a FaultException regarding something other than the batch size, it will be ignored.
{
    if (fault.Detail.ErrorDetails.Contains("MaxBatchSize"))
    {
        var allowedBatchSize = Convert.ToInt32(fault.Detail.ErrorDetails["MaxBatchSize"]);
        int remainingCreates = batchSize;
        while (remainingCreates > 0)
        {
            var recordsToCreate = Math.Min(remainingCreates, allowedBatchSize);
            UpdateMultipleProducts(service, recordsToCreate, UpdateProductsCollection, PriceListGuid);
            remainingCreates -= recordsToCreate;
        }
    }
    else throw;
}
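For completeness, here is a rough sketch of slicing the collection into server-acceptable batches up front, using the same SDK types as the question; the method name and the allowedBatchSize parameter are illustrative, and Skip/Take require System.Linq:

// Rough sketch only: send one ExecuteMultipleRequest per batch of allowedBatchSize,
// so the batch limit reported by the server is actually honoured.
private void UpdateInBatches(IOrganizationService service, EntityCollection products,
                             Guid priceListGuid, int allowedBatchSize)
{
    for (int start = 0; start < products.Entities.Count; start += allowedBatchSize)
    {
        var req = new ExecuteMultipleRequest
        {
            Requests = new OrganizationRequestCollection(),
            Settings = new ExecuteMultipleSettings { ContinueOnError = true, ReturnResponses = true }
        };

        foreach (var entity in products.Entities.Skip(start).Take(allowedBatchSize))
        {
            entity.Attributes["pricelevelid"] = new EntityReference("pricelevel", priceListGuid);
            req.Requests.Add(new UpdateRequest { Target = entity });
        }

        var res = (ExecuteMultipleResponse)service.Execute(req); // one Execute call per batch
    }
}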
Instead of reactive handling, I prefer proactive handling of MaxBatchSize; this works when you already know what the MaxBatchSize is.
Following is sample code. While adding each OrgRequest to the collection I keep a count of the batch, and when it reaches the limit I call Execute and reset the collection to take a fresh batch.
foreach (DataRow dr in statusTable.Rows)
{
    Entity updEntity = new Entity("ABZ_NBA");
    updEntity["ABZ_NBAid"] = query.ToList().Where(a => a.NotificationNumber == dr["QNMUM"].ToString()).FirstOrDefault().TroubleTicketId;
    //updEntity["ABZ_makerfccall"] = false;
    updEntity["ABZ_rfccall"] = null;
    updEntity[cNBAttribute.Key] = dr["test"];
    req.Requests.Add(new UpdateRequest() { Target = updEntity });
    if (req.Requests.Count == 1000)
    {
        responseWithResults = (ExecuteMultipleResponse)_orgSvc.Execute(req);
        req.Requests = new OrganizationRequestCollection();
    }
}
if (req.Requests.Count > 0)
{
    responseWithResults = (ExecuteMultipleResponse)_orgSvc.Execute(req);
}
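One note on the sample above: it assumes req was created before the loop with the same settings shown in the question, roughly:

var req = new ExecuteMultipleRequest
{
    Requests = new OrganizationRequestCollection(),
    Settings = new ExecuteMultipleSettings { ContinueOnError = true, ReturnResponses = true }
};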

C# Downloadable Excel Files from Class Library

I'm looking for some advice. I'm building an additional feature onto a C# project that someone else wrote. The solution consists of an MVC web application and a few class libraries.
What I'm editing is the sales reporting function. In the original build, a summary of the sales reports was generated in the web application. When the user generates the sales report, a Reporting class is called in one of the C# class libraries. I'm trying to make the sales reports downloadable as an Excel file when the user selects a radio button.
Here is a snippet of code from the Reporting class:
public AdminSalesReport GetCompleteAdminSalesReport(AdminSalesReportRequest reportRequest)
{
    AdminSalesReport report = new AdminSalesReport();
    string dateRange = null;
    List<ProductSale> productSales = GetFilteredListOfAdminProductSales(reportRequest, out dateRange);
    report.DateRange = dateRange;
    if (titleSales.Count > 0)
    {
        report.HasData = true;
        report.Total = GetTotalAdminSales(productSales);
        if (reportRequest.Type == AdminSalesReportRequest.AdminSalesReportType.Complete)
        {
            report.ProductSales = GetAdminProductSales(productSales);
            report.CustomerSales = GetAdminCustomerSales(productSales);
            report.ManufacturerSales = GetAdminManufacturerSales(productSales);
            if (reportRequest.Download)
            {
                FileResult ExcelDownload = GetExcelDownload(productSales);
            }
        }
    }
    return report;
}
So as you can see, if reportRequest.Download == true, the class should start up the process of creating the Excel file. All the GetAdminSales functions do is run LINQ queries to sort out the sales, as when they are being displayed on the web page.
So I have added this along with the GetAdminSales functions:
private FileResult GetExcelDownload(List<TitleSale> titleSales)
{
    CustomisedSalesReport CustSalesRep = new CustomisedSalesReport();
    Stream SalesReport = CustSalesRep.GenerateCustomisedSalesStream(productSales);
    return new FileStreamResult(SalesReport, "application/ms-excel")
    {
        FileDownloadName = "SalesReport" + DateTime.Now.ToString("MMMM d, yyy") + ".xls"
    };
}
To format the Excel sheet I'm using the NPOI library, and my formatter class is laid out like so:
public class CustomisedSalesReport
{
    public Stream GenerateCustomisedSalesStream(List<ProductSale> productSales)
    {
        return GenerateCustomisedSalesFile(productSales);
    }

    private Stream GenerateCustomisedSalesFile(List<ProductSale> productSales)
    {
        MemoryStream ms = new MemoryStream();
        HSSFWorkbook templateWorkbook = new HSSFWorkbook();
        HSSFSheet sheet = templateWorkbook.CreateSheet("Sales Report");
        HSSFRow dataRow = sheet.CreateRow(0);
        HSSFCell cell = dataRow.CreateCell(0);
        cell = dataRow.CreateCell(0);
        cell.SetCellValue(DateTime.Now.ToString("MMMM yyyy") + " Sales Report");
        dataRow = sheet.CreateRow(2);
        string[] colHeaders = new string[] {
            "Product Code",
            "Product Name",
            "Qty Sold",
            "Earnings",
        };
        int colPosition = 0;
        foreach (string colHeader in colHeaders)
        {
            cell = dataRow.CreateCell(colPosition++);
            cell.SetCellValue(colHeader);
        }
        int row = 4;
        var adminTotalSales = GetAdminProductSales(productSales);
        foreach (SummaryAdminProductSale t in adminTotalSales)
        {
            dataRow = sheet.CreateRow(row++);
            colPosition = 0;
            cell = dataRow.CreateCell(colPosition++);
            cell.SetCellValue(t.ProductCode);
            cell = dataRow.CreateCell(colPosition++);
            cell.SetCellValue(t.ProductName);
            cell = dataRow.CreateCell(colPosition++);
            cell.SetCellValue(t.QtySold);
            cell = dataRow.CreateCell(colPosition++);
            cell.SetCellValue(t.Total.ToString("0.00"));
        }
    }
    templateWorkbook.Write(ms);
    ms.Position = 0;
    return ms;
}
Again, as before, the GetAdminSales functions (GetAdminProductSales, etc.) are contained at the bottom of the class and are just LINQ queries that gather the data.
So when I run this, I don't get any obvious errors. The summary sales report appears on screen as normal, but no Excel document downloads. One thing that may be throwing this off: in my class library I have referenced the System.Web.Mvc DLL in order to download the file (I have not done it any other way before, and after reading up on the net I got the impression I could use it in a class library).
When I debug through the code to get a closer picture of what's going on, everything seems to be working OK and all the right data is being captured, but I found that from the very start, the MemoryStream ms = new MemoryStream() declaration line in my formatter class shows this (very hidden, mind you):
ReadTimeout: '((System.IO.Stream)(ms)).ReadTimeout' threw an exception of type
'System.InvalidOperationException': {"Timeouts are not supported on this stream."}
I get the same for 'WriteTimeout'...
Apologies for the long-windedness of the explanation. I'd appreciate it if anyone could point me in the right direction, either to solve my current issue or to suggest an alternative way of making this work.
Without getting bogged down in the details, the obvious error is that in GenerateCustomisedSalesFile you create a MemoryStream ms, do nothing with it, then return it.
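For reference, a minimal sketch of the shape the end of GenerateCustomisedSalesFile presumably needs: the workbook must be written into the stream inside the method, and the stream rewound, before it is returned. The sheet-population code from the question is elided here:

private Stream GenerateCustomisedSalesFile(List<ProductSale> productSales)
{
    MemoryStream ms = new MemoryStream();
    HSSFWorkbook templateWorkbook = new HSSFWorkbook();
    HSSFSheet sheet = templateWorkbook.CreateSheet("Sales Report");

    // ... build the header row and populate the data rows exactly as in the question ...

    templateWorkbook.Write(ms); // serialize the workbook into the MemoryStream
    ms.Position = 0;            // rewind so FileStreamResult reads from the beginning
    return ms;
}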
