Problem speeding up report rendering with FastReports - C#

I'm working on report rendering with FastReports, coming from a system that rendered with Crystal Reports. With Crystal, I found that preloading a report and then binding parameters on each request sped things up dramatically, since most of the time for a small layout like an invoice is spent in setup. I'm now trying to achieve the same with FastReports.
It's unclear how much time setup takes, however, so I'd also be interested to know whether this is a worthwhile endeavour at all.
My issue is that I'm using a JSON API call and a ConnectionStringExpression with a single parameter. In a nutshell, changing the parameter does not reload the data when I call Prepare.
Here's my code. With the second report load commented out, it renders the same report twice.
var report = new Report();
report.Load("C:\\dev\\ia\\products\\StratusCloud\\AppFiles\\Reports\\SalesQuoteItems.frx");

var urlTemplate = "http://localhost:9502/data/sales-quote/{CardCode#}/{DocEntry#}";
var reportParms = new Dictionary<string, dynamic>();
reportParms.Add("CardCode#", "C20000");
reportParms.Add("DocEntry#", 77);

var connectionstring = "Json=" + System.Text.RegularExpressions.Regex.Replace(urlTemplate, "{([^}]+)}", (m) => {
    if (reportParms.ContainsKey(m.Groups[1].Value))
    {
        return string.Format("{0}", reportParms[m.Groups[1].Value]);
    }
    return m.Value;
});

var dataapiparm = report.Dictionary.Parameters.FindByName("DataAPIUrl#");
if (dataapiparm != null)
{
    dataapiparm.Value = connectionstring;
}

foreach (FastReport.Data.Parameter P in report.Dictionary.Parameters)
{
    if (reportParms.ContainsKey(P.Name))
    {
        P.Value = reportParms[P.Name];
    }
}

report.Prepare();

var pdfExport = new PDFSimpleExport();
pdfExport.Export(report, "test1.pdf");

//report = new Report();
//report.Load("C:\\dev\\ia\\products\\StratusCloud\\AppFiles\\Reports\\SalesQuoteItems.frx");

reportParms["DocEntry#"] = 117;
connectionstring = "Json=" + System.Text.RegularExpressions.Regex.Replace(urlTemplate, "{([^}]+)}", (m) => {
    if (reportParms.ContainsKey(m.Groups[1].Value))
    {
        return string.Format("{0}", reportParms[m.Groups[1].Value]);
    }
    return m.Value;
});

dataapiparm = report.Dictionary.Parameters.FindByName("DataAPIUrl#");
if (dataapiparm != null)
{
    dataapiparm.Value = connectionstring;
}

foreach (FastReport.Data.Parameter P in report.Dictionary.Parameters)
{
    if (reportParms.ContainsKey(P.Name))
    {
        P.Value = reportParms[P.Name];
    }
}

report.Prepare();
pdfExport.Export(report, "test2.pdf");
Cheers,
Mark

FastReport definitely doesn't recalculate the ConnectionStringExpression on report.Prepare, so I went back to another method I had been looking at. It turns out that if the ConnectionString itself is rewritten, report.Prepare does refetch the data.
A bare connection string without a schema takes a long time to process, so I keep everything beyond the semicolon (the schema part), replace the URL portion of the connection string, and stick the same schema information back on the end.
Copying the schema information into each generated connection string seems to shave around 10 seconds off report.Prepare!
At the moment that's the best I can do, and I wonder if there is a more efficient way of rerunning the same report against new data (with the same schema).
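For reference, a minimal sketch of that workaround, assuming the report has a single JSON connection and that the schema definition follows the first semicolon of the connection string (the collection and property names may differ between FastReport versions, so treat this as an outline rather than exact API usage):
// Hypothetical sketch: reuse the schema part of the existing connection string and
// swap only the URL, so Prepare doesn't have to re-infer the JSON schema each time.
var conn = report.Dictionary.Connections[0];              // assumes one JSON connection
var oldCs = conn.ConnectionString;
var schemaPart = oldCs.Substring(oldCs.IndexOf(';'));     // ";<schema and other settings>"
conn.ConnectionString = "Json=" + newUrl + schemaPart;    // newUrl built from the parameters as before
report.Prepare();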

How to vertically format embedded fields

Current Formatting For Embed Fields
Here is an embed I currently use for my semi-public Ark servers.
The first field is the map name,
the second field is the direct connect IP address,
and the third field is whether/where there is a community base on that map.
As you can see it works as intended, but if there's too much info on a single line in a field the formatting is screwed up. Is there a way to fix this?
I'm using three separate StringBuilders to build the different fields and then adding them to the embed. If code is needed I can post a "dumbed down version" so it doesn't take up the whole page.
var linkHeading = "steam://connect/";
var sb = new StringBuilder();
var sb2 = new StringBuilder();
var sb3 = new StringBuilder();

var embed = new EmbedBuilder();
embed.WithColor(new Color(0, 255, 0));
embed.Title = "List of Server Ips";

JObject o1;
using (StreamReader file = File.OpenText("serverips.json"))
using (JsonTextReader reader = new JsonTextReader(file))
{
    o1 = (JObject)JToken.ReadFrom(reader);
}
var ipsObject = JsonConvert.DeserializeObject<Rootobject>(o1.ToString());

sb.AppendLine("The Island: ");
sb2.AppendLine($"{linkHeading}{ipsObject.TheIsland.ip}:{ipsObject.TheIsland.port}/");
if (ipsObject.TheIsland.comm != "")
{
    sb3.AppendLine($"Comm: {ipsObject.TheIsland.comm}");
}
else { sb3.AppendLine($"No Comm Info Available"); }

sb.AppendLine("Aberration: ");
sb2.AppendLine($"{linkHeading}{ipsObject.Aberration.ip}:{ipsObject.Aberration.port}/");
if (ipsObject.Aberration.comm != "")
{
    sb3.AppendLine($"Comm: {ipsObject.Aberration.comm}");
}
else { sb3.AppendLine($"No Comm Info Available"); }

embed.WithDescription($"Cluster Ip and Comm Information");
embed.AddField(x =>
{
    x.Name = "Map";
    x.Value = sb.ToString();
    x.IsInline = true;
});
embed.AddField(x =>
{
    x.Name = "IP";
    x.Value = sb2.ToString();
    x.IsInline = true;
});
embed.AddField(x =>
{
    x.Name = "Comm?";
    x.Value = sb3.ToString();
    x.IsInline = true;
});

await Context.User.SendMessageAsync(null, false, embed.Build());
await ReplyAsync("Server Ip List was sent directly to your inbox! :)");
You don't have much control over how embed fields are displayed. The only thing you control with regard to fields is whether they are inline or not. The rendering is completely up to Discord and the end user's screen size. For example, on mobile your current output will ignore the inline setting and list the fields one on top of the other instead of side by side.
Unless your fields each consistently contain a small amount of text, you can't guarantee how the end user will see the output. If you need a consistent, structured display across all devices, your best bet is to use an image.

Xamarin ListView display item from SQLite Database C# Android

In my case I wanted to display items from a local SQLite database, which I created as shown below:
public string CreateDB() //create database
{
    var output = "";
    string dbPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "IsoModule.db3");
    output = "Database Created";
    return output;
}

public string CreateTable() //create table
{
    try
    {
        string dbPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "IsoModule.db3");
        var db = new SQLiteConnection(dbPath);
        db.CreateTable<UserInfo>();
        db.CreateTable<TableInfo>();
        string result = "Table(s) created";
        return result;
    }
    catch (Exception ex)
    {
        return ("Error" + ex.Message);
    }
}
And this is my code where I wish to retrieve the data:
string path = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "IsoModule.db3");
var tablelistout = new SQLiteConnection(path);
var alltables = tablelistout.Table<TableInfo>();

foreach (var listing in alltables)
{
    var from = new string[]
    {
        listing.tname + " - " + listing.status
    };
    ListView listtable = (ListView)FindViewById(Resource.Id.listtable);
    listtable.Adapter = new ArrayAdapter(this, Android.Resource.Layout.SimpleListItem1, from);
}
The code runs with NO ERROR, but it only displays the last item in the table. It is confusing me, so I would like to ask: how can I retrieve all the data from a specific table?
Or if someone has asked the same question, please share the link. Much appreciated.
var alltables = tablelistout.Table<TableInfo>();
var data = new List<string>();

foreach (var listing in alltables)
{
    data.Add(listing.tname + " - " + listing.status);
}

ListView listtable = (ListView)FindViewById(Resource.Id.listtable);
listtable.Adapter = new ArrayAdapter(this, Android.Resource.Layout.SimpleListItem1, data.ToArray());
All I did was move two things out of the loop. First, I moved the construction of the item collection out (building a single List<string> instead of a one-element array per iteration). Second, I moved the ListView lookup and the adapter assignment out.
Your issue is that inside the loop you were overwriting everything you had done in the previous iteration (leaving you with only the last item, as you said).
Also, note that it will be important to create a custom adapter if you plan on having a decent amount of data. ArrayAdapter is a native Android class wrapped by a C# Xamarin object, meaning you get both a C# and a Java object per row. That adds overhead, since both garbage collectors have work to do, and it can cause performance issues. Xamarin devs generally avoid it except for quick prototyping.
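As a rough illustration, here is a minimal custom adapter sketch (the class name and row text are assumptions, not the poster's code; it needs the Android.App, Android.Views and Android.Widget namespaces). It renders each TableInfo row as "tname - status" while reusing recycled row views:
// Minimal sketch of a typed adapter that avoids ArrayAdapter.
class TableInfoAdapter : BaseAdapter<string>
{
    private readonly Activity context;
    private readonly List<string> items;

    public TableInfoAdapter(Activity context, List<string> items)
    {
        this.context = context;
        this.items = items;
    }

    public override string this[int position] => items[position];
    public override int Count => items.Count;
    public override long GetItemId(int position) => position;

    public override View GetView(int position, View convertView, ViewGroup parent)
    {
        // Reuse the recycled row view when Android hands one back.
        var view = convertView ?? context.LayoutInflater.Inflate(Android.Resource.Layout.SimpleListItem1, parent, false);
        view.FindViewById<TextView>(Android.Resource.Id.Text1).Text = items[position];
        return view;
    }
}
// Usage: listtable.Adapter = new TableInfoAdapter(this, data);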
On another note, I would use FindViewById<T>(Int32) instead of FindViewById(Int32) with a cast: FindViewById<ListView>(Resource.Id.listtable) in your case. It just takes advantage of the power of generics.

Multithreading with Windows Store Applications

We are currently creating a Windows Store application which pulls information from an RSS feed and puts it into an ObservableCollection. The issue we are having is that while the information is being fetched, the application's UI becomes unresponsive.
To get around this, I thought about creating a new thread and calling the method on it. However, after some research we realised that this is no longer possible in Windows Store apps. How can we get around this?
The method that collects the information is below.
public void getFeed()
{
    setupImages();

    string[] feedUrls = new string[] {
        "http://www.igadgetos.co.uk/blog/category/gadget-news/feed/",
        "http://www.igadgetos.co.uk/blog/category/gadget-reviews/feed/",
        "http://www.igadgetos.co.uk/blog/category/videos/feed/",
        "http://www.igadgetos.co.uk/blog/category/gaming/feed/",
        "http://www.igadgetos.co.uk/blog/category/jailbreak-2/feed/",
        "http://www.igadgetos.co.uk/blog/category/kickstarter/feed/",
        "http://www.igadgetos.co.uk/blog/category/cars-2/feed/",
        "http://www.igadgetos.co.uk/blog/category/software/feed/",
        "http://www.igadgetos.co.uk/blog/category/updates/feed/"
    };

    try
    {
        XNamespace dc = "http://purl.org/dc/elements/1.1/";
        XNamespace content = "http://purl.org/rss/1.0/modules/content/";

        foreach (var feedUrl in feedUrls)
        {
            var doc = XDocument.Load(feedUrl);
            var feed = doc.Descendants("item").Select(c => new ArticleItem() //Creates a copy of the ArticleItem Class.
            {
                Title = c.Element("title").Value,
                //There are another 4 of these.
                Post = stripTags(c.Element(content + "encoded").Value)
            }).OrderByDescending(c => c.PubDate);

            this.moveItems = feed.ToList();
            foreach (var item in moveItems)
            {
                item.ID = feedItems.Count;
                feedItems.Add(item);
            }
        }
        lastUpdated = DateTime.Now;
    }
    catch
    {
        MessageDialog popup = new MessageDialog("An error has occured downloading the feed, please try again later.");
        popup.Commands.Add(new UICommand("Okay"));
        popup.Title = "ERROR";
        popup.ShowAsync();
    }
}
How can we keep the application from freezing while we fetch this information, given that the Thread class is not available in Windows Store applications?
E.g. we had planned to use:
Thread newThread = new Thread(getFeed);
newThread.Start();
You need to use the well-documented async pattern for operations that would otherwise block the UI thread. The link given by Paul-Jan in the comments is where you need to start: http://msdn.microsoft.com/en-us/library/windows/apps/hh994635.aspx
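As a rough illustration of that pattern (not the poster's exact code; feedUrls and the parsing step are assumed from the question, and it needs System.Net.Http, System.Xml.Linq and System.Threading.Tasks), downloading each feed with HttpClient and awaiting the result keeps the UI responsive:
// Minimal async sketch: the download is awaited, so the UI thread is never
// blocked while the feed data is being fetched.
public async Task GetFeedAsync()
{
    var client = new HttpClient();

    foreach (var feedUrl in feedUrls)
    {
        // Runs asynchronously; control returns to the UI while the download completes.
        string xml = await client.GetStringAsync(feedUrl);
        var doc = XDocument.Parse(xml);

        // ... same item parsing as in getFeed(), then add the results to the
        // ObservableCollection -- the continuation resumes on the UI thread.
    }

    lastUpdated = DateTime.Now;
}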

Out of Memory at line XXXX

Can anyone help me resolve the out-of-memory error on my ASP page? I'm using LINQ to SQL. After adding several rows of data (more than 10 or so) to the grid, an out-of-memory error occurs. Attached below is my add function.
public ServiceDetail checkservicedetailid()
{
    string ServiceName = ViewState["Tab"].ToString();
    ServiceDetail checkservicedetailid = ServiceDetails_worker.get(a => a.ServiceName == ServiceName && a.MarginAnalysisID == checkmarginanalysisid().MarginAnalysisID).SingleOrDefault();
    return checkservicedetailid;
}

public IEnumerable<ServiceDetail> get(Expression<Func<ServiceDetail, Boolean>> express)
{
    return ServiceDetailsDB.ServiceDetails.Where(express);
}

protected void btnSaveEmptyOC_Click(object sender, EventArgs e)
{
    try
    {
        if (checkservicedetailid() != null)
        {
            CashExpense tblCashExpenses = new CashExpense();
            Guid CashExpensesID = Guid.NewGuid();

            tblCashExpenses.CashExpensesID = CashExpensesID;
            tblCashExpenses.ServiceDetailsID = checkservicedetailid().ServiceDetailsID;
            tblCashExpenses.Description = txtDescriptionEmptyOC.Text;
            tblCashExpenses.Quantity = Decimal.Parse(txtQTYEmptyOC.Text);
            tblCashExpenses.UnitCost = Decimal.Parse(txtUnitCostEmptyOC.Text);
            tblCashExpenses.CreatedBy = User.Identity.Name;
            tblCashExpenses.DateCreated = DateTime.Now;
            tblCashExpenses.CashExpensesTypeID = "OTHER";

            CashExpenses_worker.insert(tblCashExpenses);
            CashExpenses_worker.submit();

            //Clear items after saving
            txtDescriptionEmptyOC.Text = "";
            txtQTYEmptyOC.Text = "";
            txtUnitCostEmptyOC.Text = "";

            ValidationMessage.ShowValidationMessage(MessageCenter.CashExpenseMaintenace.InsertOC2, "SaveEmptyOC", this.Page);
            MyAuditProvider.Insert(this.GetType().ToString(), ViewState["MarginAnalysisID"].ToString(), MessageCenter.Mode.ADD, MessageCenter.CashExpenseMaintenace.InsertOC2, Page.Request, User);

            divOtherCost.Visible = false;
            grd_othercost.Visible = true;
            btnaddothercost.Visible = true;
        }
        else
        {
            //Displays a Message on the Validation Summary (Service Id does not exist)
            ValidationMessage.ShowValidationMessage(MessageCenter.CashExpenseMaintenace.SaveServiceDetailOC, "SaveEmptyOC", this.Page);
        }
    }
    catch
    {
        //Displays a Message on the Validation Summary (Error on Saving)
        ValidationMessage.ShowValidationMessage(MessageCenter.CashExpenseMaintenace.InsertOCError, "SaveEmptyOC", this.Page);
    }
    finally
    {
        //Rebinds the Grid
        populategrd_othercost();
    }
}
I'm guessing from your code here:
ServiceDetail checkservicedetailid = ServiceDetails_worker.get(
a => a.ServiceName == ServiceName &&
a.MarginAnalysisID == checkmarginanalysisid().MarginAnalysisID
).SingleOrDefault();
that .get() is taking a Func<SomeType, bool>, and you are doing something like:
var row = dbCtx.SomeTable.Where(predicate);
(please correct me here if I'm incorrect)
This, however, is using LINQ-to-Objects, meaning it is loading every row from the table to the client and testing it locally. That will hurt memory, especially if a different db-context is created for each row. Additionally, the checkmarginanalysisid() call is executed per row, when presumably it doesn't change between rows.
You should be testing this with an Expression<Func<SomeType, bool>>, which would be translated to TSQL and executed at the server. You may also need to hoist out untranslatable methods, for example:
var marginAnalysisId = checkmarginanalysisid().MarginAnalysisID;
ServiceDetail checkservicedetailid = ServiceDetails_worker.get(
a => a.ServiceName == ServiceName &&
a.MarginAnalysisID == marginAnalysisId
).SingleOrDefault();
where that is get(Expression<Func<SomeType, bool>>).
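To make the distinction concrete, here is a sketch of the two repository signatures being contrasted (alternative definitions, not overloads to use side by side; it assumes ServiceDetailsDB.ServiceDetails is a LINQ to SQL Table<ServiceDetail>, and the names follow the question):
// With Func<T, bool>, Where binds to Enumerable.Where: every ServiceDetail row is
// pulled from the database and the predicate runs on the client.
public IEnumerable<ServiceDetail> get(Func<ServiceDetail, bool> predicate)
{
    return ServiceDetailsDB.ServiceDetails.Where(predicate);
}

// With Expression<Func<T, bool>>, Where binds to Queryable.Where: the predicate is
// translated to TSQL and only the matching rows come back from the server.
public IQueryable<ServiceDetail> get(Expression<Func<ServiceDetail, bool>> predicate)
{
    return ServiceDetailsDB.ServiceDetails.Where(predicate);
}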
I tried all of the solutions given to me, both by my peers and the one provided here, from GC.Collect to disposing the LINQ DataContext after use, etc., but the error kept occurring. I then tried removing the UpdatePanel, after reading a site that showed how badly an UpdatePanel handles data when a function runs repeatedly. And poof! The memory problem is gone!

Adding an AsParallel() call causes my code to break on writing a file

I'm building a console application that has to process a bunch of documents.
To keep it simple, the process is:
for each year between X and Y, query the DB to get a list of document references to process
for each of these references, process a local file
The process method is, I think, independent and can be parallelized as long as the input args are different:
private static bool ProcessDocument(DocumentsDataset.DocumentsRow d, string langCode)
{
    try
    {
        var htmFileName = d.UniqueDocRef.Trim() + langCode + ".htm";
        var htmFullPath = Path.Combine(@"x:\path", htmFileName);

        var missingHtmlFile = !File.Exists(htmFullPath);
        if (!missingHtmlFile)
        {
            var html = File.ReadAllText(htmFullPath);

            // ProcessHtml is quite long: it uses a regex search for a list of references
            // which are other documents, then sends the result to a custom WS
            ProcessHtml(ref html);

            File.WriteAllText(htmFullPath, html);
        }
        return true;
    }
    catch (Exception exc)
    {
        Trace.TraceError("{0,8}Fail processing {1} : {2}", "[FATAL]", d.UniqueDocRef, exc.ToString());
        return false;
    }
}
In order to enumerate my document, I have this method :
private static IEnumerable<DocumentsDataset.DocumentsRow> EnumerateDocuments()
{
return Enumerable.Range(1990, 2020 - 1990).AsParallel().SelectMany(year => {
return Document.FindAll((short)year).Documents;
});
}
Document is a business class that wraps the retrieval of documents. The output of this method is a typed dataset (I'm returning the Documents table). The method takes a year, and I'm sure a document can't be returned for more than one year (year is actually part of the key).
Note the use of AsParallel() here, but I never had an issue with this one.
Now, my main method is:
var documents = EnumerateDocuments();

var result = documents.Select(d => {
    bool success = true;
    foreach (var langCode in new string[] { "-e", "-f" })
    {
        success &= ProcessDocument(d, langCode);
    }
    return new {
        d.UniqueDocRef,
        success
    };
});

using (var sw = File.CreateText("summary.csv"))
{
    sw.WriteLine("Level;UniqueDocRef");
    foreach (var item in result)
    {
        string level;
        if (!item.success) level = "[ERROR]";
        else level = "[OK]";

        sw.WriteLine("{0};{1}", level, item.UniqueDocRef);
        //sw.WriteLine(item);
    }
}
This works as expected in this form. However, if I replace
var documents = EnumerateDocuments();
with
var documents = EnumerateDocuments().AsParallel();
it stops working, and I don't understand why.
The error appears exactly here (in my process method):
File.WriteAllText(htmFullPath, html);
It tells me that the file is already open in another program.
I don't understand what could cause my program not to work as expected. Since my documents variable is an IEnumerable returning unique values, why is my process method breaking?
Thanks for any advice.
[Edit] Code for retrieving documents:
/// <summary>
/// Get all documents in data store
/// </summary>
public static DocumentsDS FindAll(short? year)
{
    Database db = DatabaseFactory.CreateDatabase(connStringName); // MS Entlib
    DbCommand cm = db.GetStoredProcCommand("Document_Select");

    if (year.HasValue) db.AddInParameter(cm, "Year", DbType.Int16, year.Value);

    string[] tableNames = { "Documents", "Years" };

    DocumentsDS ds = new DocumentsDS();
    db.LoadDataSet(cm, ds, tableNames);

    return ds;
}
[Edit2] A possible source of my issue, thanks to mquander. If I write:
var test = EnumerateDocuments().AsParallel().Select(d => d.UniqueDocRef);
var testGr = test.GroupBy(d => d).Select(d => new { d.Key, Count = d.Count() }).Where(c => c.Count > 1);
var testLst = testGr.ToList();
Console.WriteLine(testLst.Where(x => x.Count == 1).Count());
Console.WriteLine(testLst.Where(x => x.Count > 1).Count());
I get this result:
0
1758
Removing the AsParallel gives the same output.
Conclusion: something is wrong with my EnumerateDocuments and it returns each document twice. I'll have to dig into that, I think. My source enumeration is probably the culprit.
I suggest you have each task put the file data into a global queue and have a separate thread take writing requests from the queue and do the actual writing.
In any case, the performance of writing in parallel to a single disk is much worse than writing sequentially, because the disk needs to seek to the next write location each time, so you are just bouncing the disk around between seeks. It's better to do the writes sequentially.
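A minimal sketch of that queue idea (names are assumptions, not the poster's code; it needs System.Collections.Concurrent and System.Threading.Tasks): workers enqueue (path, content) pairs into a BlockingCollection, and a single writer task drains it, so only one thread ever touches the disk.
// Workers enqueue results instead of calling File.WriteAllText directly.
var writeQueue = new BlockingCollection<Tuple<string, string>>();

var writerTask = Task.Run(() =>
{
    // GetConsumingEnumerable blocks until items arrive and ends after CompleteAdding.
    foreach (var item in writeQueue.GetConsumingEnumerable())
    {
        File.WriteAllText(item.Item1, item.Item2);   // sequential, one write at a time
    }
});

// In ProcessDocument, replace the direct write with:
//     writeQueue.Add(Tuple.Create(htmFullPath, html));

// After all documents have been processed:
writeQueue.CompleteAdding();
writerTask.Wait();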
Is Document.FindAll((short)year).Documents thread-safe? The difference between the first and second versions is that in the second (broken) version, this call runs multiple times concurrently. That could plausibly be the cause of the issue.
It sounds like you're trying to write to the same file. Only one thread/program can write to a file at a given time, so you can't parallelize that write.
If you're reading from the same file, you need to open it with read-only access so as not to put a write lock on it.
The simplest way to fix the issue is to place a lock around your File.WriteAllText call, assuming the write itself is fast and it's worth parallelizing the rest of the code.
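For illustration, a minimal sketch of that locking suggestion (assuming ProcessDocument otherwise stays as in the question; the field name is an assumption):
// A single static gate shared by all parallel workers.
private static readonly object writeLock = new object();

// Inside ProcessDocument, replace the direct write with:
lock (writeLock)
{
    File.WriteAllText(htmFullPath, html);   // only one thread writes at a time
}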
