Best way to embed image coming from database in html view - c#

I'm using a byte[] array to store image data in the database, with a column defined like this:
ITEM_IMAGE VARBINARY(MAX),
And then when I retrieve the image and display it, I proceed like this:
<img src="data:image/png;base64, #(Convert.ToBase64String(Model.mChildCard.NormalImage))" alt="#Model.mChildCard.mCardName" title="#Model.mChildCard.mCardName" class="nullify"/>
I do this because I cannot guarantee that our application will have write access on the server where it is deployed, so instead of storing the images as regular files (and there are a LOT of images, 70k and more), we chose to store them in the database and retrieve them from there.
Now I want to make sure this is the best way of handling those files in Razor views, as there may be a lot of images displayed at once. Will it have an impact on rendering speed? What load will it put on the database? Is there a better way to do things?

public FileStreamResult GetDBImage(string imageId)
{
    using (var conn = GetConnection())
    {
        conn.Open();
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText = "SELECT ITEM_IMAGE FROM ... WHERE id = @id";
            cmd.Parameters.AddWithValue("@id", imageId);
            using (var rdr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                rdr.Read();
                // Buffer into a MemoryStream so the result can still be written
                // after the reader and connection are disposed; returning
                // rdr.GetStream(0) directly would fail once the using blocks close.
                var ms = new MemoryStream();
                rdr.GetStream(0).CopyTo(ms);
                ms.Position = 0;
                return File(ms, "image/png");
            }
        }
    }
}
Also, consider using async so a request thread is not blocked on the database I/O.
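A minimal async sketch of the same action, assuming GetConnection() returns a SqlConnection and using the same query as above:
public async Task<FileStreamResult> GetDBImage(string imageId)
{
    using (var conn = GetConnection())
    {
        await conn.OpenAsync();
        using (var cmd = conn.CreateCommand())
        {
            cmd.CommandText = "SELECT ITEM_IMAGE FROM ... WHERE id = @id";
            cmd.Parameters.AddWithValue("@id", imageId);
            using (var rdr = await cmd.ExecuteReaderAsync(CommandBehavior.SequentialAccess))
            {
                await rdr.ReadAsync();
                // Buffer into memory so the stream outlives the reader and connection.
                var ms = new MemoryStream();
                await rdr.GetStream(0).CopyToAsync(ms);
                ms.Position = 0;
                return File(ms, "image/png");
            }
        }
    }
}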

To serve images:
Your new controller action:
public ActionResult GetImage(string imageId)
{
    byte[] imgArray = null;
    // Call your db and fill the byte array.
    if (imgArray != null)
    {
        return File(imgArray, "image/png");
    }
    else
    {
        throw new HttpException(404, "Image not found");
    }
}
Add a route:
routes.MapRoute("Images", "images/{imageId}", new { controller = "YourImage", action = "GetImage" });
And from your HTML:
<img src="#Url.Action("GetImage", "YourImageController", new{ #imageId=Model.mChildCard.imageId})" alt="#Model.mChildCard.mCardName" title="#Model.mChildCard.mCardName" class="nullify"/>

Related

DataTable Takes Forever to Load Data from SqlDataReader

I am loading data from an MS SQL Server table using the following code:
using (SqlDataReader rdr = cmd.ExecuteReader())
{
    if (rdr.HasRows)
    {
        dt.Load(rdr); // takes forever to load
    }
    if (dt.Rows.Count > 0 && !dt.HasErrors)
    {
        Parallel.For(0, dt.Rows.Count, i =>
        {
            byte[] docBytes = (byte[])dt.Rows[i]["DocObject"];
            string path = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Documents",
                dt.Rows[i]["FileName"].ToString().ToLower());
            File.WriteAllBytes(path, docBytes);
        });
    }
}
The SQL query executes in less than one second. The data contains a SQL image column that holds binary document data. I used Stopwatch from System.Diagnostics to time the execution and found that this single dt.Load(rdr) statement takes approximately 5 minutes to load about 5,000 records. My application needs to load several million rows, and at this rate the app would be unusable. This is a standard Windows Forms application. Any ideas why dt.Load(rdr) takes forever? Any ideas on rewriting this code or improving its performance would be greatly appreciated.
Try something like this, instead of loading all the rows into memory on the client:
using (SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
{
    while (rdr.Read())
    {
        string fn = rdr.GetString(0);
        using (var rs = rdr.GetStream(1))
        {
            var fileName = $"c:\\temp\\{fn}.txt";
            using (var fs = File.OpenWrite(fileName))
            {
                rs.CopyTo(fs);
            }
            Console.WriteLine(fileName);
        }
    }
}
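If the app can move to async I/O, the same pattern translates directly; a hedged sketch under the same column-order assumptions:
using (SqlDataReader rdr = await cmd.ExecuteReaderAsync(CommandBehavior.SequentialAccess))
{
    while (await rdr.ReadAsync())
    {
        string fn = rdr.GetString(0);
        using (var rs = rdr.GetStream(1))
        using (var fs = File.OpenWrite($"c:\\temp\\{fn}.txt"))
        {
            // Stream the BLOB straight from SQL Server to disk
            // without buffering it all in memory.
            await rs.CopyToAsync(fs);
        }
    }
}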
The code below is untested; it is just an idea.
Another approach is to define an entity class and populate a list directly from the SqlDataReader, not using a DataTable at all.
Also, close the database connection as soon as possible, so do no other work while fetching.
And make sure you are using connection pooling in your connection string.
public class Example
{
    public byte[] DocObject { get; set; }
    public string FileName { get; set; }
}

List<Example> objList = new List<Example>();
using (SqlDataReader rdr = cmd.ExecuteReader())
{
    while (rdr.Read())
    {
        Example obj = new Example();
        obj.DocObject = (byte[])rdr["DocObject"];
        obj.FileName = rdr["FileName"].ToString();
        objList.Add(obj);
    }
}

if (objList.Count > 0)
{
    Parallel.For(0, objList.Count, i =>
    {
        string path = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Documents",
            objList[i].FileName.ToLower());
        File.WriteAllBytes(path, objList[i].DocObject);
    });
}

How to get JSON data from SQL Server 2016 and send it as-is (using IHttpActionResult) to the client via Web API

Since SQL Server 2016 we can select data directly as JSON with this statement:
SELECT Top (10) * from Products FOR JSON AUTO
So we no longer need to map the rows to objects and convert them to JSON in code.
I think we can reduce the complexity of the process and get better performance.
I use Web API 2 and I want to receive the JSON and send it directly to the client.
Is there any new function or method on SqlCommand to do this? Could you help me please?
This sample shows how to read the JSON from SQL Server using a SqlCommand:
var queryWithForJson = "SELECT ... FOR JSON";
var jsonResult = new StringBuilder();
using (var conn = new SqlConnection("<connection string>"))
using (var cmd = new SqlCommand(queryWithForJson, conn))
{
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        if (!reader.HasRows)
        {
            jsonResult.Append("[]");
        }
        else
        {
            // SQL Server splits long FOR JSON output across multiple rows,
            // so the fragments must be concatenated.
            while (reader.Read())
            {
                jsonResult.Append(reader.GetValue(0).ToString());
            }
        }
    }
}
In your ApiController, you can return the string using the ResponseMessage method:
public IHttpActionResult Get()
{
    var jsonResult = new StringBuilder();
    // ... get the JSON from SQL Server as shown above ...
    var response = new HttpResponseMessage(System.Net.HttpStatusCode.OK);
    response.Content = new StringContent(jsonResult.ToString(), System.Text.Encoding.UTF8, "application/json");
    return ResponseMessage(response);
}
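Putting the two pieces together, a hedged end-to-end sketch (the connection string is a placeholder and the query is the one from the question):
public IHttpActionResult Get()
{
    var jsonResult = new StringBuilder();
    using (var conn = new SqlConnection("<connection string>"))
    using (var cmd = new SqlCommand("SELECT Top (10) * from Products FOR JSON AUTO", conn))
    {
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            if (!reader.HasRows)
            {
                jsonResult.Append("[]");
            }
            else
            {
                // Concatenate the row fragments produced by FOR JSON.
                while (reader.Read())
                {
                    jsonResult.Append(reader.GetValue(0).ToString());
                }
            }
        }
    }
    var response = new HttpResponseMessage(System.Net.HttpStatusCode.OK)
    {
        Content = new StringContent(jsonResult.ToString(), System.Text.Encoding.UTF8, "application/json")
    };
    return ResponseMessage(response);
}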
However, though technically feasible, IMHO there are some disadvantages that you should take into account before going this route:
You lose the ability to negotiate the content type returned from your Web API. If you later have to serve a client that requests XML, you cannot do that easily.
Another, maybe minor, disadvantage is that you reduce your ability to scale the JSON conversion. Usually you have one database server but can have several web front ends. You obviously need the database server to get the data, but the conversion itself can run in a place that scales better.
I assume it is more efficient to let SQL Server deliver the data in binary format to the front ends and have them perform the conversion. I doubt it will be much faster to put this load on the database server, though of course this depends on the infrastructure.

In ASP.Net Core, is it possible to start streaming JSON results?

I am using ASP.Net Core WebAPI.
I have a method that retrieves 10000 results from the database at a time, but I notice that it takes 1.17s to "wait" and 0.3s for the actual transfer (based on Chrome's network graph).
The results from the database (Postgres) are iterated through a DataReader, converted into a struct, added to a list, and ultimately returned as a JsonResult.
I do not know exactly what options to expect, but I would like to start returning results as soon as possible to lower the total request time. I am also doing this for the first time on this platform, so I may not be doing things the best way.
[HttpGet("{turbine:int}")]
public IActionResult GetBearingTemperature(int turbine)
{
using (var connection = Database.GetConnection())
{
connection.Open();
int? page = GetPage();
var command = connection.CreateCommand();
if (page.HasValue)
{
command.CommandText = #"select turbine, timestamp, mainbearingtemperature from readings where turbine = :turbine limit 10000 offset :offset;";
command.Parameters.AddWithValue("offset", NpgsqlTypes.NpgsqlDbType.Integer, page.Value * 10000);
} else
{
command.CommandText = #"select turbine, timestamp, mainbearingtemperature from readings where turbine = :turbine limit 10000;";
}
command.Parameters.AddWithValue("turbine", NpgsqlTypes.NpgsqlDbType.Integer, 4, turbine);
var reader = command.ExecuteReader();
var collection = new List<BearingTemperature>();
if (reader.HasRows)
{
var bt = new BearingTemperature();
while (reader.Read())
{
bt.Time = reader.GetDateTime(1);
bt.Turbine = reader.GetInt32(0);
bt.Value = reader.GetDouble(2);
collection.Add(bt);
}
return new JsonResult(collection);
}
else
{
return new EmptyResult();
}
}
}
private int? GetPage()
{
if (Request.Query.ContainsKey("page"))
{
return int.Parse(Request.Query["page"]);
}
else return null;
}
struct BearingTemperature
{
public int Turbine;
public DateTime Time;
public double Value;
}
So I know this question is old, but this is very much possible in ASP.NET Core 2.2 (probably even in earlier versions, ever since IEnumerable<T> was supported as a return type on an action).
While I'm not entirely familiar with Postgres and DataReader, the functionality is there to stream the result to the client. Appending to a list and returning the result in its entirety takes up a lot of memory, depending on the size of the result, and streaming helps us avoid that.
Here is an example of an action that returns an IEnumerable<string> streamed to the client (it is sent in chunks until everything has been delivered, using the Transfer-Encoding: chunked header).
[HttpGet]
public IEnumerable<string> Get()
{
    return GetStringsFor(10000);
}

private static readonly Random random = new Random();

private IEnumerable<string> GetStringsFor(int amount)
{
    const string chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
    while (amount-- > 0)
    {
        yield return new string(Enumerable.Repeat(chars, random.Next(1000)).Select(s => s[random.Next(s.Length)]).ToArray());
    }
}
This will ensure that not everything is loaded into memory, but sent on demand. You would be able to implement something similar in your case: the point where you currently read the data into memory is exactly where the system could start sending the result instead.
private IEnumerable<BearingTemperature> ReadTemperatures(SqlDataReader reader)
{
    if (reader.HasRows)
    {
        var bt = new BearingTemperature();
        while (reader.Read())
        {
            bt.Time = reader.GetDateTime(1);
            bt.Turbine = reader.GetInt32(0);
            bt.Value = reader.GetDouble(2);
            yield return bt;
        }
    }
    yield break;
}
[HttpGet("{turbine:int}")]
public IEnumerable<BearingTemperature> GetBearingTemperature(int turbine)
{
using (var connection = Database.GetConnection())
{
<snip>
var reader = command.ExecuteReader();
return ReadTemperatures(reader);
}
}
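One way to keep deterministic cleanup and lazy streaming at the same time is to let the iterator own the connection, since an iterator's finally blocks run when the consumer finishes (or abandons) enumeration. A hedged sketch, reusing the query and Npgsql parameter style from the question:
private IEnumerable<BearingTemperature> ReadTemperatures(int turbine)
{
    // The using blocks live inside the iterator, so they are disposed only
    // when enumeration completes, not when the action method returns.
    using (var connection = Database.GetConnection())
    {
        connection.Open();
        var command = connection.CreateCommand();
        command.CommandText = @"select turbine, timestamp, mainbearingtemperature from readings where turbine = :turbine limit 10000;";
        command.Parameters.AddWithValue("turbine", NpgsqlTypes.NpgsqlDbType.Integer, 4, turbine);
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                var bt = new BearingTemperature();
                bt.Turbine = reader.GetInt32(0);
                bt.Time = reader.GetDateTime(1);
                bt.Value = reader.GetDouble(2);
                yield return bt;
            }
        }
    }
}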
Considering that your database is going to execute the query and return the entire result set, it's not possible for you to stream a partial result set (though you can search for streaming databases for other offerings). What you could do instead is use a paging technique combined with Ajax to retrieve slices of the total result set and compose them together on the client, to keep responsiveness high and create the illusion of streaming query results.
You'll want to look at the OFFSET and LIMIT clauses.
On your API, you'd include parameters for offset and limit to let the client step through the result set in whatever size chunks it wants; you can experiment to determine what feels responsive enough. Then on your client, you'll need a loop over an Ajax call to your API, probably using jQuery, looping page after page and adding the results to the bound collection on the client (or creating UI elements, or whatever) until the results come back empty.
Alternatively, if showing the whole 10k records at once isn't necessary, you could simply page the results and provide an interface to step through the pages. One that I've used for such a purpose is from Sakura on GitHub: PagedList.

How to add a new MvcSitemapProvider node at runtime

I'm working on a webshop-like ASP.NET MVC 4 website with a WCF service data layer. My application is built with main categories, subcategories and products. Each product can only be in one subcategory, and my URLs look like this:
/maincategoryname/subcategoryname/{productid}/producttitle
And the corresponding breadcrumb trail:
Home > Maincategory > Subcategory > Producttitle
I'm currently using MvcSiteMapProvider to generate my navigation menus and breadcrumbs. I'm loading all the URLs as dynamic nodes without caching. This works for a couple of products, but when I add 1000 products the sitemap takes 6.5 seconds to populate, which is way too long.
I turned on caching in MvcSiteMapProvider, and the application loads much faster. But when a user adds a new product and navigates to its page, the URL is not yet in the sitemap because of the cache, so my navigation and breadcrumbs are not generated.
My question is:
Is it possible to add a new node to the sitemap at runtime after a user adds a new product?
The accepted answer is now a little out of date. In MvcSiteMapProvider v4, there is no longer a GetCacheDescription() method on a DynamicNodeProvider. It didn't seem to work anyway.
You can now invalidate the cache manually by using the [SiteMapCacheRelease] attribute on the action methods that update the data:
[MvcSiteMapProvider.Web.Mvc.Filters.SiteMapCacheRelease]
[HttpPost]
public ActionResult Edit(int id)
{
    // Update the record
    return View();
}
Or by calling a static method:
MvcSiteMapProvider.SiteMaps.ReleaseSiteMap();
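For example, from whatever action creates the product (the repository call is a hypothetical placeholder):
[HttpPost]
public ActionResult Create(ProductModel model)
{
    _productRepository.Save(model); // hypothetical data-access call
    // Drop the cached sitemap so the new product's node is built on the next request.
    MvcSiteMapProvider.SiteMaps.ReleaseSiteMap();
    return RedirectToAction("Index");
}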
You also have the option now to extend the framework to supply your own cache dependencies.
MvcSiteMapProvider allows for Dynamic Sitemaps that solve for Cache Dependancies.
You can enable this by creating a class which implements IDynamicNodeProvider.
Below is an example that generates dynamic nodes from a database query, and also sets up a cache dependency on that same query.
public class ProductNodesProvider : IDynamicNodeProvider
{
    static readonly string AllProductsQuery =
        "SELECT Id, Title, Category FROM dbo.Product;";
    string connectionString =
        ConfigurationManager.ConnectionStrings["db"].ConnectionString;

    /// Create DynamicNodes out of all Products in our database
    public System.Collections.Generic.IEnumerable<DynamicNode> GetDynamicNodeCollection()
    {
        var returnValue = new List<DynamicNode>();
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            SqlCommand command = new SqlCommand(AllProductsQuery, connection);
            connection.Open();
            SqlDataReader reader = command.ExecuteReader();
            try
            {
                while (reader.Read())
                {
                    DynamicNode node = new DynamicNode();
                    node.Title = reader[1].ToString();
                    node.ParentKey = "Category_" + reader[2];
                    node.RouteValues.Add("productid", reader[0]);
                    returnValue.Add(node);
                }
            }
            finally
            {
                reader.Close();
            }
        }
        return returnValue;
    }

    /// Create a CacheDependency on SQL
    public CacheDescription GetCacheDescription()
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            SqlCommand command = new SqlCommand(AllProductsQuery, connection);
            SqlCacheDependency dependency = new SqlCacheDependency(command);
            return new CacheDescription("ProductNodesProvider")
            {
                Dependencies = dependency
            };
        }
    }
}
While this is all very nifty, and should invalidate the cache when your customers change products in the database, the whole SqlCacheDependency business can be tricky and depends on the SQL Server version.
You may go with a custom CacheDependency instead, if you're using the cache to store your products.
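A hedged sketch of what such a custom dependency could look like, assuming your data layer raises a hypothetical ProductRepository.Changed event whenever products are modified:
public class ProductsCacheDependency : System.Web.Caching.CacheDependency
{
    public ProductsCacheDependency()
    {
        // ProductRepository.Changed is a hypothetical event for this sketch.
        ProductRepository.Changed += OnProductsChanged;
    }

    private void OnProductsChanged(object sender, EventArgs e)
    {
        // Tell the cache that anything depending on the product list is stale.
        NotifyDependencyChanged(this, e);
    }

    protected override void DependencyDispose()
    {
        ProductRepository.Changed -= OnProductsChanged;
        base.DependencyDispose();
    }
}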

MVC Stream outputting to HTML div

Small problem here with an MVC app that I'm not sure how to work around.
Basically, I'm adding additional functionality to a system that was originally created by someone else (C#). For a reporting system, the results were only ever displayed on screen. Now I am building in the functionality to let the user download their report as an Excel document.
So basically, I have a view that displays the date ranges and some other search refinement options to the user. I have introduced a radio button that, if selected, will download the report instead of displaying it on screen.
Here are my three actions within the ReportController:
public ActionResult Index()
{
    return View();
}

public ActionResult ProductReport(AdminReportRequest reportRequest, FormCollection formVariables)
{
    AdminEngine re = new AdminEngine();
    if (!reportRequest.Download)
    {
        AdminReport report = re.GetCompleteAdminReport(reportRequest);
        return View(report);
    }
    Stream excelReport = re.GetExcelAdminReport(reportRequest);
    TempData["excelReport"] = excelReport;
    return RedirectToAction("ExcelReport"); // must match the action name below
}

public FileResult ExcelReport()
{
    var excelReport = TempData["excelReport"] as Stream;
    return new FileStreamResult(excelReport, "application/ms-excel")
    {
        FileDownloadName = "Report" + DateTime.Now.ToString("MMMM d, yyyy") + ".xls"
    };
}
I've debugged through the AdminEngine and everything looks fine. However, when the ExcelReport action returns the file, no download happens. What I see is a lot of characters on screen (in the 'panelReport' div - see below): the binary content of the Excel file rendered as text.
I think I have established that the reason it is being displayed on screen is some code that was written in the Index view:
<% using (Ajax.BeginForm("ProductReport", "Report", null,
    new AjaxOptions
    {
        UpdateTargetId = "panelReport",
        InsertionMode = InsertionMode.Replace,
        OnSuccess = "pageLoaded",
        OnBegin = "pageLoading",
        OnFailure = "pageFailed",
        LoadingElementId = ""
    },
    new { id = "SearchForm" })) %>
As you can see, the Ajax.BeginForm statement specifies that the response should update the panelReport div, which is what it's doing (through the ProductReport partial view). While this is perfect for when the reports need to be displayed on screen, it is obviously not going to work for an Excel file.
Is there a way of working around this issue without changing the existing code too much?
Here is the class where I generate the Excel file, in case it helps shed light on the situation:
Report Class:
public Stream GetExcelAdminReport(AdminReportRequest reportRequest)
{
    AdminReport report = new AdminReport();
    string dateRange = null;
    List<ProductSale> productSales = GetSortedListOfProducts(reportRequest, out dateRange);
    report.DateRange = dateRange;
    if (productSales.Count > 0)
    {
        report.HasData = true;
        CustomisedSalesReport custSalesRep = new CustomisedSalesReport();
        Stream salesReport = custSalesRep.GenerateCustomisedSalesFile(productSales);
        return salesReport;
    }
    return null; // no report data; callers must handle this case
}
Workings Class:
public class CustomisedSalesReport
{
    public Stream GenerateCustomisedSalesFile(List<ProductSale> productSales)
    {
        MemoryStream ms = new MemoryStream();
        // NPOI workbook targeting the legacy .xls format
        HSSFWorkbook templateWorkbook = new HSSFWorkbook();
        HSSFSheet sheet = templateWorkbook.CreateSheet("Sales Report");
        // Workings
        templateWorkbook.Write(ms);
        ms.Position = 0;
        return ms;
    }
}
The problem is pretty obvious: you are using an Ajax form to download a file, and the built-in Microsoft Ajax libraries are not intelligent enough to handle that; they just inject the response into the target div.
I can offer two solutions:
The easiest solution (which I have used in the past) is that instead of streaming the file yourself, you create the Excel file, save it on the server, and send the download link to the user. It won't require a lot of change to the code.
You could handle the submit event of the Ajax form and check whether the download option is selected. If it is, make a full (non-Ajax) postback instead; that way the browser will automatically pop up the dialog asking where to save the file.
Hope it makes sense.
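A minimal sketch of the first option: write the workbook to a folder the app can serve and return a link for the Ajax callback to inject into panelReport (the ~/Reports folder and the DownloadLink partial are assumptions for this sketch):
public ActionResult ProductReport(AdminReportRequest reportRequest, FormCollection formVariables)
{
    AdminEngine re = new AdminEngine();
    if (!reportRequest.Download)
    {
        return View(re.GetCompleteAdminReport(reportRequest));
    }
    // Save the report server-side and hand back a link instead of the raw bytes.
    Stream excelReport = re.GetExcelAdminReport(reportRequest);
    string fileName = "Report" + DateTime.Now.ToString("yyyyMMdd-HHmmss") + ".xls";
    string path = Server.MapPath("~/Reports/" + fileName);
    using (var fs = System.IO.File.Create(path))
    {
        excelReport.CopyTo(fs);
    }
    ViewBag.DownloadUrl = Url.Content("~/Reports/" + fileName);
    return PartialView("DownloadLink"); // renders an <a href> into the panelReport div
}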
