I've just started using NonFactors.Grid.Mvc6 version 6.2.4. I've got its basic functionality working and I'm able to retrieve data from my server-side code (.NET Core 5). I want to implement paging, but I'm only able to get it to page through the dataset I've received on the client side. For example, if I've returned 5 rows from the database, the grid only allows me to page through those 5 items. I can't find any documentation (or examples) of using AJAX calls to retrieve the data in pages (specifying the current page and number of rows). This is of no use in the real world, so the grid must be capable of this somehow (hopefully), but there is nothing documented. Has anyone managed to do this? I'd really appreciate some examples; the documentation is pretty poor.
I have the same problem when I want to do the paging from the backend.
Code example
.Pageable(pager =>
{
    pager.ShowPageSizes = true;
    pager.PageSizes = Model.Paging.PageSizes;
    pager.PagesToDisplay = Model.Paging.PagesToDisplay;
    pager.CurrentPage = Model.Paging.CurrentPage;
    pager.RowsPerPage = Model.Paging.RowsPerPage;
    pager.TotalRows = Model.Paging.TotalRows;
})
You can set the TotalRows value, but it will be recalculated for the _Pager.cshtml view based on the Rows.
public virtual IQueryable<T> Process(IQueryable<T> items)
{
    TotalRows = items.Count();
    // ...
}
My workaround: add/modify the following lines in the _Grid.cshtml file:
@if (Model.Pager != null)
{
    Model.Pager.TotalRows = Model.Pager.PagesToDisplay * Model.Pager.RowsPerPage;
    @await Html.PartialAsync(Model.Pager.PartialViewName, Model.Pager)
}
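For what it's worth, here is a rough sketch of the server side of that idea: the controller queries only the requested page and passes the real total through the view model, so the pager values above can be set from it. The view-model types (GridViewModel, PagingModel) and the page/rows parameter names are illustrative, not part of MVC.Grid.

public IActionResult Index(int page = 1, int rows = 20)
{
    // dbContext and Order are placeholders for your own data access.
    IQueryable<Order> query = dbContext.Orders.OrderBy(o => o.Id);

    var model = new GridViewModel
    {
        // Only one page of rows is pulled from the database.
        Rows = query.Skip((page - 1) * rows).Take(rows).ToList(),
        Paging = new PagingModel
        {
            CurrentPage = page,
            RowsPerPage = rows,
            TotalRows = query.Count(), // the real total, not the count of the returned page
            PagesToDisplay = 5
        }
    };

    return View(model);
}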
Related
I have inherited a Windows Forms application. We are having some performance issues when trying to save a new record to the SQL table: it hangs.
For the most part, I have ruled out the database or Table Structure as the problem. I don't believe that is the case here.
I am very new to this, and am having trouble stepping through and finding the root of the problem.
The form basics: (Only included what I thought was relevant code, but can add more if needed)
public partial class frmBadgeCreate : daBaseForm
{
    private boBadger_History oBadger_History;
    Form mainFormHandler;
    private string whome;

    public frmBadgeCreate(String customItem)
    {
        InitializeComponent();

        if (daAppDesktop.IsRunning)
        {
            oBadger_History = new boBadger_History();
            oBadger_History.GetAll(); // This line right here seems to have some importance
            whome = customItem;
        }
    }

    public void saveitall() // Save action that hangs
    {
        // listing of form variables to be saved to table columns
        var vlast = textbox_H_lname.Text;
        var vfirst = textbox_H_fname.Text;
        // . . . and on and on . . .

        var badger_History = new Badger_History() { hlastname = vlast, vfirstname = vfirst /* . . . and on and on . . . */ };

        oBadger_History.Add(badger_History);
        oBadger_History.Save(); // This is where things just hang forever.
    }
}
Because this is a 'Lone Ranger App' that was handed to me, I am struggling to grasp it. What really confuses me is that when I comment out the 'oBadger_History.GetAll()' line, the save works very fast! Instantly. When I add the line back in, it hangs. I only know this, because I have spent days commenting out each line, one by one, and testing results.
The oBadger_History.GetAll(); looks like it is somehow used for the auto complete feature, so it is needed.
What has me totally scratching my head is that I can't see the connection between the 'GetAll' and the save. Why would the getAll impact the save function at all?
Here is the GetAll code, if that sheds any light:
public daBindingList<Badger_History> GetAll()
{
    BadgerContext cn = (BadgerContext)this.context;
    List<Badger_History> ents = cn.Badger_History.ToList();
    this.EntityList = new daBindingList<Badger_History>(ents);
    return this.EntityList;
}
Again, I don't believe that the SQL database/tables are the problem, seeing that I can get the save to work properly by removing that one line of code. I just can't seem to find a way to resolve it.
This line:
List<Badger_History> ents = cn.Badger_History.ToList();
Will load EVERY row in the Badger_History table into memory. How many records are in this table?
If there are lots of rows, then when you call Save() (which I assume is some sort of wrapper around SaveChanges()), EF will look through every tracked row for anything that has changed. In your case there may be 0 changed rows, as all you are interested in is the row you are adding.
To speed things up, you could change the load into a 'no-tracking' query:
List<Badger_History> ents = cn.Badger_History.AsNoTracking().ToList();
This will still load all the records, but they will no longer be tracked, so they are not examined when saving.
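If Save() really is a wrapper around SaveChanges(), another option is to keep the autocomplete data in a no-tracking query and do the insert with a short-lived context, so only the new row is tracked when saving. A rough sketch, assuming BadgerContext can be constructed directly and using the Badger_History entity shown above:

// Load the autocomplete data without change tracking (as suggested above).
List<Badger_History> ents = cn.Badger_History.AsNoTracking().ToList();

// Do the insert with its own short-lived context, so SaveChanges() only has one entity to inspect.
using (var saveContext = new BadgerContext())
{
    var badger_History = new Badger_History { hlastname = vlast, vfirstname = vfirst /* ... */ };
    saveContext.Badger_History.Add(badger_History);
    saveContext.SaveChanges();
}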
Brand new to React today so apologies in advance.
I have googled but for whatever reason can't seem to find the answer which I know must be out there!
I'm trying to build a TEST component just to learn.
The component is basically going to consist of a header and a number of name value pairs set out in div blocks. So I'm starting with the header and trying to make the component generic by passing in a data attribute.
I have a cshtml page with this node (solution is a .NET Core MVC project in VS2019):
<div id="detailsHeaderText" data-headerText="Details"></div>
I have set up a jsx file which looks like this:
class Header extends React.Component {
    render() {
        return (
            <div className="col-md-12 col-sm-12"><h5>{document.getElementById("detailsHeaderText").getAttribute("data-headerText")}</h5></div>
        );
    }
}
ReactDOM.render(<Header />, document.getElementById('detailsHeaderText'));
This works perfectly and returns a header with the word "Details" in it.
I now want to make it generic so I can do this elsewhere on the page:
<div class="detailsHeaderText2" data-id="2" data-headerText="Header2"></div>
<div class="detailsHeaderText3" data-id="3" data-headerText="Header3"></div>
<div class="detailsHeaderText4" data-id="4" data-headerText="Header4"></div>
etc
How can I output the header text based on a data-attribute input?
The idea being that I connect the React render output to the element along the lines of this pseudocode: document.getElementById("detailsHeaderText" + data-id)
I've looked at constructors and super(props) but nothing seems to work as most of the examples are to do with handlers and hence access the event target prop.
I've found many links to passing props between components.
But none for passing in data from the parent element on a cshtml page.
An answer or a pointer to a detailed answer on passing variables into React would be most helpful.
Thanks in advance.
So I'm 12 hours further down the line in terms of learning React and Googling, and I've solved the problem.
Working code is:
function Header(props) {
    return <div className="col-md-12 col-sm-12"><h5>{props.headertext}</h5></div>;
}

let elems = document.getElementsByClassName("headerText");

function renderToElements(toRender, elements, dataset) {
    for (var i = 0; i < elements.length; i++) {
        let passText = elements[i].dataset[dataset];
        let renderEl = React.createElement(toRender, { headertext: passText });
        ReactDOM.render(renderEl, elements[i]);
    }
}

renderToElements(Header, elems, 'headertext');
Which renders all dom nodes of the following construct:
<div class="headerText" data-headertext="Details"></div>
It may seem like a pointless exercise to some in terms of what it achieves, but hopefully it will help others grasp some basics, since I (or they) can now build on this to construct more complex components.
Scenario: I want to write my own autocomplete API for addresses, just like the one Google offers (very basic: street, house number, city, postcode, country). It is intended for private use and training purposes only, and I want to cover about 1 million addresses for a start.
Technology used: .NET Framework (not Core), C#, Visual Studio, OSMSharp, Microsoft SQL Server, Web API 2 (although I will probably switch to ASP.NET Core in the future).
Approach:
Set up the project (Web API 2, or a console project for demo purposes).
Download the relevant file from OpenStreetMap using DownloadClient() (https://download.geofabrik.de/).
Read in the file using OSMSharp and filter out the relevant data.
Convert the filtered data to a DataTable.
Use the DataTable to feed the SqlBulkCopy method and import the data into the database.
Problem: Step 4 is taking way too long. For a file like "Regierungsbezirk Köln" in the osm.pbf format, which is about 160 MB (the uncompressed OSM file is about 2.8 GB), we're talking about 4-5 hours. I want to optimize this. The bulk copy of the DataTable into the database, on the other hand (about 1 million rows), takes just about 5 seconds. (Woah. Amazing.)
Minimal Reproduction: https://github.com/Cr3pit0/OSM2Database-Minimal-Reproduction
What I tried:
Use a stored procedure in SQL Server. This comes with a whole different set of problems, and I didn't quite manage to get it working (mainly because the uncompressed osm.pbf file is over 2 GB and SQL Server doesn't like that).
Come up with a different approach to filter and convert the data from the file to a DataTable (or CSV).
Use the Overpass API, although I read somewhere that the Overpass API is not intended for datasets above 10,000 entries.
Ask the Jedi Grandmasters on Stack Overflow for help. (Currently in progress ... :D)
Code Extract:
public static DataTable getDataTable_fromOSMFile(string FileDownloadPath)
{
    Console.WriteLine("Finished Downloading. Reading File into Stream...");

    using (var fileStream = new FileInfo(FileDownloadPath).OpenRead())
    {
        PBFOsmStreamSource source = new PBFOsmStreamSource(fileStream);

        if (source.Any() == false)
        {
            return new DataTable();
        }

        Console.WriteLine("Finished Reading File into Stream. Filtering and Formatting RawData to Addresses...");
        Console.WriteLine();

        DataTable dataTable = convertAdressList_toDataTable(
            source.Where(x => x.Type == OsmGeoType.Way && x.Tags.Count > 0 && x.Tags.ContainsKey("addr:street"))
                  .Select(Address.fromOSMGeo)
                  .Distinct(new AddressComparer())
        );

        return dataTable;
    }
}
private static DataTable convertAdressList_toDataTable(IEnumerable<Address> addresses)
{
    DataTable dataTable = new DataTable();

    if (addresses.Any() == false)
    {
        return dataTable;
    }

    dataTable.Columns.Add("Id");
    dataTable.Columns.Add("Street");
    dataTable.Columns.Add("Housenumber");
    dataTable.Columns.Add("City");
    dataTable.Columns.Add("Postcode");
    dataTable.Columns.Add("Country");

    Int32 counter = 0;
    Console.WriteLine("Finished Filtering and Formatting. Writing Addresses From Stream to a DataTable Class for the Database-SQLBulkCopy-Process ");

    foreach (Address address in addresses)
    {
        dataTable.Rows.Add(counter + 1, address.Street, address.Housenumber, address.City, address.Postcode, address.Country);
        counter++;

        if (counter % 10000 == 0 && counter != 0)
        {
            Console.WriteLine("Wrote " + counter + " Rows From Stream to DataTable.");
        }
    }

    return dataTable;
}
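Step 5 (the bulk copy itself) is not shown in the extract; for completeness, a minimal sketch of it, assuming a destination table dbo.Addresses whose column order matches the DataTable built above and a placeholder connection string:

private static void bulkCopy_toDatabase(DataTable dataTable, string connectionString)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();

        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.Addresses"; // illustrative table name
            bulkCopy.BulkCopyTimeout = 0;                    // no timeout for large loads
            bulkCopy.WriteToServer(dataTable);               // ~1 million rows copy in a few seconds
        }
    }
}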
Okay, I think I got it. I'm down to about 12 minutes for a file size of about 600 MB and about 3.1 million rows of data after filtering.
The first thing I tried was to replace the logic that populates my DataTable with FastMember. That worked, but didn't give the performance increase I was hoping for (I cancelled the process after 3 hours...). After more research I stumbled upon an old project called "osm2mssql" (https://archive.codeplex.com/?p=osm2mssql). I used a small part of its code, which reads the data directly from the osm.pbf file, and modified it for my use case (which is to extract address data from ways). I did actually use FastMember to write an IEnumerable<Address> to the DataTable, but I don't need OSMSharp and whatever extra dependencies it has anymore. So thank you very much for the suggestion of FastMember; I will certainly keep that library in mind for future projects.
For those who are interested, I updated my GitHub project accordingly (https://github.com/Cr3pit0/OSM2Database-Minimal-Reproduction), although I didn't test it thoroughly, because I moved on from the test project to the real deal, which is a Web API.
I'm quite sure it can be further optimized, but I don't think I care at the moment. 12 minutes for a method which might be called once a month to update the whole database is fine, I guess. Now I can move on to optimizing my queries for the autocomplete.
So thank you very much to whoever wrote "osm2mssql".
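For reference, FastMember's ObjectReader can also feed SqlBulkCopy directly, which skips the intermediate DataTable entirely. A rough sketch, assuming an IEnumerable<Address> named addresses, a placeholder connection string, and an illustrative destination table:

using (var bulkCopy = new SqlBulkCopy(connectionString))
using (var reader = ObjectReader.Create(addresses, "Street", "Housenumber", "City", "Postcode", "Country"))
{
    bulkCopy.DestinationTableName = "dbo.Addresses"; // illustrative table name
    bulkCopy.BatchSize = 10000;
    bulkCopy.WriteToServer(reader); // streams the IEnumerable<Address> straight into SQL Server
}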
I was wondering what is the best way to delete all pages in Sitefinity?
I came up with two solutions but I'm not sure what the best practices are.
I also read the documentation for deleting pages.
The goal is to delete hierarchical pages as well.
Fluent
var fluent = App.WorkWith().Pages();
fluent.LocatedIn(Telerik.Sitefinity.Fluent.Pages.PageLocation.Frontend).Delete().SaveChanges();
Native
var pageManager = PageManager.GetManager();
var pageNodes = pageManager.GetPageNodes().ToList();
foreach (var node in pageNodes)
{
pageManager.DeleteItem(node);
}
pageManager.SaveChanges();
P.S. I was using the Fluent approach, but after a while an error started to pop up. I switched to the PageManager approach, but I get the same error. I deleted all pages from the backend and the recycle bin, but still no resolution.
No row for Telerik.Sitefinity.Pages.Model.PageNode ('sf_page_node')
GenericOID#4f0f7ba8 PageNode id=6bd454ba-6971-4289-822d-36fbd9f5a844
NOTRES
Edit: The pages are deleted despite the error.
You should be very specific when deleting pages, as the native version you are using will pull all pages, including backend pages. I have deleted ALL the pages in a Sitefinity site before.
According to the docs, they use a slightly different approach for deleting pages:
https://docs.sitefinity.com/for-developers-delete-pages#delete-a-page-using-native-api
This deletes by title but can easily be modified to suit you. The docs also recommend unpublishing the page before deletion.
public void DeletePageNativeAPI(string pageTitleToDelete)
{
    PageManager pageManager = PageManager.GetManager();

    PageData page = pageManager.GetPageDataList()
        .Where(pD => pD.NavigationNode.Title == pageTitleToDelete && pD.Status == ContentLifecycleStatus.Live)
        .FirstOrDefault();

    if (page != null)
    {
        pageManager.Delete(page);
        pageManager.SaveChanges();
    }
}
Pages consist of a Template > PageNode > PageData
https://docs.sitefinity.com/for-developers-crud-operations-with-pages#page-components
Depending on which version of Sitefinity you are working with, pages are a little different since the addition of personalization and such.
It does not really matter which way you go (Fluent vs Native) - the end result should be the same. It is just a matter of coding style and readability.
I like Native more because I feel I have more control :)
As for the error: when using the PageManager, have you tried
pageManager.Delete(node) as opposed to pageManager.DeleteItem(node)?
They seem slightly different.
How many pages do you have to delete?
If there are thousands, you can add a counter and call pageManager.SaveChanges() every 100 or so deletions so that the transaction is committed to the DB.
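A minimal sketch of that batching idea, reusing only the PageManager calls already shown in this thread (the batch size of 100 is arbitrary, and you would still want to filter out backend pages as noted above):

var pageManager = PageManager.GetManager();
var pageNodes = pageManager.GetPageNodes().ToList();

int deleted = 0;
foreach (var node in pageNodes)
{
    pageManager.Delete(node);

    // Commit every 100 deletions so one huge transaction does not build up.
    if (++deleted % 100 == 0)
        pageManager.SaveChanges();
}

pageManager.SaveChanges(); // commit the remainder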
I'm using the Kendo grid in Visual Studio 2010 with ASP.NET and C#. I'm new to this platform. I have more than 100 records in the grid and I want to select all of them into an array. I am using the following code, but it only selects the records on the first page (PageSize: 5).
var entityGrid = $("#grdReport").data("kendoGrid");
var d = entityGrid.dataSource.data();
var a = [];

for (var i = 0; i < d.length; i++) {
    var currentDataItem = d[i];
    a.push(currentDataItem);
}

appnt = a;
appnt ends up with only 5 records, so please help me with this issue. Thanks in advance. :-) Be happy.
If you are using the MVC wrappers, you should set ServerOperation of the dataSource to false.
If you are using the regular JavaScript declaration, you should set serverPaging of the dataSource to false.
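For example, with the MVC wrappers it might look roughly like this (the grid name matches the question; the model type and read action are placeholders):

@(Html.Kendo().Grid<ReportViewModel>()   // ReportViewModel is illustrative
    .Name("grdReport")
    .Pageable()
    .DataSource(dataSource => dataSource
        .Ajax()
        .PageSize(5)
        .ServerOperation(false)          // all rows stay in the browser, so dataSource.data() returns everything
        .Read(read => read.Action("GetReports", "Reports")) // placeholder action/controller
    )
)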
How do you load them? Are they actually loaded in the browser, or are you using server paging?
If you actually have the data loaded, what you do is correct, BUT if the data is still on the server, you should check the total using:
var entityGrid = $("#grdReport").data("kendoGrid");
console.log("Total length: ", entityGrid.dataSource.total());
BUT you cannot get that data, since it is not actually in the browser; you will only get it when moving to a different page.
So the question is: how are you defining the DataSource?
Check it here: http://jsfiddle.net/td8Ww/