Import a remote XML file into an MVC 4 web app - C#

I have a webapp that needs to import 3 different remote XML files every night. These files have student and parent information, study information, etc.
Can someone point me to information on how to read the XML (over HTTP) and then loop over every student so I can add information from the other XML files and store the relations in my database?
I need some example code along these lines:
Open and read the Parents XML and store it in the database with the fixed IDs from the XML file
Open and read the Students XML, store it in the database AND link each student to their respective parents using the parent IDs from the parents XML
Also store study information for every student
I need to know what my strategy should be. What would the smartest and most efficient way be to accomplish this using Entity Framework?

I don't see how you would use the Entity Framework directly with your XML files. EF is designed to work with relational databases.
I think you will have to import the data into a relational database in some way.
If you are using SQL Server, you can achieve such an import (assuming you have already downloaded the XML file, e.g. via a scheduled task on the web server) using a DataSet and SqlBulkCopy:
// Create DataSet and load the data from the downloaded XML file
DataSet ParentData = new DataSet();
ParentData.ReadXml(Server.MapPath("ParentFile.xml"));

// Get the DataTable produced by ReadXml
DataTable ParentTable = ParentData.Tables["Parent"];

// Create the SqlBulkCopy object and copy the table to the database
using (SqlConnection connection = new SqlConnection("YourConnectionString"))
{
    connection.Open();
    SqlBulkCopy bulkCopy = new SqlBulkCopy(connection);
    bulkCopy.DestinationTableName = "YourParentTable";
    bulkCopy.WriteToServer(ParentTable);
}

First, a bit of advice...
There is a caveat when you rely on a web app alone: it won't fetch the updated files automatically, at least not out of the box. A web app is unloaded by IIS when it hasn't been used for a while, and it is almost always a "reactive" application.
But... it is possible to schedule a task in which you open a webpage at a certain time each night, and start the import that way.
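For example, a minimal console program like the sketch below (the URL and route are placeholders, not from the question) could be scheduled with Windows Task Scheduler to request an import action in the web app each night:
// Nightly trigger: request the web app's import action so the import runs
// inside the site. Schedule this executable with Windows Task Scheduler.
// "http://yoursite/import/run" is a placeholder URL - use your own route.
using System;
using System.Net;

class ImportTrigger
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            string result = client.DownloadString("http://yoursite/import/run");
            Console.WriteLine(result);
        }
    }
}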
You could use System.Net.HttpWebRequest to fetch the XML files to a local temp folder like this:
HttpWebRequest req = WebRequest.Create("http://url.to/file.xml") as HttpWebRequest;
// check if the cast went well
if (req != null)
{
    try
    {
        using (HttpWebResponse resp = req.GetResponse() as HttpWebResponse)
        using (System.IO.FileStream outFileStream =
               System.IO.File.Create(@"Path\To\localfile.xml"))
        {
            resp.GetResponseStream().CopyTo(outFileStream);
        }
    }
    catch (ExceptionTypeA ex1)
    {
        // Catch all specific exceptions... omitted here for brevity
    }
}
You can then use XDocument / XElement (LINQ to XML) to extract data from the XML files and use the standard ADO.NET API to create and process the DB import.
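As a rough sketch (the Student element, its Id / ParentId attributes, and the table and column names are assumptions, since the actual XML layout isn't shown), the extraction and insert could look something like this:
// Parse the downloaded students file and insert each student,
// linking it to its parent via the parent id carried in the XML.
// Element, attribute and SQL names are placeholders - adjust to your schema.
XDocument studentsDoc = XDocument.Load(@"Path\To\Students.xml");
using (var connection = new SqlConnection("YourConnectionString"))
{
    connection.Open();
    foreach (XElement student in studentsDoc.Descendants("Student"))
    {
        using (var cmd = new SqlCommand(
            "INSERT INTO Students (Id, Name, ParentId) VALUES (@id, @name, @parentId)",
            connection))
        {
            cmd.Parameters.AddWithValue("@id", (int)student.Attribute("Id"));
            cmd.Parameters.AddWithValue("@name", (string)student.Element("Name"));
            cmd.Parameters.AddWithValue("@parentId", (int)student.Attribute("ParentId"));
            cmd.ExecuteNonQuery();
        }
    }
}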
Only after the import into a (relational) database can you fully use Entity Framework for data retrieval / modification. You might even be able to convert all the XML data into structured collections of POCO types and query those collections with LINQ, but I honestly don't think that's the best way to do it.

Related

Deserialize SQL Server image field back in Excel format

I have a SQL Server table that contains serialized Excel files, with 3 fields:
IdDocument -> int (PK)
DataFile -> image
FileName -> nvarchar
where DataFile contains the serialized Excel file and FileName the name of the file (with its path).
Something like this:
0xD0CF11E0A1B11AE100.....
U:\SAP_R3V4_Validation_Documents\March2012.xls
Now I need to get these files back in Excel format.
How can I accomplish this?
Using a C# console application or SQL Server features would be fine.
Thank you in advance.
Luis
Excel files are binary. The xls format is obsolete, replaced since 2007 (15 years ago) by xlsx, a ZIP package containing XML files. What the question shows is how binary data looks in SSMS, not some kind of serialized format.
BTW the image type is deprecated, replaced by varbinary(max) in SQL Server 2005 or 2008 (can't remember which).
In any case, reading binary data is the same as reading any other data. A DbDataReader is used to retrieve the query results and strongly typed methods are used to read specific fields per row. In this particular case GetStream() can be used to retrieve the data as a Stream that can be saved to disk:
// 'sql' is assumed to select IdDocument, DataFile and FileName (in that order),
// and 'root' is the folder the files are written to.
using (var con = new SqlConnection(connectionString))
{
    con.Open();
    using (var cmd = new SqlCommand(sql, con))
    {
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                var path = reader.GetString(2);
                var finalPath = Path.Combine(root, Path.GetFileName(path));
                using (var stream = reader.GetStream(1))
                {
                    using (var fileStream = File.Create(finalPath))
                    {
                        stream.CopyTo(fileStream);
                    }
                }
            }
        }
    }
}
The only thing that's different is the code that reads the field as a stream and saves it to disk:
using (var stream = reader.GetStream(1))
{
    using (var fileStream = File.Create(finalPath))
    {
        stream.CopyTo(fileStream);
    }
}
The using statements ensure the data and file streams are closed even in case of error. The path itself is constructed by combining a root folder with the stored file name, not the full stored path.

I want to back up the data in an XML file, but I can't find how to save the newly received data without adding the same data to the existing file

I want to back up the data for today's date in an XML file every 10 minutes. I managed to create the XML file, but I couldn't find how to save the newly received data without adding the same data to the existing file.
Can I convert the file I created to a DataSet with DataSet.ReadXml, add the new data I got from the query, convert it back to an XML file and save it? What method should I use?
String QueryString = "SELECT * FROM dbo.db_records WHERE DAY(datetime) = DAY(CURRENT_TIMESTAMP)";

public void run()
{
    while (true)
    {
        try
        {
            Thread.Sleep(600000);
            if (odbcConnection.State != ConnectionState.Open)
            {
                odbcConnection.Close();
                odbcConnection.Open();
            }
            DataSet dataSet = new DataSet("XMLDB");
            odbcDataAdapter.Fill(dataSet, "#ID");
            if (File.Exists(Path))
            {
            }
            else
            {
                using (FileStream fs = File.Create(Path))
                {
                    dataSet.WriteXml(fs);
                }
            }
        }
        catch (Exception) { }
    }
}
XML is not a great format if you want to append data, since it uses tags that need to be closed. So you have a few options:
Save separate files
Since you seem to fetch data for the current day, just attach the date to your file name. When reading the data back you may need to read all files in the folder matching the pattern and merge them.
Use a format that is trivial to append
If your data model is simple tabular data, you could use a .csv file instead. You can add data to it using one of the File.Append* methods (File.AppendAllText, File.AppendAllLines or File.AppendText).
Overwrite all data
Get the complete data you want to save each time, and overwrite any existing data. This is simple, but may be slow if you have lots of data. But if the database is small and grow slowly this might be perfectly fine.
Parse the existing data
You could read the existing file with ReadXml as you suggest, and use DataSet.Merge to merge it with your new set before overwriting the existing file. This may also be slow, since it needs to process all the data, but it may put less load on the database than fetching all the data from the database each time.
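A minimal sketch of that last option, assuming the same DataSet shape and the Path variable from the question's code:
// Read what has already been written, merge in the freshly queried data,
// then overwrite the file with the combined result.
DataSet existing = new DataSet("XMLDB");
if (File.Exists(Path))
{
    existing.ReadXml(Path);
}

DataSet fresh = new DataSet("XMLDB");
odbcDataAdapter.Fill(fresh, "#ID");

// Merge combines the two sets; duplicate rows are only eliminated when the
// tables have primary keys defined, so mark the key column in both sets.
existing.Merge(fresh);

using (FileStream fs = File.Create(Path))
{
    existing.WriteXml(fs, XmlWriteMode.WriteSchema);
}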
In any case, you might want to periodically save full backups, or have some other way to handle corrupt files. You should also have some way to test the backups. I would also consider using the backup options built into most database engines if that is an alternative.

Dynamically ALTER a SQL Table based on a DataTable

I am getting data from a REST feed at regular intervals and want to copy it into a SQL table. Easy enough using SqlBulkCopy.
The issue I am struggling with is that the fields I get from the feed could change, and I want to be able to add any new columns to the table dynamically. Any columns that no longer appear in the feed need to stay in the table; I can simply add those back into the DataTable.
My question: what options do I have to do this? Are there any free 3rd-party .NET frameworks that will do it, or how can I write this manually?
Must all be done in .NET.
Thanks
Since you stated that you are getting data from a REST feed, you can create a simple ASP.NET Web Pages application and add a page in which you call the REST feed.
It depends on what format the feed offers; mostly JSON is used.
You can parse the JSON into a model class that corresponds to the data structure of that feed, and then simply insert the data into the underlying database using SqlConnection and SqlCommand.
Here is some sample code which you can alter:
string siteContent = string.Empty;
string url = "http://www.RESTFEEDURL.com";
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (Stream responseStream = response.GetResponseStream())
using (StreamReader streamReader = new StreamReader(responseStream))
{
    siteContent = streamReader.ReadToEnd();
}
// NOW PARSE THE DATA AND SEND IT TO THE DATABASE HERE
// the data is in siteContent
// now you want to decode the data and get all the column names
// (using JSON.NET's JObject here; requires Newtonsoft.Json.Linq and System.Linq)
// suppose the data has the following format:
// {"ID":"1","name":"google","IP":"69.5.33.22","active":"true"}
var keyArray = JObject.Parse(siteContent).Properties()
                      .Select(p => p.Name)
                      .ToList();
Console.WriteLine(string.Join(",", keyArray));
// ID,name,IP,active
Then, for each key, add the column to the table if it doesn't exist yet:
foreach (var item in keyArray)
{
    var sql = $"IF COL_LENGTH('TABLENAME', '{item}') IS NULL ALTER TABLE [TABLENAME] ADD [{item}] int DEFAULT 0 NOT NULL";
    // run this sql query using SqlCommand and see the magic
}
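A small sketch of that last step (the connection string and the TABLENAME placeholder are carried over from the snippet above):
// Apply the ALTER TABLE statement for every key found in the feed.
// Column names come from external data, so validate them before building DDL;
// SQL parameters cannot be used for identifiers.
using (var connection = new SqlConnection("YourConnectionString"))
{
    connection.Open();
    foreach (var item in keyArray)
    {
        var sql = $"IF COL_LENGTH('TABLENAME', '{item}') IS NULL " +
                  $"ALTER TABLE [TABLENAME] ADD [{item}] int DEFAULT 0 NOT NULL";
        using (var command = new SqlCommand(sql, connection))
        {
            command.ExecuteNonQuery();
        }
    }
}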

Saving, loading and manipulating (listview) data C#

I'm working on a UWP application where a user can input data which is placed in a ListView. All fine and dandy, but how can I save the user data to a separate file and load it the next time the user boots up the app?
I've tried to find a solution, but I had great difficulty understanding the code snippets I found and how to apply them (since I'm fairly new to C# and app development). Would somebody explain how I can achieve the saving/loading of the data and what the code does?
Thanks in advance! :)
You can get the app's local folder and create a file like this:
StorageFolder local = Windows.Storage.ApplicationData.Current.LocalFolder;
StorageFile ageFile = await local.CreateFileAsync("Age.txt", CreationCollisionOption.FailIfExists);
You can then read from the file like this:
var ageStream = await local.OpenStreamForReadAsync("Age.txt");
// Read the data.
using (StreamReader streamReader = new StreamReader(ageStream))
{
    // Use like a normal StreamReader
}
If you are trying to write, use OpenStreamForWriteAsync.
If I understood correctly, you have some kind of object structure that serves as a model for your ListView. When the application starts, you want to read a file where the data is stored, and when the application closes (or on some other event) you want to write the file with the changes. Right?
1) When your application is loaded / closed (or upon modifications or some event of your choice), use the Windows.Storage API to read / write the text to the file.
2) If the data you want to write is just a list of strings, you can save it as-is in the file. If it is more complicated, I would recommend serializing it in JSON format. Use JSON.NET to serialize (object -> string) and deserialize (string -> object) the content of your file and object structure.
Product product = new Product();
product.Name = "Apple";
...
string json = JsonConvert.SerializeObject(product);
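To tie the two points together, here is a rough sketch (the file name and the products list are placeholder assumptions) of saving and reloading the serialized JSON with the Windows.Storage API:
// 'products' is assumed to be the List<Product> backing the ListView.
StorageFolder folder = ApplicationData.Current.LocalFolder;

// Save: serialize the list and write it to a file in local storage.
StorageFile file = await folder.CreateFileAsync("products.json",
    CreationCollisionOption.ReplaceExisting);
await FileIO.WriteTextAsync(file, JsonConvert.SerializeObject(products));

// Load (e.g. at startup): read the file back and deserialize it.
StorageFile saved = await folder.GetFileAsync("products.json");
string savedJson = await FileIO.ReadTextAsync(saved);
List<Product> loadedProducts = JsonConvert.DeserializeObject<List<Product>>(savedJson);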

Silverlight Loading Reference Data On Demand from a 'dumb' server

I have a text file with a list of 300,000 words and the frequency with which they occur. Each line is in the format Word:FrequencyOfOccurrence.
I want this information to be accessible from within the C# code. I can't hard-code the list since it is too long, and I'm not sure how to go about accessing it from a file on the server. Ideally I'd like the information to be downloaded only if it's used (to save bandwidth), but this is not a high priority as the file is not too big and internet speeds are always increasing.
It doesn't need to be usable for binding.
The information does not need to be editable once the project has been built.
Here is another alternative. Zip the file up and stick it in the ClientBin folder next to the application XAP. Then, at the point in the app where the content is needed, do something like this:
public void GetWordFrequencyResource(Action<string> callback)
{
    WebClient client = new WebClient();
    client.OpenReadCompleted += (s, args) =>
    {
        try
        {
            var zipRes = new StreamResourceInfo(args.Result, null);
            var txtRes = Application.GetResourceStream(zipRes, new Uri("WordFrequency.txt", UriKind.Relative));
            string result = new StreamReader(txtRes.Stream).ReadToEnd();
            callback(result);
        }
        catch
        {
            callback(null); // Fetch failed.
        }
    };
    client.OpenReadAsync(new Uri("WordFrequency.zip", UriKind.Relative));
}
Usage:
var wordFrequency = new Dictionary<string, int>();
GetWordFrequencyResource(s =>
{
    // Code here to burst the string into the dictionary.
});
// Note: the code down here runs asynchronously with respect to the building of
// the dictionary, so don't attempt to use the dictionary at this point.
The above code allows you to store the file in an efficient zip format but not in the XAP itself, hence you can download it on demand. It makes use of the fact that a XAP is a zip file, so Application.GetResourceStream, which is designed to pull resources from XAP files, can be used on a plain zip file as well.
BTW, I'm not actually suggesting you use a dictionary; I'm just using a dictionary as a simple example. In reality I would imagine the file is in sorted order. If that is the case, you could use a KeyValuePair<string, int> for each entry, create a custom collection type that holds them in an array or List, and then use a binary search to index into it.
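A rough illustration of that idea (the collection type and lookup method below are not from the original answer; it assumes the entries are already sorted by word):
// Holds the sorted word/frequency pairs in a List and looks a word up
// with a manual binary search on the key.
public class WordFrequencyList
{
    private readonly List<KeyValuePair<string, int>> entries;

    public WordFrequencyList(List<KeyValuePair<string, int>> sortedEntries)
    {
        entries = sortedEntries; // must already be sorted by Key
    }

    public int? GetFrequency(string word)
    {
        int lo = 0, hi = entries.Count - 1;
        while (lo <= hi)
        {
            int mid = (lo + hi) / 2;
            int cmp = string.CompareOrdinal(entries[mid].Key, word);
            if (cmp == 0) return entries[mid].Value;
            if (cmp < 0) lo = mid + 1;
            else hi = mid - 1;
        }
        return null; // word not found
    }
}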
Based on your comments, you could download the word list file if you are required to have a very thin server layer. The XAP file containing your Silverlight application is nothing more than a ZIP file with all the referenced files for your Silverlight client layer. Try adding the word list as content that gets compiled into the XAP and see how big the file gets; text usually compresses really well. In general, though, you'll want to be friendly with your users in how much memory your application consumes. Loading a huge text file into memory, in addition to everything else you need in your app, may ultimately make your app a resource hog.
A better practice, in general, would be to call a web service. The service would perform whatever lookup logic you need. Here's a blog post from a quick search that should get you started (this was written for SL2, but should apply the same for SL3):
Calling web services with Silverlight 2
Even better would be to store your list in a SQL Server. It will be much easier and quicker to query.
You could create a WCF service on the server side that will send the data to the Silverlight application. Once you retrieve the information you could cache it in-memory inside the client. Here's an example of calling a WCF service method from Silverlight.
Another possibility is to embed the text file into the Silverlight assembly that is deployed to the client:
using (var stream = Assembly.GetExecutingAssembly()
.GetManifestResourceStream("namespace.data.txt"))
using (var reader = new StreamReader(stream))
{
string data = reader.ReadToEnd();
// Do something with the data
}
