I have an app which communicates with a MySQL database via a webservice. The webservice serves the app with XML. Now I want to replace the MySQL database with a SQLite database. To avoid changing all the logic, I only need to get the same XML format back from my SQLite database. To raise the level of difficulty, I have to read data from more than one table. Details of the app: for each table I have the structure stored in a class, and this is the code I currently use, which is not working:
XmlSerializer xs = new XmlSerializer(typeof(myTableClass));
var sRe = new myTableClass();
using (XmlWriter writer = XmlWriter.Create(sww))
{
    xs.Serialize(writer, sRe);
    var buf = sww.ToString();
    return buf;
}
I expect a string of my XML in the variable buf, but when I run this code it just jumps out on the first line. What is wrong in my code?
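As a hedged guess at what goes wrong: the snippet never declares sww (presumably a StringWriter), and the XmlSerializer constructor throws at run time when it cannot map the type, which would make execution appear to "jump out" on the very first line. A minimal sketch of the same pattern that compiles, assuming myTableClass is the question's row class:
// Sketch: myTableClass is taken from the question
XmlSerializer xs = new XmlSerializer(typeof(myTableClass));
var sRe = new myTableClass();
using (var sww = new StringWriter())
{
    using (XmlWriter writer = XmlWriter.Create(sww))
    {
        xs.Serialize(writer, sRe);
    }
    // reading the buffer after the XmlWriter is closed guarantees it is flushed
    return sww.ToString();
}
Wrapping the XmlSerializer construction in a try/catch (or inspecting the inner exception) should reveal the actual error.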
I have a SQL Server table that contains serialized Excel files, with 3 fields:
IdDocument -> int (PK)
DataFile -> image
FileName -> nvarchar
where DataFile contains the serialized Excel file, and FileName the name of the file (with path).
Something like this:
0xD0CF11E0A1B11AE100.....
U:\SAP_R3V4_Validation_Documents\March2012.xls
Now I need to get these files back in Excel format.
How can I accomplish this?
Using a C# console application or SQL Server features would be fine.
Thank you in advance.
Luis
Excel files are binary. The xls format is obsolete; it was replaced in 2007 (15 years ago) by xlsx, a ZIP package containing XML files. What the question shows is simply how binary data is displayed in SSMS, not some kind of serialized format.
BTW, the image type is deprecated; it was replaced by varbinary(max) in SQL Server 2005 or 2008 (can't remember which).
In any case, reading binary data is the same as reading any other data. A DbDataReader is used to retrieve the query results, and strongly typed methods are used to read specific fields per row. In this particular case, GetStream() can be used to retrieve the data as a Stream that can be saved to disk:
using (var con = new SqlConnection(connectionString))
{
    con.Open();
    using (var cmd = new SqlCommand(sql, con))
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // field 2 is FileName; keep only the file name, not the stored path
            var path = reader.GetString(2);
            var finalPath = Path.Combine(root, Path.GetFileName(path));
            // field 1 is DataFile; stream the binary data straight to disk
            using (var stream = reader.GetStream(1))
            using (var fileStream = File.Create(finalPath))
            {
                stream.CopyTo(fileStream);
            }
        }
    }
}
The only thing that's different is the code that reads the field as a stream and saves it to disk:
using (var stream = reader.GetStream(1))
using (var fileStream = File.Create(finalPath))
{
    stream.CopyTo(fileStream);
}
The using clauses ensure the data and file streams are closed even in case of error. The path itself is constructed by combining a root folder with the stored file name, not the full stored path.
I want to save the complete result of a FOR XML SQL Query to a file.
My SQL Query looks something like this:
SELECT * FROM Customer FOR XML RAW
In my code, I now want to execute this query against a SQL Server, read the complete XML result, and save it to disk.
My code looks like this:
using (XmlReader xmlResultReader = command.ExecuteXmlReader()) //command is my SqlCommand
using (MemoryStream resultFile = new MemoryStream())
using (StreamWriter writer = new StreamWriter(resultFile, Encoding.UTF8))
{
    while (xmlResultReader.Read())
    {
        writer.WriteLine(xmlResultReader.ReadOuterXml());
    }
    //write stream to file
}
But when I run this, the complete result of the query does not get saved to the MemoryStream. The result is truncated in the middle of a <row /> element, so the returned XML is not even valid.
I also tried writing the result with an XmlWriter using this code:
xmlWriter.WriteNode(xmlResultReader, false);
but this showed the same result.
So now my question is: How can I get the complete XML result of the query from the XmlReader returned by ExecuteXmlReader()?
I think a DataSet is better suited for such a requirement. Saving files to disk takes time, and you cannot keep the connection open that long with a SqlDataReader or XmlReader.
Load the result set into a DataSet.
Loop through the DataSet.
Validate whether each XML document is genuine.
Save the files to disk one by one.
XmlDocument xdoc = new XmlDocument();
xdoc.LoadXml(yourXMLString);
xdoc.Save(@"C:\myfilename.xml");
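That said, the truncation described in the question is most likely a flushing issue: the StreamWriter buffers its output, and the MemoryStream is copied to a file before the writer has flushed the last chunk. On top of that, combining Read() with ReadOuterXml() skips elements, because ReadOuterXml() already advances the reader to the next node. A sketch that avoids both problems by letting the XmlWriter drive the reader and writing straight to disk (the output path is assumed):
using (XmlReader xmlResultReader = command.ExecuteXmlReader())
using (XmlWriter xmlWriter = XmlWriter.Create(@"C:\result.xml",
    new XmlWriterSettings { ConformanceLevel = ConformanceLevel.Fragment }))
{
    // FOR XML RAW without a ROOT clause returns a fragment of <row /> elements
    xmlResultReader.MoveToContent();
    while (!xmlResultReader.EOF)
    {
        // WriteNode copies the current element and advances the reader itself
        xmlWriter.WriteNode(xmlResultReader, false);
    }
} // disposing the XmlWriter flushes everything to disk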
I am getting data from a REST feed at regular intervals and want to copy it into a SQL table. Easy enough using SqlBulkCopy.
The issue I am struggling with is that the fields I get from the feed could change, and I want to be able to add any new columns dynamically to the table. Any columns that no longer appear in the feed need to stay in the table; I can simply add those to the DataTable.
My question: what options do I have to do this? Are there any free third-party .NET frameworks that will do it, or how can I write this manually?
It must all be done in .NET.
Thanks
As you stated that you are getting data from a REST feed, you can create a simple ASP.NET Web Pages application.
Add a page in which you call the REST feed.
Check what format the feed offers; mostly JSON is used.
You can parse the JSON into a model class that corresponds to the data structure of that feed, then simply insert the data into the underlying database using SqlConnection and SqlCommand.
Here is sample code which you can alter:
string siteContent = string.Empty;
string url = "http://www.RESTFEEDURL.com";
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (Stream responseStream = response.GetResponseStream())
using (StreamReader streamReader = new StreamReader(responseStream))
{
    siteContent = streamReader.ReadToEnd();
}
// NOW PARSE THE DATA AND SEND TO DATABASE HERE
// data is in siteContent
// now you want to decode the data and get all the column names
// suppose the data has the following format (one record per line):
// {"ID":"1","name":"google","IP":"69.5.33.22","active":"true"}
// {"ID":"2","name":"bing","IP":"70.5.232.33","active":"false"}
// one way to get the keys in C# (assuming the Newtonsoft.Json package):
var record = JObject.Parse(siteContent.Split('\n')[0]);
var keyArray = record.Properties().Select(p => p.Name).ToList();
// keyArray now holds ["ID", "name", "IP", "active"]
Now loop over the keys and add any column that is missing:
foreach (var item in keyArray)
{
    // COL_LENGTH returns NULL when the column does not exist yet
    var sql = "IF COL_LENGTH('TABLENAME', '" + item + "') IS NULL " +
              "ALTER TABLE [TABLENAME] ADD [" + item + "] int DEFAULT 0 NOT NULL";
    // run this sql query using SqlCommand and see the magic
}
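A sketch of executing that statement inside the loop, assuming a connection string is at hand:
using (var con = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(sql, con))
{
    con.Open();
    cmd.ExecuteNonQuery(); // adds the column only when it was missing
}
Since the column name is concatenated into the SQL text, only do this with keys from a feed you trust.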
I have created a web form in ASP.NET; unfortunately I do not have the pleasure of being able to use SQL due to some (internal) restrictions. I had the idea of exporting the data to a CSV file every time the user clicks submit.
The purpose of this is so that we have a list of computer names, installed software, serial numbers, etc. to keep track of them.
I have never done anything like this, as I am used to SQL (beginner level). Is this even possible, and how would I do it (in code form)? I've googled it and all I'm getting is GridView and other approaches that I don't want to use, and even those seem more complicated than what I want to do. Any help would be appreciated.
I'm using a C# back end to code this.
To clarify, this will be a single CSV file on our network, and this web form will only be used internally. So every time someone clicks the submit button it will add a new "row" to the same CSV file. So in the end we have one CSV file with a list of computer names and other information.
Typically you have a model class that represents the data. Here's one:
public class Computer
{
    public string SerialNumber { get; set; }
    public string Name { get; set; }
    public List<string> InstalledSoftware { get; set; }
}
Now that you have a class that can represent the data, it's just a matter of saving or serializing it. You don't have access to a SQL Server database. That's fine; there are other options. You can store it in a structured file format. CSV is not good for this, as you might have multiple pieces of InstalledSoftware per computer, and it's hard to handle that properly in CSV. But other text-based formats such as XML and JSON are perfect for this. You can also use "NoSQL" databases such as MongoDB or RavenDB. You may also be able to use SQLite, which is very lightweight.
Let's start off with some sample data.
List<Computer> computers = new List<Computer>();
computers.Add(new Computer() {
    SerialNumber = "ABC123",
    Name = "BOB-LAPTOP",
    InstalledSoftware = new List<string>() {
        "Notepad", "Visual Studio", "Word"
    }
});
computers.Add(new Computer() {
    SerialNumber = "XYZ456",
    Name = "JASON-WORKSTATION",
    InstalledSoftware = new List<string>() {
        "Notepad++", "Visual Studio Code", "Word"
    }
});
computers.Add(new Computer() {
    SerialNumber = "LMN789",
    Name = "NANCY-SURFACE3",
    InstalledSoftware = new List<string>() {
        "Outlook", "PowerPoint", "Excel"
    }
});
Then it's just a matter of saving the data. Let's try with XML:
var xmlSerializer = new XmlSerializer(typeof(List<Computer>));
using (var stringWriter = new StringWriter())
{
    using (var xmlWriter = XmlWriter.Create(stringWriter))
    {
        xmlSerializer.Serialize(xmlWriter, computers);
    }
    // read the buffer after the XmlWriter is closed so it is fully flushed
    var xml = stringWriter.ToString();
    File.WriteAllText(Server.MapPath("~/App_Data/computers.xml"), xml);
}
Or with JSON:
var serializer = new JavaScriptSerializer();
var json = serializer.Serialize(computers);
File.WriteAllText(Server.MapPath("~/App_Data/computers.json"));
Using MongoDB:
var client = new MongoClient(connectionString);
var server = client.GetServer();
var database = server.GetDatabase("ComputerDB");
var computerCollection = database.GetCollection<Computer>("Computers");
foreach (var computer in computers)
{
    computerCollection.Insert(computer);
}
Note, I have not tested this code; there are likely bugs. My goal is to give you a starting point, not necessarily 100% working code. I haven't serialized XML in a while, and I typically use JSON.NET instead of the built-in JavaScriptSerializer.
Also note that if there's any possibility that two users might access the data at the same time, you'll need to take care with the XML and JSON approaches to avoid two people trying to write to the file at once. That's where MongoDB would be better, but you'll have to install the server software somewhere.
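If you stay with a single XML or JSON file, one simple safeguard, as a sketch (the helper class here is made up for illustration), is to funnel all writes through a static lock so two requests in the same process cannot write the file at once:
public static class ComputerStore
{
    private static readonly object FileLock = new object();

    public static void Save(string path, string contents)
    {
        // Serializes writers within this web app's process; access from
        // multiple processes would still need file locking or a real database.
        lock (FileLock)
        {
            File.WriteAllText(path, contents);
        }
    }
}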
I have a webapp that needs to import 3 different remote XML files every night. These files contain student and parent information, study information, etc.
Can someone point me to information on how to read the XML (via HTTP) and then loop over every student, so I can add information from the other XML files and store the relations in my database?
Need some example code like:
Open & read the Parents XML and store it in the database with the fixed IDs from the XML file
Open & read the Students XML and store it in the database AND link students to their respective parents using the parent IDs from the parents XML
Also store study information for every student
I need to know what my strategy would be. What would the smartest and most efficient method be to accomplish this using Entity Framework?
I don't see how you would use the Entity Framework directly with your XML files. EF is designed to work with relational databases.
I think you will have to import the data into a relational database in some way.
If using SQL Server, you can achieve such an import (given you have already downloaded the XML file, e.g. with the help of a scheduled task on the web server) using a DataSet and SqlBulkCopy:
// Create DataSet and load data
DataSet ParentData = new DataSet();
ParentData.ReadXml(Server.MapPath("ParentFile.xml"));
// Create SqlBulkCopy object, get the DataTable, and copy it to the database
using (SqlConnection connection = new SqlConnection("YourConnectionString"))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "YourParentTable";
    connection.Open();
    DataTable ParentTable = ParentData.Tables["Parent"];
    bulkCopy.WriteToServer(ParentTable);
}
First, a bit of advice.
There is a caveat when you are using only a webapp, because it won't automatically fetch the updated files, at least not out of the box. A webapp is unloaded by IIS when it hasn't been used for a long time, and it is almost always a "reactive" application.
But it is possible to schedule a task that opens a webpage at a certain time each night and starts the import that way.
You could use System.Net.HttpWebRequest to fetch the XML files to a local temp folder like this:
HttpWebRequest req = WebRequest.Create("http://url.to/file.xml") as HttpWebRequest;
// check if the cast went well
if (req != null) {
    try {
        using (HttpWebResponse resp = req.GetResponse() as HttpWebResponse)
        using (System.IO.FileStream outFileStream =
                   System.IO.File.Create(@"Path\To\localfile.xml"))
        {
            resp.GetResponseStream().CopyTo(outFileStream);
        }
    }
    catch (WebException ex1) {
        // Catch all specific exceptions... omitted here for brevity
    }
}
You can then use XDocument / XElement (LINQ to XML) to extract data from the XML files and use the standard ADO.NET API to create and process the DB import.
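For instance, a short sketch with XDocument; the element and attribute names here are pure assumptions, since the layout of the feeds isn't shown:
// Hypothetical structure: <students><student id="1" parentId="7"><name>...</name></student></students>
XDocument studentsDoc = XDocument.Load(@"Path\To\students.xml");
var students =
    from s in studentsDoc.Descendants("student")
    select new
    {
        Id = (int)s.Attribute("id"),
        ParentId = (int)s.Attribute("parentId"),
        Name = (string)s.Element("name")
    };
foreach (var student in students)
{
    // insert the student via ADO.NET here, linking to the parent row via ParentId
}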
Only after the import into a (relational) database can you fully use Entity Framework for data retrieval and modification. You might even be able to convert all the XML data into structured collections of POCO types and query those collections with LINQ, but I honestly don't think that's the best way to do it.