Reading ReadSubtree()'s entire string before calling XElement.Load() - c#

I'm using the following code to read xml subtrees from a stream (where reader is an XmlReader):
while (reader.Read())
{
    using (XmlReader subTreeReader = reader.ReadSubtree())
    {
        try
        {
            XElement xmlData = XElement.Load(subTreeReader);
            ProcessStreamingEvent(xmlData);
        }
        catch (XmlException ex)
        {
            // want to write xml subtree
        }
    }
}
It works fine most of the time, but every once in a while XElement.Load(subTreeReader) throws an exception like
Name cannot begin with the '[' character, hexadecimal value 0x5B. Line
1, position 775.
The message is mostly useless; I need to output the entire string that XElement.Load() tried to parse. How can I do this? In the exception handler there is very little information; subTreeReader.NodeType is Text, and Value is "spread". Perhaps I need to read the content of subTreeReader before calling XElement.Load()? It seems like subTreeReader should contain the entire string, since it knows where the tree ends...?
I tried to replace XElement xmlData = XElement.Load(subTreeReader) with:
string xmlString = subTreeReader.ReadOuterXml();
XElement xmlData = XElement.Load(xmlString);
but xmlString is empty (even when subTreeReader does contain valid xml).
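Two details may explain the empty string here. The reader returned by ReadSubtree() starts in ReadState.Initial, so it yields nothing until Read() has been called on it once; and XElement.Load(string) treats its argument as a URI, while XElement.Parse(string) is the overload that parses XML text. A minimal sketch of that variant, assuming the surrounding while (reader.Read()) loop from the question:
using (XmlReader subTreeReader = reader.ReadSubtree())
{
    subTreeReader.Read();                            // move off ReadState.Initial
    string xmlString = subTreeReader.ReadOuterXml(); // now returns the subtree text
    XElement xmlData = XElement.Parse(xmlString);    // Parse, not Load, for raw strings
    ProcessStreamingEvent(xmlData);
}
Note that ReadOuterXml() throws on the same malformed input, so this alone still won't let you log the bad text.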
Interesting (and a good thing) that it is not XmlReader.Read() or XmlReader.ReadSubtree() that throws the exception (when the stream may not contain XML), but rather XElement.Load(). It makes me wonder, though: how does ReadSubtree() know when to stop reading when the stream does not contain XML?
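Since every XML-level read (ReadOuterXml() included) throws on the same malformed input, the only reliable way to get at the raw text is to capture it below the reader. A sketch, assuming the XmlReader was created over a Stream you control and that the input is UTF-8:
using System;
using System.IO;
using System.Text;

// Copies every byte the XmlReader consumes into a side buffer, so the raw
// input is still available after an XmlException.
class CapturingStream : Stream
{
    private readonly Stream _inner;
    private readonly MemoryStream _copy = new MemoryStream();

    public CapturingStream(Stream inner) { _inner = inner; }

    // Raw text read so far (assumes UTF-8 input).
    public string CapturedText => Encoding.UTF8.GetString(_copy.ToArray());

    public override int Read(byte[] buffer, int offset, int count)
    {
        int n = _inner.Read(buffer, offset, count);
        if (n > 0) _copy.Write(buffer, offset, n); // keep a copy of everything read
        return n;
    }

    public override bool CanRead => true;
    public override bool CanSeek => false;
    public override bool CanWrite => false;
    public override long Length => throw new NotSupportedException();
    public override long Position
    {
        get => throw new NotSupportedException();
        set => throw new NotSupportedException();
    }
    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
    public override void Write(byte[] buffer, int offset, int count) => throw new NotSupportedException();
}
On an XmlException, capture.CapturedText holds everything consumed so far, including the text that failed to parse:
var capture = new CapturingStream(underlyingStream); // underlyingStream: your source stream
using (XmlReader reader = XmlReader.Create(capture))
{
    // ... run the subtree loop from the question, and log
    // capture.CapturedText in the catch (XmlException) block ...
}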

Related

How to remove unused > character in xml string in c#

I have to remove some special characters and an extra ">" from an XML string.
Loading the XML throws a "Data at the root level is invalid" error.
public T ConvertXmlFromByte<T>(byte[] data) where T : class
{
    T model = null;
    try
    {
        if (data != null)
        {
            XmlDocument xmlDoc = null;
            XmlSerializer serializer = null;
            string xml = "";
            xmlDoc = new XmlDocument();
            xml = Encoding.UTF8.GetString(data);
            //xml = Regex.Replace(xml, @"[^&;:()a-zA-Z0-9\=./><_-~-]", string.Empty);
            xmlDoc.LoadXml(xml);
        }
    }
    catch (Exception ex)
    {
        _customLogger.Error(ex.Message, ex);
    }
    return model;
}
Below is my XML string:
<?xml version="1.0" encoding="utf-8" standalone="no"?><Test xmlns="https://cdn.Test.go.cr/xml-schemas/v4.3/test" xmlns:ds="http://www.w3.org/2000/09/xmldsig#" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
</Test>>
You are trying to load the file as XML and remove what appears to be an extra character.
The problem is that this extra character means the text is not valid XML. You can't load it using an XML parser, which is why you get this exception.
So you must treat it as a string, find the offending characters, fix the string, and save it again.
A simple Regex can do this. You don't really state the conditions, so I assume that anytime a double '>>' appears it is incorrect; amend as appropriate.
string contents = File.ReadAllText(@"c:\path\file.xml");
string output = Regex.Replace(contents, ">>", ">");
File.WriteAllText(@"c:\path\output.xml", output);
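As a sanity check (my addition, not part of the original answer), re-load the cleaned file so you know the repair actually produced well-formed XML:
var doc = new System.Xml.XmlDocument();
doc.Load(@"c:\path\output.xml"); // throws XmlException if the file is still malformed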

{"'\u0004', hexadecimal value 0x04, is an invalid character

I am trying to convert a file to XML format, but it contains some special characters and the conversion fails because of them.
I already have this regex code, but it still isn't working for me; please help.
The code what I have tried:
string filedata = @"D:\readwrite\test11.txt";
string input = ReadForFile(filedata);
string re1 = @"[^\u0000-\u007F]+";
string re5 = @"\p{Cs}";
string data = Regex.Replace(input, re1, "");
data = Regex.Replace(input, re5, "");
XmlDocument xmlDocument = new XmlDocument();
try
{
    xmlDocument = (XmlDocument)JsonConvert.DeserializeXmlNode(data);
    var Xdoc = XDocument.Parse(xmlDocument.OuterXml);
}
catch (Exception ex)
{
    Console.WriteLine(ex);
}
0x04 is a transmission control character and cannot appear in a text string. XmlDocument is right to reject it if it really does appear in your data. This does suggest that the regexes you have don't do what you think they do: [^\u0000-\u007F]+ strips characters outside the ASCII range, but 0x04 is inside that range, so it survives, and \p{Cs} only matches surrogates. Note too that the second Replace runs against input rather than data, so the first replacement is discarded anyway. The real question for me is why this non-text 'character' appears in data intended as XML in the first place.
I have other questions. I've never seen JsonConvert.DeserializeXmlNode before - I had to look up what it does. Why are you using a JSON function against a document which presumably contains no JSON? Why are you then taking that document, converting it back to a string, and then creating an XDocument from it? Why not just create an XDocument to start with?
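If the goal is to strip the characters XML 1.0 actually forbids, a character class targeting the C0 controls is closer to the mark than the non-ASCII filter above. A sketch (the helper name is mine):
using System.Text.RegularExpressions;

static string StripInvalidXmlChars(string input)
{
    // Removes 0x00-0x08, 0x0B, 0x0C and 0x0E-0x1F; tab (0x09), LF (0x0A)
    // and CR (0x0D) are legal in XML 1.0 and are kept.
    return Regex.Replace(input, @"[\u0000-\u0008\u000B\u000C\u000E-\u001F]", string.Empty);
}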

XML Deserialization with special characters in C# XMlSerializer

I have an XML sheet which contains a special character ("&" is the one causing issues), and I use the code below to deserialize the XML:
XMLDATAMODEL imported_data;
// Create an instance of the XmlSerializer specifying type and namespace.
XmlSerializer serializer = new XmlSerializer(typeof(XMLDATAMODEL));
// A FileStream is needed to read the XML document.
FileStream fs = new FileStream(path, FileMode.Open);
XmlReader reader = XmlReader.Create(fs);
// Use the Deserialize method to restore the object's state.
imported_data = (XMLDATAMODEL)serializer.Deserialize(reader);
fs.Close();
and the structure of my XML model is like this:
[XmlRoot(ElementName = "XMLDATAMODEL")]
public class XMLDATAMODEL
{
    [XmlElement(ElementName = "EventName")]
    public string EventName { get; set; }

    [XmlElement(ElementName = "Location")]
    public string Location { get; set; }
}
I tried this code as well, with the encoding specified, but had no success:
// Declare an object variable of the type to be deserialized.
StreamReader streamReader = new StreamReader(path, System.Text.Encoding.UTF8, true);
XmlSerializer serializer = new XmlSerializer(typeof(XMLDATAMODEL));
imported_data = (XMLDATAMODEL)serializer.Deserialize(streamReader);
streamReader.Close();
Both approaches failed, but if I put the special character inside CDATA it appears to work.
How can I make it work for XML data without CDATA as well?
Here is my XML file content:
http://pastebin.com/Cy7icrgS
And the error I am getting is: There is an error in XML document (2, 17).
The best answer I could get after looking around is: unless you serialize the data yourself, it will be pretty troublesome to deserialize XML with special characters.
In your case, since the special character is &, you should convert it to &amp; before you can deserialize it. Unless & is converted to &amp;, we cannot really deserialize it with XmlSerializer. Yes, we can still read it by using
XmlReaderSettings settings = new XmlReaderSettings();
settings.CheckCharacters = false; // do not check for invalid characters
FileStream fs = new FileStream(xmlfolder + "\\xmltest.xml", FileMode.Open);
XmlReader reader = XmlReader.Create(fs, settings);
But we cannot deserialize it.
As for how to convert & to &amp;, there are various ways, each with pluses and minuses. But the bottom line in every conversion is: do not use the stream directly. Take the data from the file as a string (for example with File.ReadAllText), do the string processing, then wrap the result in a MemoryStream and start the deserialization.
And now for the string processing before deserialization, there are a couple of ways to do it.
The easiest, and most of the time the most unsafe, would be string.Replace("&", "&amp;").
The other way, harder but safer, is to use Regex. Since your case is something inside CData, this could be a good way too.
Another way, harder yet safer, is to create your own line-by-line parsing.
I have yet to find a common, safe way to do this conversion.
But as for your example, string.Replace would work.
Edit:
As for what is considered a special character in XML and how to process them beforehand: according to this, non-Roman characters are included.
Apart from the non-Roman characters, there are 5 special characters listed here:
< -> &lt;
> -> &gt;
" -> &quot;
' -> &apos;
& -> &amp;
And from here, we get one more:
% -> &#37;
Hope they can help you!
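To make the "harder but safer" Regex route concrete, here is a sketch (the negative lookahead and the helper name are mine, not from the answer above): it escapes only bare ampersands, i.e. ones that do not already begin an entity, then deserializes from a MemoryStream as described.
using System.IO;
using System.Text;
using System.Text.RegularExpressions;
using System.Xml.Serialization;

static XMLDATAMODEL DeserializeEscapingBareAmpersands(string path)
{
    string raw = File.ReadAllText(path);

    // Escape & unless it already starts an entity (&name; &#123; or &#x1F;).
    string fixedXml = Regex.Replace(
        raw, @"&(?![A-Za-z]+;|#[0-9]+;|#x[0-9A-Fa-f]+;)", "&amp;");

    var serializer = new XmlSerializer(typeof(XMLDATAMODEL));
    using (var ms = new MemoryStream(Encoding.UTF8.GetBytes(fixedXml)))
    {
        return (XMLDATAMODEL)serializer.Deserialize(ms);
    }
}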

Testing whether or not something is parseable XML in C# [duplicate]

This question already has answers here:
Check well-formed XML without a try/catch?
(11 answers)
Closed 9 years ago.
Does anyone know of a quick way to check if a string is parseable as XML in C#? Preferably something quick and low-resource, which returns a boolean indicating whether or not it will parse.
I'm working on a database app which deals with errors that are sometimes stored as XML, and sometimes not. Hence, I'd like to just be able to test the string I grab from the database (contained in a DataTable) very quickly...and not have to resort to any try / catch {} statements or other kludges...unless those are the only way to make it happen.
It sounds like you sometimes get back XML and sometimes you get back "plain" (non-XML) text.
If that's the case you could just check that the text starts with <:
if (!string.IsNullOrEmpty(str) && str.TrimStart().StartsWith("<"))
{
    var doc = XDocument.Parse(str);
}
Since "plain" messages seem unlikely to start with < this may be reasonable. The only thing you need to decide is what to do in the edge case that you have non-XML text that starts with a <?
If it were me I would default to trying to parse it and catching the exception:
if (!string.IsNullOrEmpty(str) && str.TrimStart().StartsWith("<"))
{
    try
    {
        var doc = XDocument.Parse(str);
        return //???
    }
    catch (Exception ex)
    {
        return str;
    }
}
else
{
    return str;
}
That way the only time you have the overhead of a thrown exception is when you have a message that starts with < but is not valid XML.
You could try to parse the string into an XDocument. If it fails to parse, then you know that it is not valid.
string xml = "";
XDocument document = XDocument.Parse(xml);
And if you don't want to have the ugly try/catch visible, you can throw it into an extension method on the string class...
public static bool IsValidXml(this string xml)
{
    try
    {
        XDocument.Parse(xml);
        return true;
    }
    catch
    {
        return false;
    }
}
Then your code simply looks like if (mystring.IsValidXml()) { ... }.
The only way you can really find out if something will actually parse is to...try and parse it.
An XML document should (but may not) have an XML declaration at the head of the file, following the BOM (if present). It should look something like this:
<?xml version="1.0" encoding="UTF-8" ?>
Though the encoding attribute is, I believe, optional (defaulting to UTF-8). It might also have a standalone attribute whose value is yes or no. If that is present, that's a pretty good indicator that the document is supposed to be valid XML.
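If you want to exploit that as a pre-check, it is only a hint, since the declaration is optional, but it is cheap (the helper is mine):
static bool HasXmlDeclaration(string s)
{
    // Skip a BOM and leading whitespace, then look for the declaration.
    return s != null && s.TrimStart('\uFEFF', ' ', '\t', '\r', '\n').StartsWith("<?xml");
}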
Riffing on @GaryWalker's excellent answer, something like this is about as good as it gets, I think (though the settings might need some tweaking, a custom no-op resolver perhaps). Just for kicks, I generated a 300 MB random XML file using XMark xmlgen (http://www.xml-benchmark.org/); validating it with the code below takes 1.7–1.8 seconds elapsed time on my desktop machine.
public static bool IsMinimallyValidXml(Stream stream)
{
    XmlReaderSettings settings = new XmlReaderSettings
    {
        CheckCharacters = true,
        ConformanceLevel = ConformanceLevel.Document,
        DtdProcessing = DtdProcessing.Ignore,
        IgnoreComments = true,
        IgnoreProcessingInstructions = true,
        IgnoreWhitespace = true,
        ValidationFlags = XmlSchemaValidationFlags.None,
        ValidationType = ValidationType.None,
    };

    bool isValid;
    using (XmlReader xmlReader = XmlReader.Create(stream, settings))
    {
        try
        {
            while (xmlReader.Read())
            {
                ; // This space intentionally left blank
            }
            isValid = true;
        }
        catch (XmlException)
        {
            isValid = false;
        }
    }
    return isValid;
}
static void Main(string[] args)
{
    string text = "<foo>This &SomeEntity; is about as simple as it gets.</foo>";
    Stream stream = new MemoryStream(Encoding.UTF8.GetBytes(text));
    bool isValid = IsMinimallyValidXml(stream);
    return;
}
The best answer I've seen for testing well-formed XML is What is the fastest way to programmatically check the well-formedness of XML files in C#?, which covers using an XmlReader to do this efficiently.

Using StringWriter for XML Serialization

I'm currently searching for an easy way to serialize objects (in C# 3).
I googled some examples and came up with something like:
MemoryStream memoryStream = new MemoryStream();
XmlSerializer xs = new XmlSerializer(typeof(MyObject));
XmlTextWriter xmlTextWriter = new XmlTextWriter(memoryStream, Encoding.UTF8);
xs.Serialize(xmlTextWriter, myObject);
string result = Encoding.UTF8.GetString(memoryStream.ToArray());
After reading this question I asked myself: why not use StringWriter? It seems much easier.
XmlSerializer ser = new XmlSerializer(typeof(MyObject));
StringWriter writer = new StringWriter();
ser.Serialize(writer, myObject);
serializedValue = writer.ToString();
Another problem was that the first example generated XML I could not just write into an XML column of a SQL Server 2005 database.
The first question is: Is there a reason why I shouldn't use StringWriter to serialize an Object when I need it as a string afterwards? I never found a result using StringWriter when googling.
The second is, of course: If you should not do it with StringWriter (for whatever reasons), which would be a good and correct way?
Addition:
As it was already mentioned by both answers, I'll further go into the XML to DB problem.
When writing to the Database I got the following exception:
System.Data.SqlClient.SqlException:
XML parsing: line 1, character 38,
unable to switch the encoding
For string
<?xml version="1.0" encoding="utf-8"?><test/>
I took the string created by the XmlTextWriter and just put it there as the XML. This one did not work (nor did manual insertion into the DB).
Afterwards I tried manual insertion (just writing INSERT INTO ...) with encoding="utf-16", which also failed.
Removing the encoding entirely worked. After that result I switched back to the StringWriter code and voila - it worked.
Problem: I don't really understand why.
@Christian Hayter: With those tests I'm not sure that I have to use utf-16 to write to the DB. Wouldn't setting the encoding to UTF-16 (in the xml tag) work then?
One problem with StringWriter is that by default it doesn't let you set the encoding which it advertises - so you can end up with an XML document advertising its encoding as UTF-16, which means you need to encode it as UTF-16 if you write it to a file. I have a small class to help with that though:
public sealed class StringWriterWithEncoding : StringWriter
{
    public override Encoding Encoding { get; }

    public StringWriterWithEncoding(Encoding encoding)
    {
        Encoding = encoding;
    }
}
Or if you only need UTF-8 (which is all I often need):
public sealed class Utf8StringWriter : StringWriter
{
    public override Encoding Encoding => Encoding.UTF8;
}
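For example (a usage sketch, with MyObject standing in for your serializable type), the declaration then advertises utf-8:
var stringWriter = new Utf8StringWriter();
using (var xmlWriter = XmlWriter.Create(stringWriter))
{
    new XmlSerializer(typeof(MyObject)).Serialize(xmlWriter, myObject);
}
string xml = stringWriter.ToString(); // begins <?xml version="1.0" encoding="utf-8"?>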
As for why you couldn't save your XML to the database - you'll have to give us more details about what happened when you tried, if you want us to be able to diagnose/fix it.
When serialising an XML document to a .NET string, the encoding must be set to UTF-16. Strings are stored as UTF-16 internally, so this is the only encoding that makes sense. If you want to store data in a different encoding, you use a byte array instead.
SQL Server works on a similar principle; any string passed into an xml column must be encoded as UTF-16. SQL Server will reject any string where the XML declaration does not specify UTF-16. If the XML declaration is not present, then the XML standard requires that it default to UTF-8, so SQL Server will reject that as well.
Bearing this in mind, here are some utility methods for doing the conversion.
public static string Serialize<T>(T value)
{
    if (value == null)
    {
        return null;
    }
    XmlSerializer serializer = new XmlSerializer(typeof(T));
    XmlWriterSettings settings = new XmlWriterSettings()
    {
        Encoding = new UnicodeEncoding(false, false), // no BOM in a .NET string
        Indent = false,
        OmitXmlDeclaration = false
    };
    using (StringWriter textWriter = new StringWriter())
    {
        using (XmlWriter xmlWriter = XmlWriter.Create(textWriter, settings))
        {
            serializer.Serialize(xmlWriter, value);
        }
        return textWriter.ToString();
    }
}

public static T Deserialize<T>(string xml)
{
    if (string.IsNullOrEmpty(xml))
    {
        return default(T);
    }
    XmlSerializer serializer = new XmlSerializer(typeof(T));
    XmlReaderSettings settings = new XmlReaderSettings();
    // No settings need modifying here
    using (StringReader textReader = new StringReader(xml))
    {
        using (XmlReader xmlReader = XmlReader.Create(textReader, settings))
        {
            return (T)serializer.Deserialize(xmlReader);
        }
    }
}
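Round-trip usage would look like this (MyObject standing in for any serializable type):
string xml = Serialize(new MyObject());             // declaration advertises utf-16, no BOM
MyObject roundTripped = Deserialize<MyObject>(xml);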
First of all, beware of finding old examples. You've found one that uses XmlTextWriter, which is deprecated as of .NET 2.0. XmlWriter.Create should be used instead.
Here's an example of serializing an object into an XML column:
public void SerializeToXmlColumn(object obj)
{
    using (var outputStream = new MemoryStream())
    {
        using (var writer = XmlWriter.Create(outputStream))
        {
            var serializer = new XmlSerializer(obj.GetType());
            serializer.Serialize(writer, obj);
        }

        outputStream.Position = 0;
        using (var conn = new SqlConnection(Settings.Default.ConnectionString))
        {
            conn.Open();
            const string INSERT_COMMAND = @"INSERT INTO XmlStore (Data) VALUES (@Data)";
            using (var cmd = new SqlCommand(INSERT_COMMAND, conn))
            {
                using (var reader = XmlReader.Create(outputStream))
                {
                    var xml = new SqlXml(reader);
                    cmd.Parameters.Clear();
                    cmd.Parameters.AddWithValue("@Data", xml);
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}
<TL;DR> The problem is rather simple, actually: you are not matching the declared encoding (in the XML declaration) with the datatype of the input parameter. If you manually added <?xml version="1.0" encoding="utf-8"?><test/> to the string, then declaring the SqlParameter to be of type SqlDbType.Xml or SqlDbType.NVarChar would give you the "unable to switch the encoding" error. Then, when inserting manually via T-SQL, since you switched the declared encoding to be utf-16, you were clearly inserting a VARCHAR string (not prefixed with an upper-case "N", hence an 8-bit encoding, such as UTF-8) and not an NVARCHAR string (prefixed with an upper-case "N", hence the 16-bit UTF-16 LE encoding).
The fix should have been as simple as:
In the first case, when adding the declaration stating encoding="utf-8": simply don't add the XML declaration.
In the second case, when adding the declaration stating encoding="utf-16": either
simply don't add the XML declaration, OR
simply add an "N" to the input parameter type: SqlDbType.NVarChar instead of SqlDbType.VarChar :-) (or possibly even switch to using SqlDbType.Xml)
(Detailed response is below)
All of the answers here are over-complicated and unnecessary (regardless of the 121 and 184 up-votes for Christian's and Jon's answers, respectively). They might provide working code, but none of them actually answer the question. The issue is that nobody truly understood the question, which ultimately is about how the XML datatype in SQL Server works. Nothing against those two clearly intelligent people, but this question has little to nothing to do with serializing to XML. Saving XML data into SQL Server is much easier than what is being implied here.
It doesn't really matter how the XML is produced as long as you follow the rules of how to create XML data in SQL Server. I have a more thorough explanation (including working example code to illustrate the points outlined below) in an answer on this question: How to solve “unable to switch the encoding” error when inserting XML into SQL Server, but the basics are:
The XML declaration is optional
The XML datatype stores strings always as UCS-2 / UTF-16 LE
If your XML is UCS-2 / UTF-16 LE, then you:
pass in the data as either NVARCHAR(MAX) or XML / SqlDbType.NVarChar (maxsize = -1) or SqlDbType.Xml, or if using a string literal then it must be prefixed with an upper-case "N".
if specifying the XML declaration, it must be either "UCS-2" or "UTF-16" (no real difference here)
If your XML is 8-bit encoded (e.g. "UTF-8" / "iso-8859-1" / "Windows-1252"), then you:
need to specify the XML declaration IF the encoding is different than the code page specified by the default Collation of the database
you must pass in the data as VARCHAR(MAX) / SqlDbType.VarChar (maxsize = -1), or if using a string literal then it must not be prefixed with an upper-case "N".
Whatever 8-bit encoding is used, the "encoding" noted in the XML declaration must match the actual encoding of the bytes.
The 8-bit encoding will be converted into UTF-16 LE by the XML datatype
With the points outlined above in mind, and given that strings in .NET are always UTF-16 LE / UCS-2 LE (there is no difference between those in terms of encoding), we can answer your questions:
Is there a reason why I shouldn't use StringWriter to serialize an Object when I need it as a string afterwards?
No, your StringWriter code appears to be just fine (at least I see no issues in my limited testing using the 2nd code block from the question).
Wouldn't setting the encoding to UTF-16 (in the xml tag) work then?
It isn't necessary to provide the XML declaration. When it is missing, the encoding is assumed to be UTF-16 LE if you pass the string into SQL Server as NVARCHAR (i.e. SqlDbType.NVarChar) or XML (i.e. SqlDbType.Xml). The encoding is assumed to be the default 8-bit Code Page if passing in as VARCHAR (i.e. SqlDbType.VarChar). If you have any non-standard-ASCII characters (i.e. values 128 and above) and are passing in as VARCHAR, then you will likely see "?" for BMP characters and "??" for Supplementary Characters as SQL Server will convert the UTF-16 string from .NET into an 8-bit string of the current Database's Code Page before converting it back into UTF-16 / UCS-2. But you shouldn't get any errors.
On the other hand, if you do specify the XML declaration, then you must pass into SQL Server using the matching 8-bit or 16-bit datatype. So if you have a declaration stating that the encoding is either UCS-2 or UTF-16, then you must pass in as SqlDbType.NVarChar or SqlDbType.Xml. Or, if you have a declaration stating that the encoding is one of the 8-bit options (i.e. UTF-8, Windows-1252, iso-8859-1, etc), then you must pass in as SqlDbType.VarChar. Failure to match the declared encoding with the proper 8 or 16 -bit SQL Server datatype will result in the "unable to switch the encoding" error that you were getting.
For example, using your StringWriter-based serialization code, I simply printed the resulting string of the XML and used it in SSMS. As you can see below, the XML declaration is included (because StringWriter does not have an option to OmitXmlDeclaration like XmlWriter does), which poses no problem so long as you pass the string in as the correct SQL Server datatype:
-- Upper-case "N" prefix == NVARCHAR, hence no error:
DECLARE @Xml XML = N'<?xml version="1.0" encoding="utf-16"?>
<string>Test ሴ😸</string>';
SELECT @Xml;
-- <string>Test ሴ😸</string>
As you can see, it even handles characters beyond standard ASCII, given that ሴ is BMP Code Point U+1234, and 😸 is Supplementary Character Code Point U+1F638. However, the following:
-- No upper-case "N" prefix on the string literal, hence VARCHAR:
DECLARE @Xml XML = '<?xml version="1.0" encoding="utf-16"?>
<string>Test ሴ😸</string>';
results in the following error:
Msg 9402, Level 16, State 1, Line XXXXX
XML parsing: line 1, character 39, unable to switch the encoding
Ergo, all of that explanation aside, the full solution to your original question is:
You were clearly passing the string in as SqlDbType.VarChar. Switch to SqlDbType.NVarChar and it will work without needing to go through the extra step of removing the XML declaration. This is preferred over keeping SqlDbType.VarChar and removing the XML declaration because this solution will prevent data loss when the XML includes non-standard-ASCII characters. For example:
-- No upper-case "N" prefix on the string literal == VARCHAR, and no XML declaration:
DECLARE @Xml2 XML = '<string>Test ሴ😸</string>';
SELECT @Xml2;
-- <string>Test ???</string>
As you can see, there is no error this time, but now there is data-loss 🙀.
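In ADO.NET terms the fix looks like this (a sketch; the XmlStore table, the conn connection, and the xmlString variable are assumptions):
using (var cmd = new SqlCommand("INSERT INTO XmlStore (Data) VALUES (@Data)", conn))
{
    // NVarChar with size -1 means NVARCHAR(MAX); SqlDbType.Xml also works.
    cmd.Parameters.Add("@Data", SqlDbType.NVarChar, -1).Value = xmlString;
    cmd.ExecuteNonQuery();
}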
public static T DeserializeFromXml<T>(string xml)
{
    T result;
    XmlSerializerFactory serializerFactory = new XmlSerializerFactory();
    XmlSerializer serializer = serializerFactory.CreateSerializer(typeof(T));
    using (StringReader sr3 = new StringReader(xml))
    {
        XmlReaderSettings settings = new XmlReaderSettings()
        {
            CheckCharacters = false // default value is true
        };
        using (XmlReader xr3 = XmlReader.Create(sr3, settings))
        {
            result = (T)serializer.Deserialize(xr3);
        }
    }
    return result;
}
For anyone in need of an F# version of the approved answer:
type private Utf8StringWriter() =
    inherit StringWriter()
    override _.Encoding = System.Text.Encoding.UTF8
It may have been covered elsewhere, but simply changing the encoding line of the XML source to 'utf-16' allows the XML to be inserted into a SQL Server 'xml' data type.
using (DataSetTableAdapters.SQSTableAdapter tbl_SQS = new DataSetTableAdapters.SQSTableAdapter())
{
    try
    {
        string bodyXML = @"<?xml version=""1.0"" encoding=""UTF-8"" standalone=""yes""?><test></test>";
        string bodyXMLutf16 = bodyXML.Replace("UTF-8", "UTF-16");
        tbl_SQS.Insert(messageID, receiptHandle, md5OfBody, bodyXMLutf16, sourceType);
    }
    catch (System.Data.SqlClient.SqlException ex)
    {
        Console.WriteLine(ex.Message);
        Console.ReadLine();
    }
}
The result is that all of the XML text is inserted into the 'xml' data type field, but the 'header' line is removed. What you see in the resulting record is just
<test></test>
Using the serialization method described in the "Answered" entry is a way of including the original header in the target field, but the result is that the remaining XML text is enclosed in an XML <string></string> tag.
The table adapter in the code is a class automatically built using the Visual Studio 2013 "Add New Data Source" wizard. The five parameters to the Insert method map to fields in a SQL Server table.
