What's the most efficient way to marshal C++ structs to C#?
I am about to begin reading tons of binary files, each with 1000 or more records. New files are added constantly, so I'm writing a Windows service to monitor the directories and process new files as they are received. The files were created with a C++ program. I've recreated the struct definitions in C# and can read the data fine, but I'm concerned that the way I'm doing it will eventually kill my application.
// requires System.IO and System.Runtime.InteropServices
using (BinaryReader br = new BinaryReader(File.Open("myfile.bin", FileMode.Open)))
{
    long pos = 0L;
    long length = br.BaseStream.Length;
    CPP_STRUCT_DEF record;
    byte[] buffer = new byte[Marshal.SizeOf(typeof(CPP_STRUCT_DEF))];
    GCHandle pin;
    while (pos < length)
    {
        buffer = br.ReadBytes(buffer.Length);
        // pin the buffer so the GC cannot move it while we hold a raw pointer to it
        pin = GCHandle.Alloc(buffer, GCHandleType.Pinned);
        record = (CPP_STRUCT_DEF)Marshal.PtrToStructure(pin.AddrOfPinnedObject(), typeof(CPP_STRUCT_DEF));
        pin.Free();
        pos += buffer.Length;
        /* Do stuff with my record */
    }
}
I don't think I need to use GCHandle because I'm not actually communicating with the C++ app; everything is being done from managed code. But I don't know of an alternative method.
Using Marshal.PtrToStructure is rather slow. I found the following CodeProject article, which compares (and benchmarks) different ways of reading binary data, very helpful:
Fast Binary File Reading with C#
For your particular application, only one thing will give you the definitive answer: Profile it.
That being said, here are the lessons I've learned while working with large PInvoke solutions. The most effective way to marshal data is to marshal fields which are blittable, meaning the CLR can simply do what amounts to a memcpy to move data between native and managed code. In simple terms, get all of the non-inline arrays and strings out of your structures. If they are present in the native structure, represent them with an IntPtr and marshal the values on demand into managed code.
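For illustration, here is a minimal sketch of the difference; the struct names and fields are hypothetical, not taken from the question's CPP_STRUCT_DEF:

using System;
using System.Runtime.InteropServices;

// Blittable: only primitive value types and IntPtr, laid out sequentially
// to match the native definition. The CLR can move this with a block copy.
[StructLayout(LayoutKind.Sequential)]
struct BlittableRecord
{
    public int Id;
    public double Value;
    public IntPtr NamePtr;   // char* in the native struct; marshal on demand
}

// NOT blittable: the string field forces the marshaler to do per-field
// conversion (allocation, encoding) for every record.
[StructLayout(LayoutKind.Sequential)]
struct NonBlittableRecord
{
    public int Id;
    public double Value;
    [MarshalAs(UnmanagedType.LPStr)]
    public string Name;
}

// Reading the string only when you actually need it:
// string name = Marshal.PtrToStringAnsi(record.NamePtr);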
I haven't ever profiled the difference between using Marshal.PtrToStructure and having a native API dereference the value. This is probably something worth investigating should PtrToStructure be revealed as a bottleneck via profiling.
For large hierarchies, marshal on demand rather than pulling an entire structure into managed code at a single time. I've run into this issue the most when dealing with large tree structures. Marshalling an individual node is very fast if it's blittable, and performance-wise it works out to marshal only what you need at that moment.
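As a sketch of that idea (the node layout here is invented for illustration): keep child links as IntPtr in the managed mirror of the struct and only call Marshal.PtrToStructure when the traversal actually reaches a node:

using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
struct NativeTreeNode
{
    public int Value;
    public IntPtr Left;    // NativeTreeNode* in the C++ definition
    public IntPtr Right;   // NativeTreeNode* in the C++ definition
}

static class TreeWalker
{
    // Marshal a single node only when the traversal reaches it.
    public static NativeTreeNode Load(IntPtr node)
    {
        return (NativeTreeNode)Marshal.PtrToStructure(node, typeof(NativeTreeNode));
    }

    public static int SumLeftSpine(IntPtr root)
    {
        int sum = 0;
        for (IntPtr p = root; p != IntPtr.Zero; )
        {
            NativeTreeNode node = Load(p);  // one blittable copy per visited node
            sum += node.Value;
            p = node.Left;                  // right subtrees are never marshaled
        }
        return sum;
    }
}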
In addition to JaredPar's comprehensive answer: you don't need to use GCHandle, you can use unsafe code instead.
fixed (byte* pBuffer = buffer)  // pins buffer; requires an unsafe context (/unsafe)
{
    record = *((CPP_STRUCT_DEF*)pBuffer);
}
The whole purpose of the GCHandle/fixed statement is to pin/fix the particular memory segment, making the memory immovable from the GC's point of view. If the memory were movable, any relocation would render your pointers invalid.
Not sure which way is faster though.
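For context, here is roughly how the fixed cast could slot into the original read loop. This is only a sketch; it assumes CPP_STRUCT_DEF is a blittable struct whose [StructLayout(LayoutKind.Sequential)] layout matches the on-disk format:

using System.IO;
using System.Runtime.InteropServices;

static class UnsafeRecordLoop
{
    public static unsafe void ReadRecords(string path)
    {
        using (BinaryReader br = new BinaryReader(File.Open(path, FileMode.Open)))
        {
            int size = Marshal.SizeOf(typeof(CPP_STRUCT_DEF));
            long length = br.BaseStream.Length;
            for (long pos = 0; pos < length; pos += size)
            {
                byte[] buffer = br.ReadBytes(size);
                fixed (byte* pBuffer = buffer)
                {
                    CPP_STRUCT_DEF record = *((CPP_STRUCT_DEF*)pBuffer);
                    /* Do stuff with record */
                }
            }
        }
    }
}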
This may be outside the bounds of your question, but I would be inclined to write a little assembly in Managed C++ that did an fread() or something similarly fast to read in the structs. Once you've got them read in, you can use C# to do everything else you need with them.
Here's a small class I made a while back while playing with structured files. It was the fastest method I could figure out at the time, short of going unsafe (which was what I was trying to replace while maintaining comparable performance).
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.InteropServices;

namespace PersonalUse.IO {
    public sealed class RecordReader<T> : IDisposable, IEnumerable<T> where T : new() {
        const int DEFAULT_STREAM_BUFFER_SIZE = 2 << 16; // default stream buffer (128 KB)
        const int DEFAULT_RECORD_BUFFER_SIZE = 100;     // default record buffer (100 records)

        readonly long _fileSize;   // size of the underlying file
        readonly int _recordSize;  // size of the record structure

        byte[] _buffer;            // the buffer itself, [record buffer size] * _recordSize
        FileStream _fs;
        T[] _structBuffer;         // T must be blittable for the pin/copy below to be valid
        GCHandle _h;               // handle/pinned pointer to _structBuffer
        int _recordsInBuffer;      // how many records are in the buffer
        int _bufferIndex;          // the index of the current record in the buffer
        long _recordPosition;      // position of the record in the file

        /// <overloads>Initializes a new instance of the <see cref="RecordReader{T}"/> class.</overloads>
        /// <summary>
        /// Initializes a new instance of the <see cref="RecordReader{T}"/> class.
        /// </summary>
        /// <param name="filename">filename to be read</param>
        public RecordReader(string filename) : this(filename, DEFAULT_STREAM_BUFFER_SIZE, DEFAULT_RECORD_BUFFER_SIZE) { }

        /// <summary>
        /// Initializes a new instance of the <see cref="RecordReader{T}"/> class.
        /// </summary>
        /// <param name="filename">filename to be read</param>
        /// <param name="streamBufferSize">buffer size for the underlying <see cref="FileStream"/>, in bytes.</param>
        public RecordReader(string filename, int streamBufferSize) : this(filename, streamBufferSize, DEFAULT_RECORD_BUFFER_SIZE) { }

        /// <summary>
        /// Initializes a new instance of the <see cref="RecordReader{T}"/> class.
        /// </summary>
        /// <param name="filename">filename to be read</param>
        /// <param name="streamBufferSize">buffer size for the underlying <see cref="FileStream"/>, in bytes.</param>
        /// <param name="recordBufferSize">size of record buffer, in records.</param>
        public RecordReader(string filename, int streamBufferSize, int recordBufferSize) {
            _fileSize = new FileInfo(filename).Length;
            _recordSize = Marshal.SizeOf(typeof(T));
            _buffer = new byte[recordBufferSize * _recordSize];
            _fs = new FileStream(filename, FileMode.Open, FileAccess.Read, FileShare.None, streamBufferSize, FileOptions.SequentialScan);
            _structBuffer = new T[recordBufferSize];
            _h = GCHandle.Alloc(_structBuffer, GCHandleType.Pinned); // pin so Marshal.Copy can write into it
            FillBuffer();
        }

        // fill the buffer, reset position
        void FillBuffer() {
            int bytes = _fs.Read(_buffer, 0, _buffer.Length);
            // copy only the bytes actually read; a partial final read would
            // otherwise copy stale data into the struct buffer
            Marshal.Copy(_buffer, 0, _h.AddrOfPinnedObject(), bytes);
            _recordsInBuffer = bytes / _recordSize;
            _bufferIndex = 0;
        }

        /// <summary>
        /// Read a record
        /// </summary>
        /// <returns>a record of type T, or a default T at EOF (check <see cref="EOF"/>)</returns>
        public T Read() {
            if (_recordsInBuffer == 0)
                return new T(); // EOF
            if (_bufferIndex < _recordsInBuffer) {
                // update positional info
                _recordPosition++;
                return _structBuffer[_bufferIndex++];
            } else {
                // refill the buffer
                FillBuffer();
                return Read();
            }
        }

        /// <summary>
        /// Advances the record position without reading.
        /// </summary>
        public void Next() {
            if (_recordsInBuffer == 0)
                return; // EOF
            else if (_bufferIndex < _recordsInBuffer) {
                _bufferIndex++;
                _recordPosition++;
            } else {
                FillBuffer();
                Next();
            }
        }

        public long FileSize {
            get { return _fileSize; }
        }

        public long FilePosition {
            get { return _recordSize * _recordPosition; }
        }

        public long RecordSize {
            get { return _recordSize; }
        }

        public long RecordPosition {
            get { return _recordPosition; }
        }

        public bool EOF {
            get { return _recordsInBuffer == 0; }
        }

        public void Close() {
            Dispose(true);
        }

        void Dispose(bool disposing) {
            try {
                if (disposing && _fs != null) {
                    _fs.Close();
                }
            } finally {
                if (_fs != null) {
                    _fs = null;
                    _buffer = null;
                    _recordPosition = 0;
                    _bufferIndex = 0;
                    _recordsInBuffer = 0;
                }
                if (_h.IsAllocated) {
                    _h.Free(); // unpin the struct buffer
                    _structBuffer = null;
                }
            }
        }

        #region IDisposable Members

        public void Dispose() {
            Dispose(true);
        }

        #endregion

        #region IEnumerable<T> Members

        public IEnumerator<T> GetEnumerator() {
            // refill before yielding so that a file whose length is an exact
            // multiple of the buffer size does not yield one spurious
            // default record at the end
            while (!EOF) {
                if (_bufferIndex == _recordsInBuffer)
                    FillBuffer();
                if (_recordsInBuffer != 0)
                    yield return Read();
            }
        }

        #endregion

        #region IEnumerable Members

        System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator() {
            return GetEnumerator();
        }

        #endregion
    } // end class
} // end namespace
To use it:
using (RecordReader<CPP_STRUCT_DEF> reader = new RecordReader<CPP_STRUCT_DEF>(path)) {
    foreach (CPP_STRUCT_DEF record in reader) {
        // do stuff
    }
}
(Pretty new here; hope that wasn't too much to post... I just pasted in the class without chopping out the comments or anything to shorten it.)
It seems this has nothing to do with either C++ or marshalling: you know the structure, so what else do you need?
You just need simple code that reads the group of bytes representing one struct and then uses BitConverter to place the bytes into the corresponding C# fields.
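A minimal sketch of that approach, with a hypothetical record layout (the struct, field names, and offsets are illustrative, not from the original question):

using System;
using System.IO;

// Hypothetical layout: a 4-byte int followed by a 4-byte float, 8 bytes per
// record, little-endian; the real offsets must match the C++ compiler's
// padding rules for the actual struct.
struct MyRecord
{
    public int Id;
    public float Value;
}

static class RecordParser
{
    public static MyRecord ReadRecord(BinaryReader br)
    {
        byte[] bytes = br.ReadBytes(8);
        MyRecord r;
        r.Id = BitConverter.ToInt32(bytes, 0);
        r.Value = BitConverter.ToSingle(bytes, 4);
        return r;
    }
}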