I have a black box. I connect to it via TCP. The black box helpfully provides an Interface Control Document which tells me the format of the data I'll be receiving. It may be something like:
Bytes 0-3 UINT32: id of thing
Bytes 4-7 UINT32: type of thing
[and, you, know, like 100 other elements of varying data types]
In C#, how can I get the data I have taken off the TCP socket and get it into the format I know it to be in?
In C, I could just memcpy from a buffer into a struct. Or I could recast the buffer as the struct I know it to be. I've been casting around for the best way to do this in C#, and so far I find the answers to be somewhere between baffling and way more complicated than I was expecting (C# lacks typedefs and seems to frown on structs in general). So far my best attempt is to model the message as a class:
class MessageThing
{
public uint thingId;
public uint thingType;
[...insert loads of other data fields here...]
}
...and then painfully use BitConverter.ToUInt32(buffer, offset) to copy, piece by piece, from my message buffer into my class. Surely there is a more efficient way of parsing than this. (I found this sort of thing to be super easy in Python and Java, and easy enough in C, so I'm clearly missing something for C#?)
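For comparison, the closest C# analogue to the C "memcpy into a struct" trick is to declare the layout explicitly and blit the bytes onto it. A minimal sketch, assuming the wire format is little-endian and packed (the field names are just the ones from the class above):

using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential, Pack = 1)]
struct MessageThing
{
    public uint ThingId;    // bytes 0-3
    public uint ThingType;  // bytes 4-7
    // ...the other ~100 fields, declared in wire order...
}

static MessageThing Parse(byte[] buffer)
{
    // Pin the managed array so the GC cannot move it, then reinterpret
    // the pinned memory as the struct, much like a C pointer cast.
    GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
    try
    {
        return Marshal.PtrToStructure<MessageThing>(handle.AddrOfPinnedObject());
    }
    finally
    {
        handle.Free();
    }
}

On newer runtimes, MemoryMarshal.Read<MessageThing>(buffer) performs the same reinterpretation in a single call, provided the struct contains no reference-type fields.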
I have a rather strange problem that I cannot figure out. I am using a third-party library that creates a buffer. The buffer can contain doubles, but copying between it and a double array is extremely slow. There must be something going on behind the scenes with this particular data type, specifically when you write to it. For example, the following works but takes over 20 ms, whereas a copy from one double array to another takes 20 µs.
Mitov.SignalLab.RealBuffer mitovBuffer = new Mitov.SignalLab.RealBuffer(16384);
double[] doubleBuffer = new double[16384];
private void Test()
{
for (int i=0; i < 16384; i++)
{
mitovBuffer[i] = doubleBuffer[i];
}
}
This works but takes 20+ ms. I can get a pointer to the mitovBuffer, and I know that 8 bytes are stored for each "double" in this buffer. Is there a way I can copy between these two? I've tried all the usual things like Array.Copy, block copies, etc. Each time I get a "cannot convert from 'double[]' to 'double'" error.
Thanks, Tom
Perhaps one reason this function is slow is that Mitov.SignalLab.RealBuffer is a wrapper around a resizable Delphi buffer. If I understand their documentation correctly, the element-wise assignment you are doing involves layers of abstraction that might even involve resizing the buffer for every element.
The API documentation even says that the class is intended for use from Delphi code, not from other languages:
This is Real(double) Data wrapper buffer. Use this buffer to access and manipulate the Real(double) data from inside your Delphi code. .NET, C++ Builder and Visual C++ users should use the much more convenient and powerful TSLCRealBuffer class.
However, their public API does not document that recommended class. Perhaps the documentation doesn't really reflect the product, but if I were you I'd call their engineers to find out what you are intended to do. Since you won't be able to pin their "buffer" abstraction, I suspect you don't want to use unmanaged code to push bytes into those locations.
If you want to try byte-wise loading, perhaps you might try their documented bytewise methods:
function GetByteSize() : Cardinal - Returns the size of the buffer in bytes.
function GetSize() : Cardinal - Returns the size of the buffer in elements.
function ByteRead() : PByte
function ByteWrite() : PByte
function ByteModify() : PByte
Or perhaps you can put your data into their internal format and then call their public procedure: procedure AddCustom(AData : ISLData)
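If the .NET wrapper really does surface ByteWrite() as a raw pointer (an IntPtr matching the PByte in the Delphi docs), a single bulk copy might replace the 16384 indexer calls. This is only a sketch under that assumption, not a tested use of their API:

using System;
using System.Runtime.InteropServices;

// Assumption: ByteWrite() returns an IntPtr to the start of the buffer's storage.
IntPtr dest = mitovBuffer.ByteWrite();

// One bulk copy of all 16384 doubles instead of per-element assignment.
Marshal.Copy(doubleBuffer, 0, dest, doubleBuffer.Length);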
I want to convert some old files to a human-readable format first. The Delphi code that writes them is the following:
OpenFileWriteRA(MyF, dd+'invoice.mfs', SizeOf(TFSerialDocEx)) then
and then calling
ReadFile(MyF, vSS1, SizeOf(TFSerialDocEx), nr1, nil);
So I am looking for a way to convert these files with a small program. I want to write it in C#, as I am more familiar with C# than with Delphi. The .MFS file is binary, so what would I need to do to convert it to text/strings? I tried a simple binary conversion, but it was not right; it seems the record size, SizeOf(TFSerialDocEx), in those calls is a big thing here, no?
Broadly speaking, there are three approaches that I would consider:
1. Transform data with Delphi code
Since you already have Delphi code to read the data, and the structures defined, it will be simplest and quickest to transform the data with Delphi code. Simply read it using your existing code and then output it in human-readable form, for instance using the built-in JSON libraries.
2. Define an equivalent formatted C# structure and blit the binary data onto that structure
Define a formatted structure in C# that has identical binary layout to the structure put to disk. This will use LayoutKind.Sequential and perhaps specify Pack = 1 if the Delphi structure is packed. You may need to use the MarshalAs attribute on some members to achieve binary equivalence. Then read the structure from disk into a byte array. Pin this array, and use Marshal.PtrToStructure on the pinned object address to deserialize. Once you have the data, you can write it out however you please.
An example can be found here: Proper struct layout from delphi packed record
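Here is a minimal sketch of that approach with a purely hypothetical record layout; the real field list must be copied from the Delphi declaration of TFSerialDocEx:

using System.IO;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential, Pack = 1)] // Pack = 1 only if the Delphi record is packed
struct TFSerialDocEx
{
    public int DocId;                                // hypothetical: Delphi Integer
    [MarshalAs(UnmanagedType.ByValArray, SizeConst = 21)]
    public byte[] DocNumber;                         // hypothetical: Delphi string[20] (1 length byte + 20 chars)
    public double Amount;                            // hypothetical: Delphi Double
}

static TFSerialDocEx ReadOne(BinaryReader reader)
{
    // Read exactly one record's worth of bytes, pin them, and deserialize.
    byte[] raw = reader.ReadBytes(Marshal.SizeOf<TFSerialDocEx>());
    GCHandle handle = GCHandle.Alloc(raw, GCHandleType.Pinned);
    try { return Marshal.PtrToStructure<TFSerialDocEx>(handle.AddrOfPinnedObject()); }
    finally { handle.Free(); }
}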
3. Read the structure field by field with a binary reader
Rather than declaring a binary-compatible structure, you can use a BinaryReader to read from a stream one field at a time. Method calls like Read, ReadInt32, ReadDouble, etc. let you work your way through the record. Remember that the fields will have been written in the order in which the Delphi record was declared. If the original record is aligned rather than packed, you will need to step over any padding. Again, once you have the data available to your C# code, you can write it out as you please, as in the sketch below.
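The field-by-field version of the same hypothetical record might look like this; the read calls must mirror the declaration order of the Delphi record exactly:

using System;
using System.IO;

using (var reader = new BinaryReader(File.OpenRead("invoice.mfs")))
{
    while (reader.BaseStream.Position < reader.BaseStream.Length)
    {
        int docId = reader.ReadInt32();      // hypothetical: Delphi Integer
        byte length = reader.ReadByte();     // hypothetical: Delphi string[20] stores a length byte...
        byte[] chars = reader.ReadBytes(20); // ...followed by 20 fixed character bytes
        double amount = reader.ReadDouble(); // hypothetical: Delphi Double
        Console.WriteLine($"{docId}: {amount}");
    }
}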
I have a binary file that is created by an open-source application written in C. Since it is open source, I can see how the data is structured when it is written to the file. The problem is that I don't know C, but I can at least mostly tell what is going on when the structs are being declared. From what I've seen in other posts, though, it isn't as simple as creating a struct in C# with the same data types as the ones in C.
I found this post https://stackoverflow.com/a/3863658/201021 which has a class for translating structs but (as far as I can tell) you need to declare the struct properly in C# for it to work.
I've read about the MarshalAs attribute and the StructLayout attribute. I mostly get how you would use them to control the physical structure of the data type. I think what I'm missing are the details.
I'm not asking for somebody to just convert the C data structures into C#. What I'd really like is some pointers to information that will help me figure out how to do it myself. I have another binary file in a slightly different format to read so some general knowledge around this topic would be really appreciated.
How do you convert a C data structure to a C# struct that will allow you to read the data type from a file?
Notes:
Specifically, I'm trying to read the rstats and cstats files that are output by the Tomato router firmware. These files contain bandwidth usage data and IP traffic data.
The C code for the data structure is (from rstats.c):
#define MAX_COUNTER 2
#define MAX_NSPEED ((24 * SHOUR) / INTERVAL)
#define MAX_NDAILY 62
#define MAX_NMONTHLY 25
typedef struct {
uint32_t xtime;
uint64_t counter[MAX_COUNTER];
} data_t;
typedef struct {
uint32_t id;
data_t daily[MAX_NDAILY];
int dailyp;
data_t monthly[MAX_NMONTHLY];
int monthlyp;
} history_t;
typedef struct {
char ifname[12];
long utime;
unsigned long speed[MAX_NSPEED][MAX_COUNTER];
unsigned long last[MAX_COUNTER];
int tail;
char sync;
} speed_t;
I think the approach in your first link, https://stackoverflow.com/a/3863658/201021, is a good one to follow. The next step would be constructing a C# struct that maps the C struct. Here is the mapping for the different types, from MSDN: http://msdn.microsoft.com/en-us/library/ac7ay120(v=vs.110).aspx
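Applied to the rstats structs above, the mapping might start like this. I'm assuming the firmware writes the structs packed and little-endian; a MIPS router may align or order them differently, so verify the sizes against a real file:

using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential, Pack = 1)]
struct DataT
{
    public uint Xtime;                       // uint32_t xtime
    [MarshalAs(UnmanagedType.ByValArray, SizeConst = 2)]
    public ulong[] Counter;                  // uint64_t counter[MAX_COUNTER]
}

[StructLayout(LayoutKind.Sequential, Pack = 1)]
struct HistoryT
{
    public uint Id;                          // uint32_t id
    [MarshalAs(UnmanagedType.ByValArray, SizeConst = 62)]
    public DataT[] Daily;                    // data_t daily[MAX_NDAILY]
    public int DailyP;                       // int dailyp
    [MarshalAs(UnmanagedType.ByValArray, SizeConst = 25)]
    public DataT[] Monthly;                  // data_t monthly[MAX_NMONTHLY]
    public int MonthlyP;                     // int monthlyp
}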
Cheers!
I'm not an ANSI C programmer either but, at first glance at the source file, it appears to be saving data into a .gz file and then renaming it. The open function decompresses it with gzip. So, you might be looking at a compressed file at the top layer.
Once you know that you are dealing with the raw file, it looks like the best place to start is the load(int new) function. You need to figure out how to reverse engineer what's going on. If you get lost, you may have to learn how some of the native C function calls work.
The first interesting line is:
if (f_read("/var/lib/misc/rstats-stime", &save_utime, sizeof(save_utime)) != sizeof(save_utime)) {
save_utime = 0;
}
Scanning the file, save_utime is declared as a long. In C on this platform, that is a 32-bit number, so int is the C# equivalent. Given its name, it seems to be a timestamp. So, the first step appears to be to read in a 4-byte int.
The next interesting piece is
speed_count = decomp(hgz, speed, sizeof(speed[0]), MAX_SPEED_IF);
In the save function it saves speed as an array of speed_t structs, with sizeof() * count behavior. But it doesn't save the actual count. Since the load function passes MAX_SPEED_IF (which is defined as 10) into decomp, it makes sense to see what decomp does with it. In looking, it seems that it tries to read(... size * max) (a.k.a. size * MAX_SPEED_IF) and depends on the return value from the read library function to know how many speed_t structures were actually saved.
From there, it's just a matter of reading in the correct number of bytes for the number of speed_t structures written. Then it goes on to load the history data. A C# version of the count-inference trick is sketched below.
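In C#, that same trick might look like this, with SpeedT standing in for a C# mapping of speed_t declared along the lines shown earlier in the thread:

using System.Runtime.InteropServices;

const int MaxSpeedIf = 10;                        // MAX_SPEED_IF in rstats.c
int recordSize = Marshal.SizeOf<SpeedT>();        // SpeedT: hypothetical C# mapping of speed_t
byte[] raw = new byte[recordSize * MaxSpeedIf];
int bytesRead = stream.Read(raw, 0, raw.Length);  // stream: the decompressed rstats data
int speedCount = bytesRead / recordSize;          // mirrors what decomp() infers from read()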
This is the only approach I can think of to reverse engineer a binary file while referencing the source code and porting it to a different language at the same time.
BTW, I'm only offering my help; I could be totally wrong. Like I said, I'm not an ANSI C guy. But I do hope this helps get you going.
The short answer is that you probably cannot do this automatically, at least at runtime.
Knowing how many C programs are written, there's little chance of any meta-data being in the file. Even if there is, you need to address that as "a program that reads data with meta-data in this format". There are also all sorts of subtleties such as word length, packing etc.
Just because the two languages have "C" in the name does not make them magically compatible, I am afraid. I fear you need to write a specific program for each file type and, as part of that, re-declare your structures in C#.
I am looking for some solution to return a number as text. By that I mean that 100 should be returned as 'one hundred'.
It is more or less just a tedious task to write such a function myself, but I'd rather not re-invent the wheel, and this can't be the first time someone has requested this.
Unfortunately, my search so far has not turned up anything, so here I am trying Stack Overflow.
Basically, the numbers come from a database, so if there are some smart methods that could be used here, that would be pretty nice.
As mentioned, a small function that returns e.g. 100 as 'one hundred' is not a complicated task, but what if you need to take language considerations into the solution?
Has anybody come across something that can actually do this, and perhaps in multiple languages?
Yes, you can use this:
Java numbers to text
Basically, the idea is to form the numbers by first defining all the digits and the tens; after that you can also define the hundreds, the thousands, and so on. Because numbers in English are always formed the same way, this is very easy for the English language. See the sketch below.
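To make the idea concrete, here is a minimal, English-only sketch in C# (the linked answer shows the Java original; a real solution would add localization):

static readonly string[] Ones =
{
    "zero", "one", "two", "three", "four", "five", "six", "seven", "eight", "nine", "ten",
    "eleven", "twelve", "thirteen", "fourteen", "fifteen", "sixteen", "seventeen", "eighteen", "nineteen"
};
static readonly string[] Tens =
{
    "", "", "twenty", "thirty", "forty", "fifty", "sixty", "seventy", "eighty", "ninety"
};

static string ToWords(int n)
{
    if (n < 0) return "minus " + ToWords(-n);
    if (n < 20) return Ones[n];
    if (n < 100) return Tens[n / 10] + (n % 10 != 0 ? "-" + Ones[n % 10] : "");
    if (n < 1000) return Ones[n / 100] + " hundred" + (n % 100 != 0 ? " " + ToWords(n % 100) : "");
    if (n < 1000000) return ToWords(n / 1000) + " thousand" + (n % 1000 != 0 ? " " + ToWords(n % 1000) : "");
    return ToWords(n / 1000000) + " million" + (n % 1000000 != 0 ? " " + ToWords(n % 1000000) : "");
}

// ToWords(100) returns "one hundred"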
Something like this (maybe not the best one; I just did a quick search)?
http://www.daniweb.com/software-development/csharp/threads/53072
I need to know how many bytes my object consumes in memory (in C#), for example my Hashtable, or SortedList, or List<String>.
This may not be accurate, but it's close enough for me:
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

long size = 0;
object o = new object();
using (Stream s = new MemoryStream())
{
    BinaryFormatter formatter = new BinaryFormatter();
    formatter.Serialize(s, o);
    size = s.Length;
}
I don't think you can get it directly, but there are a few ways to find it indirectly.
One way is to use the GC.GetTotalMemory method to measure the amount of memory used before and after creating your object. This won't be perfect, but as long as you control the rest of the application you may get the information you are interested in.
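A minimal sketch of that before/after measurement; the forced collections make the numbers more stable, but they are still only approximate:

using System;

long before = GC.GetTotalMemory(true);          // force a collection, then read the total
var table = new System.Collections.Hashtable(); // the object you want to measure
long after = GC.GetTotalMemory(true);
GC.KeepAlive(table);                            // keep the object alive past the second reading
Console.WriteLine($"Approximate size: {after - before} bytes");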
Apart from that you can use a profiler to get the information or you could use the profiling api to get the information in code. But that won't be easy to use I think.
See Find out how much memory is being used by an object in C#? for a similar question.
Unmanaged objects:
Marshal.SizeOf(object yourObj);
Value types:
sizeof(T), e.g. sizeof(int)
Managed object:
It looks like there is no direct way to get it for managed objects. Ref:
https://learn.microsoft.com/en-us/archive/blogs/cbrumme/size-of-a-managed-object
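A quick illustration of the two direct options above; the struct is just an example, and note that Marshal.SizeOf reports the marshaled (unmanaged) layout, including alignment padding:

using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
struct Header
{
    public int Id;       // 4 bytes
    public double Value; // 8 bytes, aligned to an 8-byte boundary
}

int marshaledSize = Marshal.SizeOf<Header>(); // 16: 4 (Id) + 4 (padding) + 8 (Value)
int primitiveSize = sizeof(long);             // 8: sizeof works directly on primitive value types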
OK, this question has been answered and the answer accepted, but someone asked me to post my answer anyway, so there you go.
First of all, it is not possible to say for sure. It is an internal implementation detail and not documented. However, you can estimate it from the objects included in the other object. I had previously touched on this subject in this article:
Now, how do we calculate the memory requirement for our cached objects? Well, as most of you would know, Int32 and float are four bytes, double and DateTime 8 bytes, char is actually two bytes (not one byte), and so on. String is a bit more complex, 2*(n+1), where n is the length of the string. For objects, it will depend on their members: just sum up the memory requirement of all its members, remembering all object references are simply 4 byte pointers on a 32 bit box. Now, this is actually not quite true, we have not taken care of the overhead of each object in the heap. I am not sure if you need to be concerned about this, but I suppose, if you will be using lots of small objects, you would have to take the overhead into consideration. Each heap object costs as much as its primitive types, plus four bytes for object references (on a 32 bit machine, although BizTalk runs 32 bit on 64 bit machines as well), plus 4 bytes for the type object pointer, and I think 4 bytes for the sync block index. Why is this additional overhead important? Well, let’s imagine we have a class with two Int32 members; in this case, the memory requirement is 16 bytes and not 8.
The following code fragment should return the size in bytes of any object passed to it, so long as it can be serialized.
I got this from a colleague at Quixant to resolve a problem of writing to SRAM on a gaming platform. Hope it helps out.
Credit and thanks to Carlo Vittuci.
/// <summary>
/// Calculates the length in bytes of an object
/// and returns the size
/// </summary>
/// <param name="TestObject"></param>
/// <returns></returns>
private int GetObjectSize(object TestObject)
{
    BinaryFormatter bf = new BinaryFormatter();
    MemoryStream ms = new MemoryStream();
    bf.Serialize(ms, TestObject);
    byte[] bytes = ms.ToArray();
    return bytes.Length;
}
In debug mode, load SOS and execute the !DumpHeap command.