Does WCF pad all byte arrays in SOAP messages? - c#

I am doing some data chunking and I'm seeing an interesting issue when sending binary data in my response. I can confirm that the length of the byte array is below my data limit of 4 megabytes, but when I receive the message, its total size is over 4 megabytes.
For the example below, I used the largest chunk size I could so I could illustrate the issue while still receiving a usable chunk.
The size of the binary data is 3,040,870 bytes on both the service side and the client (once the message is deserialized). However, I can also confirm that the data as it appears in the message is just under 4 megabytes (I verified this by copying the binary data out of the message and pasting it into a text file).
So, is WCF causing these issues and, if so, is there anything I can do to prevent it? If not, what might be causing this inflation on my side?
Thanks!

The usual way of sending byte[]s in SOAP messages is to base64-encode the data. This encoding takes roughly 33% more space than raw binary (four output characters for every three input bytes), which accounts for the size difference almost precisely.
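As a quick sanity check, you can compute the encoded length yourself; a minimal sketch using the figure from the question:

    // base64 produces 4 characters for every 3 input bytes, rounded up to a whole block.
    long rawBytes = 3040870;
    long base64Chars = 4 * ((rawBytes + 2) / 3);    // 4,054,496 characters
    Console.WriteLine(base64Chars);
    // Depending on whether your 4 MB limit means 4,000,000 or 4,194,304 bytes,
    // this alone (plus the rest of the SOAP envelope) can already push you over it.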
You could adjust the max size or chunk size slightly so that the end result is within the right range, or use another encoding, e.g. MTOM, to eliminate this 33% overhead.
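For example, with a basicHttpBinding you can switch to MTOM either via the messageEncoding attribute in config or in code; this is just an illustrative sketch, not your exact binding:

    // Requires System.ServiceModel. MTOM sends large byte[] members as raw
    // binary attachments instead of base64 text inside the envelope.
    var binding = new BasicHttpBinding
    {
        MessageEncoding = WSMessageEncoding.Mtom,
        MaxReceivedMessageSize = 4 * 1024 * 1024   // keep your existing 4 MB limit
    };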

If you're stuck with SOAP, you can offset the base64 overhead Tim S. talked about by using the System.IO.Compression classes in .NET - you'd compress the data first, before building and sending the SOAP message.
You'd compress with this:
    // Requires: using System.IO; using System.IO.Compression;
    public static byte[] Compress(byte[] data)
    {
        using (MemoryStream ms = new MemoryStream())
        {
            // Dispose the DeflateStream before reading the result so it flushes
            // the remaining compressed bytes into the MemoryStream.
            using (DeflateStream ds = new DeflateStream(ms, CompressionMode.Compress))
            {
                ds.Write(data, 0, data.Length);
            }
            return ms.ToArray();
        }
    }
On the receiving end, you'd use this to decompress:
    public static byte[] Decompress(byte[] data)
    {
        const int BUFFER_SIZE = 256;
        byte[] tempArray = new byte[BUFFER_SIZE];
        List<byte[]> tempList = new List<byte[]>();
        int count;
        int length = 0;
        using (MemoryStream ms = new MemoryStream(data))
        using (DeflateStream ds = new DeflateStream(ms, CompressionMode.Decompress))
        {
            // Read the decompressed data in fixed-size chunks.
            while ((count = ds.Read(tempArray, 0, BUFFER_SIZE)) > 0)
            {
                if (count == BUFFER_SIZE)
                {
                    tempList.Add(tempArray);
                    tempArray = new byte[BUFFER_SIZE];
                }
                else
                {
                    // Partial chunk: keep only the bytes that were actually read.
                    byte[] temp = new byte[count];
                    Array.Copy(tempArray, 0, temp, 0, count);
                    tempList.Add(temp);
                }
                length += count;
            }
        }
        // Stitch the chunks back together into a single array.
        byte[] retVal = new byte[length];
        count = 0;
        foreach (byte[] temp in tempList)
        {
            Array.Copy(temp, 0, retVal, count, temp.Length);
            count += temp.Length;
        }
        return retVal;
    }
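A rough sketch of how the two helpers fit around the service call; the member names (GetNextChunk, response.Data) are made up for illustration:

    // Service side (hypothetical): compress the chunk before it goes into the response.
    byte[] chunk = GetNextChunk();
    response.Data = Compress(chunk);

    // Client side (hypothetical): decompress after the call returns.
    byte[] original = Decompress(response.Data);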

Related

NAudio C# resampling sizzle trouble

I have a problem using NAudio and WasapiLoopbackCapture...
I'm a beginner and I don't fully understand everything I'm doing.
I want to convert a byte buffer to another buffer with a 44100 Hz, 16-bit wave format.
It works, but I get some sizzle or strange noise after the conversion.
Edit: I tried merging all the buffers without converting, then converting and writing only at the end, and that works fine. I think the problem is in Convert16 or readStream. If I merge the converted buffers, the sizzle appears between two buffers. Could it be a length problem? I need to convert each buffer without merging because I send them over UDP.
The functions Convert16 and readStream are from this topic:
    public byte[] Convert16(byte[] input, int length, WaveFormat format)
    {
        if (length == 0)
            return new byte[0];
        using (var memStream = new MemoryStream(input, 0, length))
        {
            using (var inputStream = new RawSourceWaveStream(memStream, format))
            {
                var sampleStream = new NAudio.Wave.SampleProviders.WaveToSampleProvider(inputStream);
                var resamplingProvider = new NAudio.Wave.SampleProviders.WdlResamplingSampleProvider(sampleStream, audioRate);
                var ieeeToPCM = new NAudio.Wave.SampleProviders.SampleToWaveProvider16(resamplingProvider);
                var sampleStreams = new NAudio.Wave.StereoToMonoProvider16(ieeeToPCM);
                sampleStreams.RightVolume = 0.5f;
                sampleStreams.LeftVolume = 0.5f;
                return readStream(sampleStreams, length);
            }
        }
    }
    private byte[] readStream(IWaveProvider waveStream, int length)
    {
        byte[] buffer = new byte[length];
        using (var stream = new MemoryStream())
        {
            int read;
            while ((read = waveStream.Read(buffer, 0, length)) > 0)
            {
                stream.Write(buffer, 0, read);
            }
            return stream.ToArray();
        }
    }
    public void InputBufferToFileCallback(object sender, WaveInEventArgs e)
    {
        // Used to see WaveViewer and to test
        baseWriter.Write(e.Buffer, 0, e.BytesRecorded);
        // byte[] convertedTo16 -- PROBLEM IS HERE
        convertedTo16 = Convert16(e.Buffer, e.BytesRecorded, waveFormatIn);
        // Used to see WaveViewer and to test
        convertedWriter.Write(convertedTo16, 0, convertedTo16.Length);
        // Send over udp real time
        SendSoundController(convertedTo16);
    }
This image shows the difference between Audacity's resampling and mine; you can see the problem:
https://i.imgur.com/H3PbNYR.png
Thanks and have a good day.
When you're resampling recorded audio, you need to maintain a single resampler that lives across multiple buffers. I've written a couple of articles on "input driven resampling" with NAudio that explain how to do this:
Input driven resampling with NAudio using ACM
Fully managed input driven resampling with WDL
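The core idea in those articles is to keep one resampling pipeline alive for the whole capture session and feed it every incoming buffer, instead of building a new resampler per buffer. A minimal sketch along those lines, assuming an IEEE-float capture format (typical for WASAPI loopback, as in the question) and a 44.1 kHz 16-bit target; the field and method names are illustrative:

    // Requires: using NAudio.Wave; using NAudio.Wave.SampleProviders;
    // One long-lived pipeline for the whole capture session.
    BufferedWaveProvider bufferedProvider;   // holds raw captured bytes
    IWaveProvider outputProvider;            // resampled 16-bit output

    void InitPipeline(WaveFormat captureFormat)
    {
        bufferedProvider = new BufferedWaveProvider(captureFormat);
        bufferedProvider.ReadFully = false;  // don't pad with silence when the buffer runs dry
        var samples = new WaveToSampleProvider(bufferedProvider);
        var resampled = new WdlResamplingSampleProvider(samples, 44100);
        outputProvider = new SampleToWaveProvider16(resampled);
        // A StereoToMonoProvider16 could be added here, as in the question's code.
    }

    void OnDataAvailable(object sender, WaveInEventArgs e)
    {
        // Feed the captured bytes in, then drain whatever the resampler can produce.
        bufferedProvider.AddSamples(e.Buffer, 0, e.BytesRecorded);
        byte[] outBuffer = new byte[e.BytesRecorded];   // generous upper bound
        int read = outputProvider.Read(outBuffer, 0, outBuffer.Length);
        if (read > 0)
        {
            byte[] converted = new byte[read];
            Array.Copy(outBuffer, converted, read);
            SendSoundController(converted);             // the question's UDP send
        }
    }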

c# ZLib.Net decompression

I'm trying to decompress a byte array using the ZLib.Net library. Unfortunately my function always returns only two bytes. The compressed array is 1240 bytes. None of the zlibConst values returns more than two bytes.
Maybe something is wrong with ZLib.Net? (the DLL is taken from here: http://zlibnet.codeplex.com/)
    byte[] Decompress(byte[] compressed)
    {
        byte[] rez = new byte[compressed.Length];
        MemoryStream oInStream = new MemoryStream(compressed);
        ZInputStream oZInstream = new ZInputStream(oInStream, zlibConst.Z_BEST_COMPRESSION);
        MemoryStream oOutStream = new MemoryStream();
        byte[] buffer = new byte[2000];
        int len;
        while ((len = oZInstream.read(buffer, 0, 2000)) > 0)
        {
            oOutStream.Write(buffer, 0, len);
        }
        oOutStream.Flush();
        byte[] arrUncompressed = oOutStream.ToArray();
        oZInstream.Close();
        oOutStream.Close();
        return arrUncompressed;
    }

Write file directly into TcpClient without storing it in memory

I have a 1 GB file that I need to write to a TcpClient object. What's the best way to do this without reading the entire file into memory?
You have to read it into memory at some point, though you obviously don't need to do it all at once!
Just use BinaryReader.Read and read in "n" bytes at a time, something like:
    using (var reader = new BinaryReader(new FileStream("test.dat", FileMode.Open)))
    {
        byte[] buffer = new byte[100];
        int bytesRead;
        // Read advances the underlying file position automatically,
        // so the offset into the buffer stays at 0 on every pass.
        while ((bytesRead = reader.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Send the first bytesRead bytes of buffer
        }
    }
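If you're on .NET 4 or later, Stream.CopyTo does this loop for you, copying through a small internal buffer; client here stands for the question's TcpClient:

    // Streams the file into the socket chunk by chunk,
    // never holding the whole 1 GB file in memory.
    using (var file = File.OpenRead("test.dat"))
    {
        file.CopyTo(client.GetStream());
    }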

Need help Creating a big array from small byte arrays

I have the following code:
    byte[] myBytes = new byte[10 * 10000];
    for (long i = 0; i < 10000; i++)
    {
        byte[] a1 = BitConverter.GetBytes(i);
        byte[] a2 = BitConverter.GetBytes(true);
        byte[] a3 = BitConverter.GetBytes(false);
        byte[] rv = new byte[10];
        System.Buffer.BlockCopy(a1, 0, rv, 0, a1.Length);
        System.Buffer.BlockCopy(a2, 0, rv, a1.Length, a2.Length);
        System.Buffer.BlockCopy(a3, 0, rv, a1.Length + a2.Length, a3.Length);
    }
Everything works as it should. I was trying to convert this code so that everything gets written into myBytes, but then I realised that I use a long, and if its value goes above int.MaxValue the cast will fail.
How could one solve this?
Another question: since I don't want to create a very large byte array in memory, how could I send it directly to my .WriteBytes(path, myBytes); function?
If the final destination for this is, as suggested, a file: then write to a file more directly, rather than buffering in memory:
    using (var file = File.Create(path)) // or append file FileStream etc
    using (var writer = new BinaryWriter(file))
    {
        for (long i = 0; i < 10000; i++)
        {
            writer.Write(i);
            writer.Write(true);
            writer.Write(false);
        }
    }
Perhaps the ideal way of doing this in your case would be to pass a single BinaryWriter instance to each object in turn as you serialize them (don't open and close the file per-object).
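A minimal sketch of that "pass one BinaryWriter around" idea; Record and its fields are hypothetical stand-ins for your own objects:

    class Record
    {
        public long Id;
        public bool FlagA;
        public bool FlagB;

        public void Serialize(BinaryWriter writer)
        {
            writer.Write(Id);      // 8 bytes
            writer.Write(FlagA);   // 1 byte
            writer.Write(FlagB);   // 1 byte
        }
    }

    using (var file = File.Create(path))
    using (var writer = new BinaryWriter(file))
    {
        foreach (var record in records)   // records: an assumed IEnumerable<Record>
            record.Serialize(writer);
    }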
Why don't you just Write() the bytes out as you process them rather than converting to a massive buffer, or use a smaller buffer at least?

Reading NetworkStream doesn't advance stream

I have a client-server application where the server transmits a 4-byte integer specifying how large the next transmission is going to be. When I read the 4-byte integer on the client side (specifying FILE_SIZE), the next time I read the stream I get FILE_SIZE + 4 bytes read.
Do I need to specify the offset to 4 when reading from this stream, or is there a way to automatically advance the NetworkStream so my offset can always be 0?
SERVER
NetworkStream theStream = theClient.GetStream();
//...
//Calculate file size with FileInfo and put into byte[] size
//...
theStream.Write(size, 0, size.Length);
theStream.Flush();
CLIENT
NetworkStream theStream = theClient.GetStream();
//read size
byte[] size = new byte[4];
int bytesRead = theStream.Read(size, 0, 4);
...
//read content
byte[] content = new byte[4096];
bytesRead = theStream.Read(content, 0, 4096);
Console.WriteLine(bytesRead); // <-- Prints filesize + 4
Right; found it; FileInfo.Length is a long; your call to:
binWrite.Write(fileInfo.Length);
writes 8 bytes, little-endian. You then read that back via:
filesize = binRead.ReadInt32();
which little-endian will give you the same value (for 32 bits, at least). You have four 0x00 bytes left unused in the stream, though (from the high bytes of the long) - hence the 4-byte mismatch.
Use one of (the first on the server, the second on the client):
binWrite.Write((int)fileInfo.Length);
filesize = binRead.ReadInt64();
NetworkStream certainly advances, but in both cases your read is unreliable; a classic "read a known amount of content" helper would be:
    static void ReadAll(Stream source, byte[] buffer, int bytes) {
        if (bytes > buffer.Length) throw new ArgumentOutOfRangeException("bytes");
        int bytesRead, offset = 0;
        while (bytes > 0 && (bytesRead = source.Read(buffer, offset, bytes)) > 0) {
            offset += bytesRead;
            bytes -= bytesRead;
        }
        if (bytes != 0) throw new EndOfStreamException();
    }
with:
ReadAll(theStream, size, 4);
...
ReadAll(theStream, content, contentLength);
Note also that you need to be careful with endianness when parsing the length prefix.
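For instance, a hedged sketch of reading the 4-byte prefix with the endianness handled explicitly (this assumes the server wrote it with BinaryWriter, i.e. little-endian):

    byte[] size = new byte[4];
    ReadAll(theStream, size, 4);
    // BinaryWriter writes little-endian; flip if this machine is big-endian.
    if (!BitConverter.IsLittleEndian)
        Array.Reverse(size);
    int contentLength = BitConverter.ToInt32(size, 0);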
I suspect you simply aren't reading the complete data.
