I'm calling a JSON-RPC API that returns a UCHAR array representing a PDF file (so the result property of the response contains a string representation of a UCHAR array). I need to convert this result string into a byte array so I can handle it as a PDF file, i.e., save it and/or forward it as a file in a POST to another API.
I have tried the following (the result variable is the returned UCHAR string):
char[] pdfChar = result.ToCharArray();
byte[] pdfByte = new byte[pdfChar.Length];
for (int i = 0; i < pdfChar.Length; i++)
{
pdfByte[i] = Convert.ToByte(pdfChar[i]);
}
File.WriteAllBytes(basePath + "test.pdf", pdfByte);
I have also tried:
byte[] pdfByte = Encoding.ASCII.GetBytes(pdfObj.result);
File.WriteAllBytes(basePath + "test.pdf", pdfByte);
With both of these, when I try to open the resulting test.pdf file, it will not open, presumably because it was not converted properly.
It turns out that, although the output of the API function is UCHAR, by the time it arrives as part of the JSON string it is a base64 string, so this works for me:
byte[] pdfBytes = Convert.FromBase64String(pdfObj.result);
I'm pretty sure the API is making that conversion "under the hood", i.e., while the function being called returns UCHAR, the API uses a framework to create the JSON-RPC responses, and that framework is likely performing the conversion before sending it out. If it is .NET that makes this conversion from UCHAR to base64, then please feel free to chime in and confirm this.
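For what it's worth, the standard .NET JSON serializers (System.Text.Json and Json.NET alike) do encode byte[] values as base64 strings, so this behaviour would be consistent with a .NET framework building the response. Below is a minimal sketch of the full round trip, decoding the result and then saving it and/or forwarding it as a file in a POST; the upload URL and form field name are placeholders, not part of the original API:
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class PdfForwarder
{
    public static async Task SaveAndForwardAsync(string result, string basePath)
    {
        // The JSON-RPC result arrives as a base64 string; decode it to the raw PDF bytes.
        byte[] pdfBytes = Convert.FromBase64String(result);

        // Save it locally...
        File.WriteAllBytes(Path.Combine(basePath, "test.pdf"), pdfBytes);

        // ...and/or forward it as a multipart file upload to the other API.
        using var client = new HttpClient();
        using var form = new MultipartFormDataContent();
        var fileContent = new ByteArrayContent(pdfBytes);
        fileContent.Headers.ContentType = new MediaTypeHeaderValue("application/pdf");
        form.Add(fileContent, "file", "test.pdf");
        var response = await client.PostAsync("https://other-api.example/upload", form);
        response.EnsureSuccessStatusCode();
    }
}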
Do you know the file encoding format? Try to use this
return System.Text.Encoding.UTF8.GetString(pdfObj.result);
EDIT:
The solution you found is also reported here
var base64EncodedBytes = System.Convert.FromBase64String(pdfObj.result);
return System.Text.Encoding.UTF8.GetString(base64EncodedBytes);
Related
How can I create a byte data type from a string? For example, the device I am sending data to expects the data to be in hexadecimal format. More specifically, it needs to be in the format: 0x{hexa_decimal_value}
Hard coded, it already worked sending data this way.
I would create a byte array like this:
byte[] items_to_send_ = new byte[] {0x46, 0x30, 0x00};
Now I want to code it dynamically.
The code I am now trying to write looks like this:
var ListByte = new List<byte>();
foreach (char val in messageToConvert)
{
var hexa_decimal_val = Convert.ToInt32(val).ToString("X");
hexa_decimal_val = $"0x:{hexa_decimal_val}";
byte val_ = CreateByteFromStringFunction(hexa_decimal_val); // How?
ListByte.Add(val_);
}
The step in question is when creating the variable val_, where I want to build the byte value from hexa_decimal_val, but I just don't know how. Casting does not work, and I did not find any other function that would do it for me.
It feels like there should be a really easy solution to this, but I just don't seem to find it.
What makes looking for the correct answer tricky here is that I already know how to convert from string to hexadecimal value, but the conversion afterwards is nowhere to be found.
You don't need to create bytes from characters one by one and append them to a list. You can use this:
var encodedByteList = Encoding.UTF8.GetBytes(messageToConvert);
If you still want to do that, you can do something like this:
var encodedByteList = new List<byte>();
foreach (var character in messageToConvert)
{
var correspondingByte = (byte)character;
encodedByteList.Add(correspondingByte);
}
Or, with LINQ, you can use this one-liner:
var encodedByteList = messageToConvert.Select(c => (byte)c).ToList();
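And if you do need to go through the 0x-prefixed hexadecimal string (the missing CreateByteFromStringFunction step from the question), Convert.ToByte with base 16 accepts the "0x" prefix. A small sketch reusing the question's messageToConvert; note it assumes every character fits in a single byte, otherwise Convert.ToByte throws an OverflowException:
var listByte = new List<byte>();
foreach (char val in messageToConvert)
{
    // Build the 0x-prefixed hex string, as in the question...
    var hexValue = $"0x{Convert.ToInt32(val):X2}";
    // ...then parse it back to a byte; fromBase 16 allows the "0x" prefix.
    listByte.Add(Convert.ToByte(hexValue, 16));
}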
From a call to an external API my method receives an image/png as an IRestResponse where the Content property is a string representation.
I need to convert this string representation of image/png into a byte array without saving it first and then going File.ReadAllBytes. How can I achieve this?
You can try a hex string to byte conversion. Here is a method I've used before. Please note that you may have to pad the string depending on how it comes; the method will throw an error to let you know. However, if whoever sent the image converted it into bytes and then into a hex string (which they should have, based on what you are saying), you won't have to worry about padding.
public static byte[] HexToByte(string HexString)
{
if (HexString.Length % 2 != 0)
throw new Exception("Invalid HEX");
byte[] retArray = new byte[HexString.Length / 2];
for (int i = 0; i < retArray.Length; ++i)
{
retArray[i] = byte.Parse(HexString.Substring(i * 2, 2), NumberStyles.HexNumber, CultureInfo.InvariantCulture);
}
return retArray;
}
This might not be the fastest solution, by the way, but it's a good representation of what needs to happen, so you can optimize later.
This is also assuming the string being sent to you is the raw byte-converted string. If the sender did anything like a base58 conversion or something else, you will need to decode that first and then use the method.
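As one possible optimization, if you can target .NET 5 or later, the built-in Convert.FromHexString does the same raw hex-to-byte conversion without the manual loop (it also throws on odd-length or non-hex input):
// .NET 5+ only; equivalent to the HexToByte helper above.
byte[] bytes = Convert.FromHexString(hexString);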
I have found that the IRestResponse from RestSharp actually contains a 'RawBytes' property, which holds the raw bytes of the response content. This meets my needs and no conversion is necessary!
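In other words, a minimal sketch of that, assuming the usual RestSharp RestClient/RestRequest usage:
// RestSharp already exposes the response body as a byte array.
IRestResponse response = client.Execute(request);
byte[] pngBytes = response.RawBytes; // ready to use or forward, no string conversion needed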
I have a database storing binary JPEG images with two different file signatures (FFD8FFE0 and FFD8FFE1) and I would like to convert them into Base64 so I can use them in another application (Power BI). The data is stored as an IMAGE field type; however, I only receive the data in a CSV file, import it into my tool as a string, and work with it from there.
For the file signature FFD8FFE0, I have no problem converting using the below code (from another Stack post - thank you):
public static string ToBase64(String sBinary)
{
int noChars = sBinary.Length;
byte[] bytes = new byte[noChars / 2];
for (int i = 0; i < noChars; i += 2)
{
bytes[i / 2] = Convert.ToByte(sBinary.Substring(i, 2), 16);
}
return Convert.ToBase64String(bytes);
}
However, the file signature FFD8FFE1 is not converting to an image that displays properly. It gives me an output, but it does not display correctly.
Any advice? Is this because of the different file signature OR because of the size of the string (they are noticeably larger).
EDIT: Thank you everyone who assisted. As mentioned in the comments, the real issue was the data I was trying to convert - it was being truncated in the CSV. So for anyone who ever comes across this post, pull directly from SQL and not from a text file, as there is a good chance the data will be truncated.
It sounds like you exported the binary data from an admin tool's query results. Such tools will almost always truncate the displayed binary results to conserve memory.
It's better and easier to read the data directly from the database using ADO.NET or a micro-ORM like Dapper to reduce the boilerplate code.
Using Dapper you could write something as simple as:
var sql="select image from MyTable where Category=#category";
using var connection=new SqlConnection(connectionString);
var images=connection.Query<byte[]>(sql,new {category="Cats"});
And convert it with :
var cats64=images.Select(bytes=>Convert.ToBase64String(bytes));
Dapper will handle opening and closing the connection, so we don't even have to do that.
If you want to retrieve more fields, you can define a class to accept the results. Dapper will map the result columns to properties by name. Once you have a class, you can easily add a method to return the Base64 string:
class MyData
{
public string Name{get;set;}
public byte[] Image {get;set;}
public string ToBase64()=>Convert.ToBase64String(Image);
}
....
var images=connection.Query<MyData>("select Name,Image From ....",...);
Using plain old ADO.NET needs a few more lines:
var sql="select image from MyTable where Category=#category";
using var connection=new SqlConnection(connectionString);
using var cmd=new SqlCommand(sql,connection);
cmd.Parameters.Add("#category",SqlDbType.NVarChar,20).Value="Cats";
using var reader=cmd.ExecuteReader();
while(reader.Read())
{
var image=(byte[])reader["image"];
...
}
With ADO.NET though, it's also possible to load the data as a stream instead of loading everything into memory. This is very helpful when the image is large because we avoid caching the entire blob in memory. We could write the stream to a file directly without first loading it in memory :
while(reader.Read())
{
using (var stream=reader.GetStream(0))
using (var file=File.Create(somePath))
{
stream.CopyTo(file);
}
}
There's no overload of Convert.ToBase64String that works with streams. It's possible to use the CryptoStream class with a Base64 transform to encode an input stream into Base64, as this SO answer shows. Adapting it to this case:
while(reader.Read())
{
using (var stream = reader.GetStream(0))
using (var base64Stream = new CryptoStream( stream, new ToBase64Transform(), CryptoStreamMode.Read ) )
using (var outputFile = File.Create(somePath) )
{
await base64Stream.CopyToAsync( outputFile ).ConfigureAwait(false);
}
}
If you are looking to convert to a Base64 JPEG data URL:
public static string ToBase64JpegUrl (byte[] bytes) =>
$"data:image/jpeg;base64,{Convert.ToBase64String(bytes)}";
You are mapping the results from the database incorrectly; this field should be byte[], not string.
I assume you are receiving a hexadecimal representation of the bytes, which you could convert to bytes then convert to base64 as you attempted (Convert.ToBase64String(bytes)).
Try using EF Code First to read from the table and define the image property as byte[].
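A rough sketch of what that could look like; the entity, context, and table names here are hypothetical, and it assumes EF Core with the SQL Server provider:
using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;

public class ImageRecord
{
    public int Id { get; set; }
    public byte[] Image { get; set; } // maps to the IMAGE / varbinary column
}

public class ImagesContext : DbContext
{
    private readonly string _connectionString;
    public ImagesContext(string connectionString) => _connectionString = connectionString;

    public DbSet<ImageRecord> Images { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlServer(_connectionString);
}

// Usage: load the bytes, then convert to Base64 on the client.
// using var db = new ImagesContext(connectionString);
// var base64Images = db.Images.AsEnumerable()
//     .Select(r => Convert.ToBase64String(r.Image))
//     .ToList();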
Just to clarify something first. I am not trying to convert a byte array to a single string. I am trying to convert a byte-array to a string-array.
I am fetching some data from the clipboard using the GetClipboardData API, and then I'm copying the data from memory as a byte array. When multiple files have been copied (hence a CF_HDROP clipboard format), I want to convert this byte array into a string array of the copied file paths.
Here's my code so far.
//Get pointer to clipboard data in the selected format
var clipboardDataPointer = GetClipboardData(format);
//Do a bunch of crap necessary to copy the data from the memory
//the above pointer points at to a place we can access it.
var length = GlobalSize(clipboardDataPointer);
var @lock = GlobalLock(clipboardDataPointer);
//Init a buffer which will contain the clipboard data
var buffer = new byte[(int)length];
//Copy clipboard data to buffer
Marshal.Copy(@lock, buffer, 0, (int)length);
GlobalUnlock(clipboardDataPointer);
snapshot.InsertData(format, buffer);
Now, here's my code for reading the buffer data afterwards.
var formatter = new BinaryFormatter();
using (var serializedData = new MemoryStream(buffer))
{
paths = (string[]) formatter.Deserialize(serializedData);
}
This won't work, and it'll crash with an exception saying that the stream doesn't contain a binary header. I suppose this is because it doesn't know which type to deserialize into.
I've tried looking the Marshal class through. Nothing seems of any relevance.
If the data came through the Win32 API then a string array will just be a sequence of null-terminated strings with a double-null-terminator at the end. (Note that the strings will be UTF-16, so two bytes per character). You'll basically need to pull the strings out one at a time into an array.
The method you're looking for here is Marshal.PtrToStringUni, which you should use instead of Marshal.Copy since it works on an IntPtr. It will extract a string, up to the first null character, from your IntPtr and copy it to a string.
The idea would be to continually extract a single string, then advance the IntPtr past the null byte to the start of the next string, until you run out of buffer. I have not tested this, and it could probably be improved (in particular I think there's a smarter way to detect the end of the buffer) but the basic idea would be:
var myptr = GetClipboardData(format);
var length = GlobalSize(myptr);
var result = new List<string>();
var pos = 0;
while ( pos < length )
{
var str = Marshal.PtrToStringUni(myptr);
if (string.IsNullOrEmpty(str)) break; // reached the final double-null terminator
var count = Encoding.Unicode.GetByteCount(str);
// Advance past the string plus its two-byte UTF-16 null terminator.
myptr = IntPtr.Add(myptr, count + 2);
pos += count + 2;
result.Add(str);
}
return result.ToArray();
(By the way: the reason your deserialization doesn't work is because serializing a string[] doesn't just write out the characters as bytes; it writes out the structure of a string array, including additional internal bits that .NET uses like the lengths, and a binary header with type information. What you're getting back from the clipboard has none of that present, so it cannot be deserialized.)
How about this:
var strings = Encoding.Unicode
.GetString(buffer)
.Split(new[] { '\0' }, StringSplitOptions.RemoveEmptyEntries);
CString strFile = "c:\\test.txt";
CStdioFile aFile;
UINT nOpenFlags = CFile::modeWrite | CFile::modeCreate | CFile::typeText;
CFileException anError;
if (!aFile.Open(strFile, nOpenFlags, &anError))
{
return false;
}
int nSize = 2*sizeof(double); // size of the two doubles in pData
double* pData = new double[2];
CString strLine, str;
// Write begin of header
strLine = _T(">>> Begin of header <<<\n");
aFile.WriteString(strLine);
// Retrieve current position of file pointer
long lFilePos = (long) aFile.GetPosition();
// Close file
aFile.Close();
nOpenFlags = CFile::modeWrite | CFile::typeBinary;
if (!aFile.Open(strFile, nOpenFlags, &anError))
{
return false;
}
for(int i = 0 ; i < 2 ; i++)
{
pData[i] = i;
}
// Set position of file pointer behind header
aFile.Seek(lFilePos, CFile::begin);
// Write complex vector
aFile.Write(pData, nSize);
// Write complex vector
aFile.Write(pData, nSize);
// Free the data buffer and close the file
delete[] pData;
aFile.Close();
The intention is to create a file which contains both text data and binary data. This code is written in MFC. I want to similarly create a file in C# which contains both text and binary data. Please let me know which stream class should be used for this.
Text can be written as binary data => simply use binary mode for the whole file and be done.
The only thing the text mode does is that it converts "\n" to "\r\n" on write and back on read. Since the file is partly binary and therefore not editable in regular text editor anyway, you don't need that conversion. If the file is just for your application, you just don't care and if it's for another application, just use whatever newline sequence it requires manually.
As to C#, possibly this S.O. article can give you the answer you are looking for.
The C# solution could also guide you in writing something similar in plain C, but I suspect you are on your own there, i.e., you can use generic read/write to file. In C++, you have the possibility of doing formatted input/output from/to streams using operator>> and operator<<.
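For the C# side, a minimal sketch of the same idea (file name, header text, and data are placeholders): open a single FileStream in binary mode, write the text header as raw bytes, and write the doubles through a BinaryWriter.
using System;
using System.IO;
using System.Text;

class MixedFileDemo
{
    static void Main()
    {
        double[] data = { 0.0, 1.0 };

        using (var stream = new FileStream("test.dat", FileMode.Create, FileAccess.Write))
        using (var writer = new BinaryWriter(stream))
        {
            // Text header written as raw bytes (no "\n" -> "\r\n" translation happens here).
            writer.Write(Encoding.ASCII.GetBytes(">>> Begin of header <<<\n"));

            // Binary payload: each double is written as its 8-byte representation.
            foreach (var d in data)
                writer.Write(d);
        }
    }
}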