Hi, I'm trying to convert this C# function to Node.js, but it doesn't work and I don't really know what's wrong. Let me show some code and outputs.
C#:
private static byte[] ConvertMsg(byte[] message, byte type = 255, byte cmd = 255)
{
int msgLength = message.Length;
byte[] bArray = new byte[msgLength + 3];
bArray[0] = type;
bArray[1] = cmd;
Buffer.BlockCopy(message, 0, bArray, 2, msgLength);
bArray[msgLength + 2] = 0;
return bArray;
}
static void Main()
{
byte[] encrypted = ConvertMsg(Encoding.Default.GetBytes("hi"),3,3);
Console.WriteLine($"Encrypted: {Convert.ToBase64String(encrypted)}");
Console.ReadKey();
}
Output:
AwNoaQA=
NodeJS:
function ConvertMsg(message, type=255, cmd=255){
let length = message.length;
let bArray = Buffer.alloc(length+3);
bArray[0] = type;
bArray[1] = cmd;
bArray.copy(message,0,length);
bArray[length + 2] = 0;
return bArray;
}
let encrypted = ConvertMsg(Buffer.from("hi"),3,3);
console.log(encrypted.toString("base64"));
Output:
AwMAAAA=
As you can see, the outputs are not the same. Any help is much appreciated; please explain your answer, as I would like to learn more. Thank you.
According to the Buffer documentation, .copy(target[, targetStart[, sourceStart[, sourceEnd]]])
Copies data from a region of buf to a region in target even if the target memory region overlaps with buf.
Here
// means copy 'bArray' starting from length to 'message' starting from 0
bArray.copy(message, 0, length);
You do not copy the contents of message into bArray. You do the opposite thing: you copy bArray's contents, which are [3, 3, 0, 0, 0] at this point, into message, and actually overwrite your message.
Then you output this bArray, which results in AwMAAAA=, the Base64 representation of [3, 3, 0, 0, 0].
You may want to change your function in this way:
function ConvertMsg(message, type=255, cmd=255){
let length = message.length;
let bArray = Buffer.alloc(length + 3);
bArray[0] = type;
bArray[1] = cmd;
// means copy 'message' starting from 0 to 'bArray' starting from 2
message.copy(bArray, 2);
bArray[length + 2] = 0;
return bArray;
}
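For completeness, here is the fixed function as a self-contained snippet you can run directly in Node (repeating it so the snippet stands on its own). Its Base64 output now matches the C# program's:

```javascript
// Fixed version: source.copy(target, targetStart) copies message INTO bArray.
function ConvertMsg(message, type = 255, cmd = 255) {
  const bArray = Buffer.alloc(message.length + 3);
  bArray[0] = type;
  bArray[1] = cmd;
  message.copy(bArray, 2);         // copy 'message' into 'bArray' starting at index 2
  bArray[message.length + 2] = 0;  // trailing null byte, as in the C# version
  return bArray;
}

const encrypted = ConvertMsg(Buffer.from("hi"), 3, 3);
console.log(encrypted.toString("base64")); // AwNoaQA= — same as the C# output
```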
Related
I am trying to add a certificate extension to my X509Certificate2 object in pure .NET 4.7.2.
I was using BouncyCastle by this method:
private static void AddCdpUrl(X509V3CertificateGenerator certificateGenerator, string cdptUrl)
{
var uriGeneralName = new GeneralName(GeneralName.UniformResourceIdentifier, cdptUrl);
var cdpName = new DistributionPointName(DistributionPointName.FullName, uriGeneralName);
var cdp = new DistributionPoint(cdpName, null, null);
certificateGenerator.AddExtension(X509Extensions.CrlDistributionPoints, false, new CrlDistPoint(new[] { cdp }));
}
And it works, and I get a great result:
Now in pure .NET I am using this method:
const string X509CRLDistributionPoints = "2.5.29.31";
certificateRequest.CertificateExtensions.Add(new X509Extension(new Oid(X509CRLDistributionPoints), Encoding.UTF8.GetBytes("http://crl.example.com"), false));
And I get this result:
I am missing the sequences for "Distribution Point Name", "Full Name" and "URL="
How can I generate the same result with pure .NET that BouncyCastle does?
Thanks
If you only want to write one distribution point, and it's less than or equal to 119 ASCII characters long, and you aren't delegating CRL signing authority to a different certificate:
private static X509Extension MakeCdp(string url)
{
byte[] encodedUrl = Encoding.ASCII.GetBytes(url);
if (encodedUrl.Length > 119)
{
throw new NotSupportedException();
}
byte[] payload = new byte[encodedUrl.Length + 10];
int offset = 0;
payload[offset++] = 0x30;
payload[offset++] = (byte)(encodedUrl.Length + 8);
payload[offset++] = 0x30;
payload[offset++] = (byte)(encodedUrl.Length + 6);
payload[offset++] = 0xA0;
payload[offset++] = (byte)(encodedUrl.Length + 4);
payload[offset++] = 0xA0;
payload[offset++] = (byte)(encodedUrl.Length + 2);
payload[offset++] = 0x86;
payload[offset++] = (byte)(encodedUrl.Length);
Buffer.BlockCopy(encodedUrl, 0, payload, offset, encodedUrl.Length);
return new X509Extension("2.5.29.31", payload, critical: false);
}
Past 119 characters the outer payload length exceeds 0x7F and then you really start wanting a proper DER encoder. You definitely want one for variable numbers of URLs, or including any of the optional data from the extension.
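To see where the 119 comes from: the outer SEQUENCE length here is the URL length plus 8, and DER's single-byte (short-form) length only goes up to 0x7F = 127, so 127 - 8 = 119. A small illustrative sketch of the DER length rule, using plain JavaScript just to show the byte layout (this is not any .NET API):

```javascript
// DER length field: short form (< 0x80) is one byte; long form is
// 0x80 | numLengthBytes, followed by the length in big-endian bytes.
function derLength(len) {
  if (len < 0x80) return Buffer.from([len]);
  const bytes = [];
  for (let v = len; v > 0; v >>= 8) bytes.unshift(v & 0xff);
  return Buffer.from([0x80 | bytes.length, ...bytes]);
}

console.log(derLength(0x45)); // <Buffer 45>       — short form
console.log(derLength(200));  // <Buffer 81 c8>    — long form, one length byte
console.log(derLength(300));  // <Buffer 82 01 2c> — long form, two length bytes
```

Once any nested length crosses 0x7F, every enclosing length shifts too, which is why hand-rolling the bytes stops being practical.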
This is probably a bit late, but you could use BC as a utility to get the DER-encoded extension and import it into native .NET like so:
// req is a .NET Core 3.1 CertificateRequest object
req.CertificateExtensions.Add(
new X509Extension(
new Oid("2.5.29.31"),
crlDistPoint.GetDerEncoded(), // this is your CRL extension
false
)
);
I ran into this problem, and now there is an AsnWriter available (thanks @bartonjs, looks like you got the namespace you wanted :)) if you're on at least .NET 5.
So I've cobbled together a method which creates the extension using the writer:
/// <summary>Derived from https://github.com/dotnet/runtime/blob/main/src/libraries/System.Security.Cryptography/src/System/Security/Cryptography/X509Certificates/Asn1/DistributionPointAsn.xml.cs</summary>
private static X509Extension BuildDistributionPointExtension(string[] fullNames, ReasonFlagsAsn? reasons, string[]? crlIssuers, bool critical) {
var writer = new AsnWriter(AsnEncodingRules.DER);
writer.PushSequence();
writer.PushSequence(Asn1Tag.Sequence);
writer.PushSequence(new Asn1Tag(TagClass.ContextSpecific, 0));
writer.PushSequence(new Asn1Tag(TagClass.ContextSpecific, 0));
//See https://github.com/dotnet/runtime/blob/main/src/libraries/Common/src/System/Security/Cryptography/Asn1/GeneralNameAsn.xml.cs for different value types
for(int i = 0; i < fullNames.Length; i++) writer.WriteCharacterString(UniversalTagNumber.IA5String, fullNames[i], new Asn1Tag(TagClass.ContextSpecific, 6)); //GeneralName 6=URI
writer.PopSequence(new Asn1Tag(TagClass.ContextSpecific, 0));
writer.PopSequence(new Asn1Tag(TagClass.ContextSpecific, 0));
if(reasons.HasValue) writer.WriteNamedBitList(reasons.Value, new Asn1Tag(TagClass.ContextSpecific, 1));
if(crlIssuers?.Length > 0) {
writer.PushSequence(new Asn1Tag(TagClass.ContextSpecific, 2));
for(int i = 0; i < crlIssuers.Length; i++) writer.WriteCharacterString(UniversalTagNumber.IA5String, crlIssuers[i], new Asn1Tag(TagClass.ContextSpecific, 2)); //GeneralName 2=DnsName
writer.PopSequence(new Asn1Tag(TagClass.ContextSpecific, 2));
}
writer.PopSequence(Asn1Tag.Sequence);
writer.PopSequence();
return new X509Extension(new Oid("2.5.29.31"), writer.Encode(), critical);
}
[Flags] internal enum ReasonFlagsAsn { Unused = 1 << 0, KeyCompromise = 1 << 1, CACompromise = 1 << 2, AffiliationChanged = 1 << 3, Superseded = 1 << 4, CessationOfOperation = 1 << 5, CertificateHold = 1 << 6, PrivilegeWithdrawn = 1 << 7, AACompromise = 1 << 8 }
Just be careful and do your own tests as I've only checked if the CRL is properly displayed in a cert viewer.
So, I'm attempting to communicate with a device over a SerialPort object in C#. The device is looking for a mask value to be sent to it as part of a command string. For example, one of the strings will be something like "SETMASK:{}", where {} is the unsigned 8-bit mask.
When I use a terminal (such as BRAY) to communicate with the device, I can get the device to work. For example, in BRAY terminal, the string SETMASK:$FF will set the mask to 0xFF. However, I can't for the life of me figure out how to do this in C#.
I've already tried the following function, where Data is the mask value and Cmd is the surrounding string ("SETMASK:" in this case). Where am I going wrong?
public static string EmbedDataInString(string Cmd, byte Data)
{
byte[] ConvertedToByteArray = new byte[(Cmd.Length * sizeof(char)) + 2];
System.Buffer.BlockCopy(Cmd.ToCharArray(), 0, ConvertedToByteArray, 0, ConvertedToByteArray.Length - 2);
ConvertedToByteArray[ConvertedToByteArray.Length - 2] = Data;
/*Add on null terminator*/
ConvertedToByteArray[ConvertedToByteArray.Length - 1] = (byte)0x00;
Cmd = System.Text.Encoding.Unicode.GetString(ConvertedToByteArray);
return Cmd;
}
Can't be certain, but I'll bet your device is expecting 1-byte chars, but the C# char is 2 bytes. Try converting your string into a byte array with Encoding.ASCII.GetBytes(). You'll probably also need to return the byte[] array instead of a string, since you'll end up converting it back to 2 byte chars.
using System.Text;
// ...
public static byte[] EmbedDataInString(string Cmd, byte Data)
{
byte[] ConvertedToByteArray = new byte[Cmd.Length + 2];
System.Buffer.BlockCopy(Encoding.ASCII.GetBytes(Cmd), 0, ConvertedToByteArray, 0, ConvertedToByteArray.Length - 2);
ConvertedToByteArray[ConvertedToByteArray.Length - 2] = Data;
/*Add on null terminator*/
ConvertedToByteArray[ConvertedToByteArray.Length - 1] = (byte)0x00;
return ConvertedToByteArray;
}
If your device accepts some other character encoding, swap out ASCII for the appropriate one.
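The 1-byte vs 2-byte difference is easy to see directly; for example, in Node.js (purely illustrative, UTF-16LE being the in-memory layout of C# chars):

```javascript
// "SETMASK:" as 1-byte ASCII vs 2-byte UTF-16LE (what a C# char[] holds in memory)
const ascii = Buffer.from("SETMASK:", "ascii");
const utf16 = Buffer.from("SETMASK:", "utf16le");

console.log(ascii.length); // 8  — one byte per character
console.log(utf16.length); // 16 — a zero byte follows each ASCII character
```

Those interleaved zero bytes are exactly what the device chokes on when the string is sent as 2-byte chars.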
Problem solved. Because a C# char is two bytes, the System.Buffer.BlockCopy() call was embedding a zero after each character in the string. This works:
public static byte[] EmbedDataInString(string Cmd, byte Data)
{
byte[] ConvertedToByteArray = new byte[(Cmd.Length * sizeof(byte)) + 3];
char[] Buffer = Cmd.ToCharArray();
for (int i = 0; i < Buffer.Length; i++)
{
ConvertedToByteArray[i] = (byte)Buffer[i];
}
ConvertedToByteArray[ConvertedToByteArray.Length - 3] = Data;
ConvertedToByteArray[ConvertedToByteArray.Length - 2] = (byte)0x0A;
/*Add on null terminator*/
ConvertedToByteArray[ConvertedToByteArray.Length - 1] = (byte)0x00;
return ConvertedToByteArray;
}
I know there are many tutorials out there showing you how to use the "ProcessMemoryReader" functions. But this problems seems to be unique or not solved yet.
For quite a while I've been digging into other people's code to find a way to use multiple offsets.
And I thought that using multiple offsets was the problem for me, but I think it's a problem with the fact that my offset value is bigger than 255.
The game I'm trying to get the memory values from is called "Assault Cube".
As I wasn't sure whether I got the right offset values, I googled what others' results were.
They seem to be exactly the same:
http://cheatengine.org/tables/moreinfo.php?tid=1142 (You can view the .ct file with notepad if you don't have cheat engine installed.)
Here is my code, using the ProcessMemoryReader.cs.
private void timer1_Tick(object sender, EventArgs e)
{
int bytesread;
int pointerbase;
byte[] memory;
Process[] myprocess = Process.GetProcessesByName("ac_client");
if (myprocess.Length != 0)
{
preader.ReadProcess = myprocess[0];
preader.OpenProcess();
//Ammo
memory = preader.ReadProcessMemory((IntPtr)0x4DF73C, 4, out bytesread);
pointerbase = BitConverter.ToInt32(memory, 0);
pointerbase += 0x00; //0 // 14 // 378
byte[] memory1 = preader.ReadProcessMemory((IntPtr)pointerbase, 4, out bytesread);
int pointerbase1 = BitConverter.ToInt32(memory1, 0);
pointerbase1 += 0x14; //0 // 14 // 378
byte[] memory2 = preader.ReadProcessMemory((IntPtr)pointerbase1, 4, out bytesread);
int pointerbase2 = BitConverter.ToInt32(memory2, 0);
pointerbase2 += 0x378; //00 // 14 // 378
byte[] memory3 = preader.ReadProcessMemory((IntPtr)pointerbase2, 4, out bytesread);
int valueis = BitConverter.ToInt32(memory3, 0);
label1.Text = valueis.ToString();
}
Though with a single pointer the process works fine, for example:
//HP
memory = preader.ReadProcessMemory((IntPtr)0x4DF73C, 4, out bytesread);
pointerbase = BitConverter.ToInt32(memory, 0);
pointerbase += 0xf4;
byte[] memory1 = preader.ReadProcessMemory((IntPtr)pointerbase, 4, out bytesread);
int valueis = BitConverter.ToInt32(memory1, 0);
label2.Text = valueis.ToString();
So that works; it's pretty straightforward what's happening there. But I can't figure out how to read the ammo value with the multiple offsets.
I'm not familiar with CheatEngine and its table format, but I do not get the impression it's pointing to the memory addresses that you are using.
You read 4 bytes at 0x4DF73C, which is used as the new memory address for the next read. This is repeated a few times. Basically, you're reading information from a pointer to a pointer to a pointer. Are you sure this is what is intended?
There's no reason whatsoever that an offset value greater than 255 would be a problem.
Use FindDMAAddy to walk the pointer chain for you. Here is a working example; make sure you run as admin:
public static IntPtr FindDMAAddy(IntPtr hProc, IntPtr ptr, int[] offsets)
{
var buffer = new byte[IntPtr.Size];
foreach (int i in offsets)
{
ReadProcessMemory(hProc, ptr, buffer, buffer.Length, out var read);
ptr = (IntPtr.Size == 4)
? IntPtr.Add(new IntPtr(BitConverter.ToInt32(buffer, 0)), i)
: IntPtr.Add(new IntPtr(BitConverter.ToInt64(buffer, 0)), i);
}
return ptr;
}
var modBase = GetModuleBaseAddress(proc.Id, "ac_client.exe");
var ammoAddr = FindDMAAddy(hProc, (IntPtr)(modBase + 0x10f4f4), new int[] { 0x374, 0x14, 0 });
Console.WriteLine("Ammo address " + "0x" + ammoAddr.ToString("X"));
int newAmmo = 1337;
byte[] buffer = new byte[4];
ReadProcessMemory(proc.Handle, ammoAddr, buffer, 4, out _);
Console.WriteLine("Ammo value " + BitConverter.ToInt32(buffer, 0).ToString());
WriteProcessMemory(hProc, ammoAddr, newAmmo, 4, out _);
Hi, I was able to convert an ASCII string to binary (e.g. 10101011) using a BinaryWriter. Now I need to convert the binary back to an ASCII string. Any idea how to do it?
This should do the trick... or at least get you started...
public Byte[] GetBytesFromBinaryString(String binary)
{
var list = new List<Byte>();
for (int i = 0; i < binary.Length; i += 8)
{
String t = binary.Substring(i, 8);
list.Add(Convert.ToByte(t, 2));
}
return list.ToArray();
}
Once the binary string has been converted to a byte array, finish off with
Encoding.ASCII.GetString(data);
So...
var data = GetBytesFromBinaryString("010000010100001001000011");
var text = Encoding.ASCII.GetString(data);
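As an aside, the same two steps (split into 8-bit chunks, convert each chunk from base 2 to a character code) translate directly to other languages; here is the idea in JavaScript, purely as illustration:

```javascript
// Split the bit string into 8-bit chunks and map each to its character code.
function binaryToString(binary) {
  let out = "";
  for (let i = 0; i < binary.length; i += 8) {
    out += String.fromCharCode(parseInt(binary.slice(i, i + 8), 2));
  }
  return out;
}

console.log(binaryToString("010000010100001001000011")); // ABC
```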
If you have ASCII characters only, you could use Encoding.ASCII.GetBytes and Encoding.ASCII.GetString.
var text = "Test";
var bytes = Encoding.ASCII.GetBytes(text);
var newText = Encoding.ASCII.GetString(bytes);
Here is the complete code for your answer:
FileStream iFile = new FileStream(@"c:\test\binary.dat",
FileMode.Open);
long lengthInBytes = iFile.Length;
BinaryReader bin = new BinaryReader(iFile);
byte[] byteArray = bin.ReadBytes((int)lengthInBytes);
System.Text.Encoding encEncoder = System.Text.ASCIIEncoding.ASCII;
string str = encEncoder.GetString(byteArray);
Take this as a simple example:
public void ByteToString()
{
Byte[] arrByte = { 0, 1, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0 };
string x = Convert.ToBase64String(arrByte);
}
This linked answer has interesting details about this kind of conversion:
binary file to string
Sometimes, instead of using the built-in tools, it's better to use "custom" code. Try this function:
public string BinaryToString(string binary)
{
if (string.IsNullOrEmpty(binary))
throw new ArgumentNullException("binary");
if ((binary.Length % 8) != 0)
throw new ArgumentException("Binary string invalid (must divide by 8)", "binary");
StringBuilder builder = new StringBuilder();
for (int i = 0; i < binary.Length; i += 8)
{
string section = binary.Substring(i, 8);
int ascii = 0;
try
{
ascii = Convert.ToInt32(section, 2);
}
catch
{
throw new ArgumentException("Binary string contains invalid section: " + section, "binary");
}
builder.Append((char)ascii);
}
return builder.ToString();
}
Tested with 010000010100001001000011 it returned ABC using the "raw" ASCII values.
I don't know C++ very well, especially the I/O part. Can anyone please help me translate the following C++ code into C#?
unsigned *PostingOffset, *PostingLength, NumTerms;
void LoadSubIndex(char *subindex) {
FILE *in = fopen(subindex, "rb");
if (in == 0) {
printf("Error opening sub-index file '%s'!\n", subindex);
exit(EXIT_FAILURE);
}
int len=0;
// Array of terms
char **Term;
char *TermList;
fread(&NumTerms, sizeof(unsigned), 1, in);
PostingOffset = (unsigned*)malloc(sizeof(unsigned) * NumTerms);
PostingLength = (unsigned*)malloc(sizeof(unsigned) * NumTerms);
Term = (char**)malloc(sizeof(char*) * NumTerms);
Term = (char**)malloc(sizeof(char*) * NumTerms);
// Offset of each posting
fread(PostingOffset, sizeof(unsigned), NumTerms, in);
// Length of each posting in bytes
fread(PostingLength, sizeof(unsigned), NumTerms, in);
// Number of bytes in the posting terms array
fread(&len, sizeof(unsigned), 1, in);
TermList = (char*)malloc(sizeof(char) * len);
fread(TermList, sizeof(unsigned)*len, 1, in);
unsigned k=1;
Term[0] = &TermList[0];
for (int i=1; i<len; i++) {
if (TermList[i-1] == '\0') {
Term[k] = &TermList[i];
k++;
}
}
fclose(in);
}
Thanks in advance.
I'll give you a headstart.
using(var reader = new BinaryReader(new FileStream(subindex, FileMode.Open))) {
int numTerms = reader.ReadInt32();
postingOffset = new UInt32[numTerms];
postingLength = new UInt32[numTerms];
var term = new byte[numTerms];
for(int i=0;i<numTerms;i++)
postingOffset[i] = reader.ReadUInt32();
for(int i=0;i<numTerms;i++)
postingLength[i] = reader.ReadUInt32();
var len = reader.ReadInt32();
var termList = new ... // byte[] or uint32[] ??
//etc
}
There's no need to close the file handle here - it will close when the using { } block loses scope.
I didn't finish it because there are some flaws in your code. With TermList you are reading in 4 times as much data as you've allocated. You shouldn't be allocating Term twice either - that will result in leaking memory.
To turn Term back into a string, use Encoding.ASCII.GetString(term).TrimEnd('\0');
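As a side note, the loop at the end of the C++ (setting Term[k] at each position after a '\0') just splits one buffer of NUL-terminated strings into an array of terms. Here is a sketch of that step, using JavaScript purely as illustration since the C# headstart stops before it:

```javascript
// A buffer holding "cat\0dog\0": two NUL-terminated terms back to back,
// like the TermList block read from the sub-index file.
const termList = Buffer.from("cat\0dog\0", "ascii");

// Equivalent of the C++ loop: each NUL byte ends a term, and the next
// term starts right after it.
function splitTerms(buf) {
  const terms = [];
  let start = 0;
  for (let i = 0; i < buf.length; i++) {
    if (buf[i] === 0) {
      terms.push(buf.toString("ascii", start, i));
      start = i + 1;
    }
  }
  return terms;
}

console.log(splitTerms(termList)); // [ 'cat', 'dog' ]
```

In C# the same effect falls out of Encoding.ASCII.GetString plus a Split('\0') on the decoded term block.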