Replacing byte based on index value in C#

I am trying to write a console application that will take data as input and split it into two values.
For example: if I pass the value 0x00000000A0DB383E as input, my output should look like this:
var LowerValue = 0x00000000A0DB0000 (last 2 bytes 383E (index 14-17) replaced with 0000)
var UpperValue = 0x000000000000383E (middle 2 bytes A0DB (index 10-13) replaced with 0000)
So far I have tried the code below but don't know how to proceed further. Any help will be highly appreciated.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.IO;

namespace SplitFunction
{
    class Program
    {
        static void Main(string[] args)
        {
            // The L suffix makes the literal a long, so GetBytes returns 8 bytes
            byte[] rawValue = BitConverter.GetBytes(0x00000000A0DB383EL);
            SplitData(rawValue);
            Console.ReadKey();
        }

        public static byte[] SplitData(byte[] input)
        {
            byte[] lowerValues = new byte[8];
            Array.Copy(input, 0, lowerValues, 4, 4);
            foreach (var lowerValue in lowerValues)
                Console.WriteLine(lowerValue);
            return lowerValues;
        }
    }
}

Rather than copying and zeroing individual array elements, use masking to create new arrays directly. Something like this:
long input = 0x00000000A0DB383EL;
byte[] rawValue = BitConverter.GetBytes(input);
byte[] lowValue = BitConverter.GetBytes(input & 0x000000000000FFFF);
byte[] highValue = BitConverter.GetBytes(input & 0x00000000FFFF0000);
If you want the bytes ordered from high to low, reverse them. Note that Array.Reverse reverses in place and returns void, so call it after GetBytes:
byte[] rawValue = BitConverter.GetBytes(input);
Array.Reverse(rawValue);
byte[] lowValue = BitConverter.GetBytes(input & 0x000000000000FFFF);
Array.Reverse(lowValue);
byte[] highValue = BitConverter.GetBytes(input & 0x00000000FFFF0000);
Array.Reverse(highValue);
If you simply want the long values rather than arrays:
long lowValue = input & 0x000000000000FFFF;
long highValue = input & 0x00000000FFFF0000;
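Putting the pieces together, a minimal runnable sketch of the masking approach (the Program wrapper is added here for illustration):

```csharp
using System;

class Program
{
    static void Main()
    {
        long input = 0x00000000A0DB383EL;

        // Keep only the low two bytes (0x383E)
        long lowValue = input & 0x000000000000FFFF;
        // Keep only bytes 2-3 (0xA0DB0000)
        long highValue = input & 0x00000000FFFF0000;

        Console.WriteLine("0x{0:X16}", lowValue);  // 0x000000000000383E
        Console.WriteLine("0x{0:X16}", highValue); // 0x00000000A0DB0000
    }
}
```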

Related

Replacing values at a particular offset

I managed to read values from a binary file between two particular offset values, but now I'm stuck: I need to replace all values between two particular offsets.
If the file is not that long, you can try LINQ:
using System.IO;
using System.Linq;
...
string fileName = ...
int offset1 = ...;
int offset2 = ...;
byte[] toInsert = ...
byte[] data = File.ReadAllBytes(fileName);
File.WriteAllBytes(fileName, data
    .Take(offset1)              // bytes in 0..offset1 range
    .Concat(toInsert)           // bytes to insert
    .Concat(data.Skip(offset2)) // bytes in offset2..eof range
    .ToArray());
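The splice can be sanity-checked in memory before touching a real file; the byte values here are made up for illustration:

```csharp
using System;
using System.Linq;

class SpliceDemo
{
    static void Main()
    {
        byte[] data = { 1, 2, 3, 4, 5, 6 };
        byte[] toInsert = { 9, 9 };
        int offset1 = 2, offset2 = 4;

        // Keep [0..offset1), insert the replacement, keep [offset2..end)
        byte[] result = data
            .Take(offset1)
            .Concat(toInsert)
            .Concat(data.Skip(offset2))
            .ToArray();

        Console.WriteLine(string.Join(",", result)); // 1,2,9,9,5,6
    }
}
```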

Want to write to a file converting an integer list of Hex to Char

So I have a list full of integers. These integers are hexadecimals. I would like to convert this list to ASCII Chars. Once that is done I would like to write the ASCII chars to a file. Here is what I have so far:
public byte[] buffer;
public List<int> list = new List<int>(new int[3]);

list[0] = 5445535420; //AKA header[0] represents the hex integers for Test_ where _ is a space
list[1] = 0;          // so the char would be null
list[2] = 4a4153;     // would be JAS

System.IO.FileStream fs;
fs = new FileStream(filename, FileMode.OpenOrCreate);
if (fs.CanWrite)
{
    for (int i = 0; i < list.Count(); i++)
    {
        buffer = Encoding.ASCII.GetBytes(list[i].ToString());
        Convert.ToChar(header[i]);
        fs.Write(buffer, 0, buffer.Length);
    }
}
Would this work for you? Hope the comments are self-explanatory:
int intFromHexLiteral = 0x4a4153;
var hexString = intFromHexLiteral.ToString("X"); // "4A4153"
var hexCharsList = Split(hexString, 2).ToList(); // ["4A", "41", "53"]
var charsArray = hexCharsList
    .Select(hexChar => Convert.ToInt32(hexChar, 16)) // [74, 65, 83]
    .Select(i => (char) i)                           // ['J', 'A', 'S']
    .ToArray();
var word = new string(charsArray); // "JAS"

private static IEnumerable<string> Split(string str, int chunkSize) =>
    Enumerable.Range(0, str.Length / chunkSize)
              .Select(i => str.Substring(i * chunkSize, chunkSize));
An integer does not equal the bytes of the ASCII characters, i.e. 1010 is not 0x1010 in hex. In your case it would make more sense to use a byte[] and write each byte explicitly.
class Program
{
    static void Main(string[] args)
    {
        List<byte[]> list = new List<byte[]>();
        list.Add(new byte[] { 0x54, 0x45, 0x53, 0x54, 0x20 }); //AKA header[0] represents the hex integers for Test_ where _ is a space
        list.Add(new byte[] { 0x0 });                          // so the char would be null
        list.Add(new byte[] { 0x4a, 0x41, 0x53 });             // would be JAS

        foreach (var b in list)
        {
            var chars = Encoding.ASCII.GetChars(b);
            var s = new string(chars);
            Console.WriteLine(s);
        }
    }
}
I see that you've selected an answer, but I wanted to show you this method of solving your problem dealing with your data strictly as numeric data. The values you stick into your List<int> are small enough to fit into a long, so I changed it to a List<long>. If they ever become bigger than that, then this solution would not work.
See how I broke each long element, byte by byte, in reverse and stored the conversion into a StringBuilder before writing it to the screen. In your case, you would write to a file instead, but could use the same conversion method.
using System;
using System.Collections.Generic;
using System.Text;

public class Program
{
    public static void Main(string[] args)
    {
        List<long> list = new List<long>(new long[3]);
        list[0] = 0x5445535420; // AKA header[0] represents the hex integers for Test_ where _ is a space
        list[1] = 0;            // so the char would be null
        list[2] = 0x4a4153;     // would be JAS

        for (int i = 0; i < list.Count; i++)
        {
            StringBuilder sb = new StringBuilder();
            // Break down each element byte by byte in reverse
            while (list[i] > 0)
            {
                // Anding against 0xFF keeps only the least significant byte to convert into a char
                sb.Insert(0, Convert.ToChar(list[i] & 0xFF));
                list[i] >>= 8; // Remove the least significant byte
            }
            Console.WriteLine(sb);
        }
    }
}
Result:
TEST
JAS
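The sample above writes to the console; to put the decoded text in a file as the question asks, the same conversion can feed File.WriteAllText (the output path here is a placeholder):

```csharp
using System;
using System.IO;
using System.Text;

class FileWriteDemo
{
    static void Main()
    {
        long value = 0x4a4153; // decodes to "JAS"
        var sb = new StringBuilder();

        // Same byte-by-byte decode as above, least significant byte first
        while (value > 0)
        {
            sb.Insert(0, Convert.ToChar(value & 0xFF));
            value >>= 8;
        }

        File.WriteAllText("output.txt", sb.ToString()); // "output.txt" is a placeholder name
    }
}
```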

AES 64 key generation

I am trying to generate 64 HEX digits to be used as an AES-256 key, with no success.
Can somebody point out the mistakes and suggest a better way to generate it?
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.RegularExpressions;
using System.Security.Cryptography;

namespace Test
{
    public class Program
    {
        static System.Text.StringBuilder builder = new System.Text.StringBuilder();

        public static void Main(string[] args)
        {
            String randomNumber = Convert.ToBase64String(GenerateRandomNumber(32));
            Console.WriteLine(randomNumber);

            string input = randomNumber;
            char[] values = input.ToCharArray();
            foreach (char letter in values)
            {
                // Get the integral value of the character.
                int value = Convert.ToInt32(letter);
                // Convert the decimal value to a hexadecimal value in string form.
                string hexOutput = String.Format("{0:X}", value);
                // Console.WriteLine("Hexadecimal value of {0} is {1}", letter, hexOutput);
                builder.Append(hexOutput);
            }
            Console.WriteLine(builder);
        }

        public static byte[] GenerateRandomNumber(int length)
        {
            using (var randomNumberGenerator = new RNGCryptoServiceProvider())
            {
                var randomNumber = new byte[length];
                randomNumberGenerator.GetBytes(randomNumber);
                return randomNumber;
            }
        }
    }
}
Your biggest technical problem is that you used {0:X} when you meant {0:X2}. If the value is 10, the former produces "A" and the latter "0A". Since you've lost where all of the interior zeroes are, your number isn't recoverable.
internal static string ByteArrayToHex(this byte[] bytes)
{
    StringBuilder builder = new StringBuilder(bytes.Length * 2);
    foreach (byte b in bytes)
    {
        builder.Append(b.ToString("X2"));
    }
    return builder.ToString();
}
(Code copied from https://github.com/dotnet/corefx/blob/7cad8486cbabbce0236bdf530e30db7036335524/src/Common/tests/System/Security/Cryptography/ByteUtils.cs#L37-L47)
But it's also pretty unclear why you're rerouting through Base64 + ToCharArray + ToInt32. You're replacing values in the 0-255 range (bytes) with values in the [A-Za-z0-9/=+] range set, equivalent to 0-63 (Base64 and all); so a) you wouldn't have a very random key and b) it'll be too long.
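The difference between "X" and "X2" is easy to demonstrate:

```csharp
using System;

class FormatDemo
{
    static void Main()
    {
        byte b = 10;
        Console.WriteLine(b.ToString("X"));  // "A"  -> width information is lost
        Console.WriteLine(b.ToString("X2")); // "0A" -> always two digits per byte
        // With "X", { 0x0A, 0xAB } and { 0xAA, 0x0B } both render as "AAB",
        // so the original bytes can't be recovered; "X2" keeps it reversible.
    }
}
```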
I don't see why you need to convert it to a base64 string first. It could be as simple as this:
public class Program
{
    public static void Main(string[] args)
    {
        var key = GenerateRandomNumber(32);
        var hexEncodedKey = BitConverter.ToString(key).Replace("-", "");
        Console.WriteLine(hexEncodedKey);
    }

    public static byte[] GenerateRandomNumber(int length)
    {
        using (var randomNumberGenerator = RandomNumberGenerator.Create())
        {
            var randomNumber = new byte[length];
            randomNumberGenerator.GetBytes(randomNumber);
            return randomNumber;
        }
    }
}
The .NET Framework already has a method, Aes.GenerateKey(), for generating symmetric keys; please look at the MSDN documentation: Aes class.
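A minimal sketch of that approach, combined with the BitConverter hex conversion shown earlier (KeySize is in bits):

```csharp
using System;
using System.Security.Cryptography;

class AesKeyDemo
{
    static void Main()
    {
        using (var aes = Aes.Create())
        {
            aes.KeySize = 256; // 256 bits = 32 bytes = 64 hex digits
            aes.GenerateKey();
            string hexKey = BitConverter.ToString(aes.Key).Replace("-", "");
            Console.WriteLine(hexKey); // 64 hex characters
        }
    }
}
```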

BinaryReader overwhelming me by padding byte array

So I have this really simple code that reads a file and spits its data out in a hex viewer fashion. Here it is:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;

namespace HexViewer
{
    class Program
    {
        static void Main(string[] args)
        {
            BinaryReader br = new BinaryReader(new FileStream("C:\\dump.bin", FileMode.Open));
            for (int i = 0; i < br.BaseStream.Length; i += 16)
            {
                Console.Write(i.ToString("x") + ": ");
                byte[] data = new byte[16];
                br.Read(data, i, 16);
                Console.WriteLine(BitConverter.ToString(data).Replace("-", " "));
            }
            Console.ReadLine();
        }
    }
}
The problem is that after the first iteration, when I do
br.Read(data, 16, 16);
the byte array is padded by 16 bytes and then filled with data from the 15th byte to the 31st byte of the file. Because it can't fit 32 bytes into a 16-byte array, it throws an exception. You can try this code with any file larger than 16 bytes. So, the question is: what is wrong with this code?
Just change br.Read(data, i, 16); to br.Read(data, 0, 16);
You are reading a new block of data each time, so there is no need to use i as the offset into the data buffer.
Even better, change:
byte[] data = new byte[16];
br.Read(data, 0, 16);
To:
var data = br.ReadBytes(16);
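With that change, the full loop might look like this (same placeholder path as the question; ReadBytes can return fewer than 16 bytes at end-of-file, which BitConverter.ToString handles gracefully):

```csharp
using System;
using System.IO;

class HexViewer
{
    static void Main()
    {
        using (var br = new BinaryReader(new FileStream("C:\\dump.bin", FileMode.Open)))
        {
            for (long i = 0; i < br.BaseStream.Length; i += 16)
            {
                Console.Write(i.ToString("x") + ": ");
                byte[] data = br.ReadBytes(16); // always fills a fresh array from offset 0
                Console.WriteLine(BitConverter.ToString(data).Replace("-", " "));
            }
        }
    }
}
```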

C# System.Security.Cryptography.HMACSHA1.ComputeHash() does not return expected result

I am trying to implement an OTP solution in C# based on RFC 4226: https://www.rfc-editor.org/rfc/rfc4226
I have found a sample implementation and it looks like this:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Security.Cryptography;

namespace OTP
{
    class Program
    {
        static void Main(string[] args)
        {
            System.Text.UTF8Encoding encoding = new System.Text.UTF8Encoding();
            byte[] secretKey = encoding.GetBytes("12345678901234567890");
            byte[] counter = encoding.GetBytes("1");
            Console.WriteLine(CalculateHotp(secretKey, counter));
            Console.ReadKey();
        }

        public static int CalculateHotp(byte[] key, byte[] counter)
        {
            var hmacsha1 = new HMACSHA1(key);
            byte[] hmac_result = hmacsha1.ComputeHash(counter);

            int offset = hmac_result[19] & 0x0f;
            int bin_code = (hmac_result[offset] & 0x7f) << 24
                         | (hmac_result[offset + 1] & 0xff) << 16
                         | (hmac_result[offset + 2] & 0xff) << 8
                         | (hmac_result[offset + 3] & 0xff);
            int hotp = bin_code % 1000000;
            return hotp;
        }
    }
}
The problem is that the call:
byte[] hmac_result = hmacsha1.ComputeHash(counter);
does not return the expected result, and thus the returned OTP will be wrong. Reading the RFC 4226 Appendix D (https://www.rfc-editor.org/rfc/rfc4226#appendix-D), there are some test values to use, and the result won't match them:
From the RFC 4226, Appendix D:
The following test data uses the ASCII string
"12345678901234567890" for the secret:
Secret = 0x3132333435363738393031323334353637383930
Table 1 details for each count, the intermediate HMAC value.
Count    Hexadecimal HMAC-SHA-1(secret, count)
0        cc93cf18508d94934c64b65d8ba7667fb7cde4b0
1        75a48a19d4cbe100644e8ac1397eea747a2d33ab
2        0bacb7fa082fef30782211938bc1c5e70416ff44
3        66c28227d03a2d5529262ff016a1e6ef76557ece
4        a904c900a64b35909874b33e61c5938a8e15ed1c
<snip>
Table 2 details for each count the truncated values (both in
hexadecimal and decimal) and then the HOTP value.
         Truncated
Count    Hexadecimal    Decimal        HOTP
0        4c93cf18       1284755224     755224
1        41397eea       1094287082     287082
2        82fef30        137359152      359152
3        66ef7655       1726969429     969429
4        61c5938a       1640338314     338314
<snip>
Since, in my example above, I use "12345678901234567890" as the key and "1" as the counter, I would expect the result of ComputeHash() to be:
75a48a19d4cbe100644e8ac1397eea747a2d33ab
and the OTP to be:
287082
But I get the OTP:
906627
I really can't see what I'm doing wrong here. Has anyone successfully implemented a counter-based OTP in C# using the HMACSHA1 class?
You use the counter incorrectly. The counter should not be an ASCII string; it should be a numeric (long) value in big-endian byte order.
Use
var counter = new byte[] { 0, 0, 0, 0, 0, 0, 0, 1 };
for this test instead, and your code will return the correct OTP.
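For arbitrary counter values, a small helper (a sketch, not part of the original answer) can produce that big-endian byte[8]:

```csharp
using System;

class HotpCounter
{
    // Convert a numeric counter to the big-endian byte[8] that RFC 4226 expects.
    static byte[] CounterToBytes(long counter)
    {
        byte[] bytes = BitConverter.GetBytes(counter);
        // BitConverter emits little-endian on little-endian platforms; HOTP
        // wants the most significant byte first, so reverse when necessary.
        if (BitConverter.IsLittleEndian)
            Array.Reverse(bytes);
        return bytes;
    }

    static void Main()
    {
        Console.WriteLine(BitConverter.ToString(CounterToBytes(1)));
        // 00-00-00-00-00-00-00-01
    }
}
```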
