Convert PHP encryption code to C#

I'm trying to convert this piece of code from PHP to C#. It's part of a Captive Portal. Could somebody explain what it does?
$hexchal = pack("H32", $challenge);
if ($uamsecret) {
    $newchal = pack("H*", md5($hexchal . $uamsecret));
} else {
    $newchal = $hexchal;
}
$response = md5("\0" . $password . $newchal);
$newpwd = pack("a32", $password);
$pappassword = implode("", unpack("H32", ($newpwd ^ $newchal)));

I also needed PHP's pack/unpack functions in C# but couldn't find a good resource, so I wrote them myself. I verified the functions against the pack/unpack/md5 implementations at onlinephpfunctions.com. I have only implemented what my requirements needed, but this can be extended to other formats.
Pack
private static string pack(string input)
{
    // Only for the "H32" and "H*" formats: hex string -> raw bytes.
    // Note: round-tripping raw bytes through Encoding.Default can corrupt values above 0x7F;
    // working with byte[] directly is safer.
    return Encoding.Default.GetString(FromHex(input));
}

public static byte[] FromHex(string hex)
{
    hex = hex.Replace("-", "");
    byte[] raw = new byte[hex.Length / 2];
    for (int i = 0; i < raw.Length; i++)
    {
        raw[i] = Convert.ToByte(hex.Substring(i * 2, 2), 16);
    }
    return raw;
}
MD5
private static string md5(string input)
{
    byte[] asciiBytes = Encoding.Default.GetBytes(input);
    byte[] hashedBytes = MD5CryptoServiceProvider.Create().ComputeHash(asciiBytes);
    string hashedString = BitConverter.ToString(hashedBytes).Replace("-", "").ToLower();
    return hashedString;
}
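As a quick sanity check of this helper, the snippet below (my own example, not from the original post) uses a well-known MD5 test value:
// md5("hello") is expected to print 5d41402abc4b2a76b9719d911017c592
Console.WriteLine(md5("hello"));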
Unpack
private static string unpack(string p1, string input)
{
    // Only for "H*"-style formats; the format argument (p1) is ignored.
    StringBuilder output = new StringBuilder();
    for (int i = 0; i < input.Length; i++)
    {
        // "x2" pads each value to two lowercase hex digits, matching PHP's unpack("H32", ...).
        string a = Convert.ToInt32(input[i]).ToString("x2");
        output.Append(a);
    }
    return output.ToString();
}

Eduardo,
if you take a look at the pack manual, pack is used to convert a string (hex, octal, binary, ...) into its actual binary representation.
So
$hexchal = pack('H32', $challenge);
would convert a string like 'cca86bc64ec5889345c4c3d8dfc7ade9' into the actual bytes 0xcc 0xa8 ... 0xe9.
If $uamsecret exists, do the same thing with the MD5 of $hexchal concatenated with $uamsecret:
if ($uamsecret) {
    $newchal = pack ("H*", md5($hexchal . $uamsecret));
} else {
    $newchal = $hexchal;
}
$response = md5("\0" . $password . $newchal);
MD5 of '\0' + $password + $newchal.
$newpwd = pack("a32", $password);
pads the password to 32 bytes.
$pappassword = implode ("", unpack("H32", ($newpwd ^ $newchal)));
XORs $newpwd with $newchal and converts the result to a hexadecimal string. I don't get the implode(); since unpack() returns an array, it is probably there to join it back into a single string.
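To make the port concrete, here is a minimal C# sketch of the same steps done directly on byte arrays (which avoids the lossy round-trips through Encoding.Default). The class and helper names are illustrative, and ASCII is assumed for the password and UAM secret; treat it as a starting point rather than a drop-in implementation:
using System;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

static class PapSketch
{
    // pack("H*", ...): hex string -> raw bytes
    static byte[] FromHex(string hex) =>
        Enumerable.Range(0, hex.Length / 2)
                  .Select(i => Convert.ToByte(hex.Substring(i * 2, 2), 16))
                  .ToArray();

    // unpack("H*", ...): raw bytes -> lowercase hex string
    static string ToHex(byte[] bytes) =>
        BitConverter.ToString(bytes).Replace("-", "").ToLower();

    public static (string Response, string PapPassword) Compute(
        string challenge, string uamSecret, string password)
    {
        using var md5 = MD5.Create();

        // $hexchal = pack("H32", $challenge): 16 raw challenge bytes
        byte[] hexChal = FromHex(challenge);

        // $newchal = $uamsecret ? pack("H*", md5($hexchal . $uamsecret)) : $hexchal
        byte[] newChal = string.IsNullOrEmpty(uamSecret)
            ? hexChal
            : md5.ComputeHash(hexChal.Concat(Encoding.ASCII.GetBytes(uamSecret)).ToArray());

        // $response = md5("\0" . $password . $newchal)
        byte[] respInput = new byte[] { 0 }
            .Concat(Encoding.ASCII.GetBytes(password))
            .Concat(newChal)
            .ToArray();
        string response = ToHex(md5.ComputeHash(respInput));

        // $newpwd = pack("a32", $password): pad (or truncate) the password to 32 bytes
        byte[] newPwd = new byte[32];
        byte[] pwdBytes = Encoding.ASCII.GetBytes(password);
        Array.Copy(pwdBytes, newPwd, Math.Min(pwdBytes.Length, 32));

        // ($newpwd ^ $newchal): PHP's string XOR only covers the shorter operand (16 bytes here)
        byte[] xored = new byte[newChal.Length];
        for (int i = 0; i < newChal.Length; i++)
            xored[i] = (byte)(newPwd[i] ^ newChal[i]);
        string papPassword = ToHex(xored);

        return (response, papPassword);
    }
}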

Related

Convert PHP decryption function to C# responds with a different result

I created a service that encrypts and stores keys using PHP. The service returns an encrypted response to the client. The client program then needs to decrypt the encrypted data, but the decrypt function I wrote in PHP gives a different result when converted to C#.
So my PHP function:
<?php
$key = 12;
$string = "rVOlpZOlp6RUa2RhY11UpKeUlZalpFRrpqOnll5TlpKmklRrrVOempWQm5VUa2NdVJKnpZqgpFNsU4FYnaWTnlKKk6Knk6GnVF1UoKSYk5+bq5Olm6CgkKCSn5ZUa1R+e35ziVKEgXeGiHODd1F5g4GGglNeU6GjmZKgmqySppqhn5GWn5KbnVRrVKSnoaKgpKVynpuek6mZo6Gmol+VoJ9TXlOho5mSoJqskqaaoZ+RoZqgoJZUa1SFk6SanJefplF1mqaqVF1UnZuUkZaqoZujl5VUa1RjYmNnXmNiX2NkUWJhbGFia2JhVF1UnZuUl5+llpGUpJaTpZeVVGtUY2JjZF5iYl9iZFNeU56alZagpJeQl6mimqSWllNsU2RhZGZfYmNeZGNUXVSVl6eblJeQnpqfmqZTbGJirl5Tn5alpJOYl1NsU4ugp6NSnZuUl5+lllKcl6pSkpWlm6eTpZeVVK4=";
$result = '';
$string = base64_decode($string);
for ($i = 0, $k = strlen($string); $i < $k; $i++) {
    $char = substr($string, $i, 1);
    $keyChar = substr($key, ($i % strlen($key)) - 1, 1);
    $char = chr(ord($char) - ord($keyChar));
    $result .= $char;
}
echo $result;
?>
It returns:
{"status":201,"success":true,"data":{"lic_id":1,"author":"Author Name","organization_name":"XXXXX XXXXXX GROUP","organization_email":"support#xxxxxx.com","organization_phone":"City/City","lic_expired":"2025-11-22 00:00:00","license_created":"2022-01-12","license_expired":"2025-11-22","device_limit":10},"message":"Your license key activated"}
Now I have converted the above function to C#. The result is strange.
My C# code:
static void Main(string[] args)
{
string key = "rVOlpZOlp6RUa2RhY11UpKeUlZalpFRrpqOnll5TlpKmklRrrVOempWQm5VUa2NdVJKnpZqgpFNsU4FYnaWTnlKKk6Knk6GnVF1UoKSYk5+bq5Olm6CgkKCSn5ZUa1R+e35ziVKEgXeGiHODd1F5g4GGglNeU6GjmZKgmqySppqhn5GWn5KbnVRrVKSnoaKgpKVynpuek6mZo6Gmol+VoJ9TXlOho5mSoJqskqaaoZ+RoZqgoJZUa1SFk6SanJefplF1mqaqVF1UnZuUkZaqoZujl5VUa1RjYmNnXmNiX2NkUWJhbGFia2JhVF1UnZuUl5+llpGUpJaTpZeVVGtUY2JjZF5iYl9iZFNeU56alZagpJeQl6mimqSWllNsU2RhZGZfYmNeZGNUXVSVl6eblJeQnpqfmqZTbGJirl5Tn5alpJOYl1NsU4ugp6NSnZuUl5+lllKcl6pSkpWlm6eTpZeVVK4=";
Console.WriteLine(DecryptIt(key));
}
static string DecryptIt(string key)
{
    /// <summary>
    /// Decrypt the key using a custom algorithm
    /// </summary>
    string keyLength = "12";
    string result = "";
    byte[] data = Convert.FromBase64String(key);
    key = Encoding.UTF8.GetString(data);
    for (int i = 0; i < key.Length; i++)
    {
        int res = (i % keyLength.Length) - 1;
        int res2 = res < 0 ? keyLength.Length + res : res;
        //Console.WriteLine(res2.ToString());
        //Console.WriteLine(key.Length);
        char ch = key.Substring(i, 1).ToCharArray()[0];
        char KeyChar = keyLength.Substring(res2, 1).ToCharArray()[0];
        ch = (char)((byte)ch - KeyChar);
        result += ch.ToString();
    }
    return result;
}
}
It returns:
E"EIEIEI":201,"IEIEIEI":EIEI,"EIEI":E"EIEIEI":1,"IEIEIE":"E'EIEI IEIEIEI","IEIEIEIEIEIEIEIEI":"MIMAI IEFEIAIE GIEIE","EIEIEIEIEIEIEIEIEI":"IEIEIEI#IEIEIEIEIE.EIE","EIEIEIEIEIEIEIEIEI":"IEIEIEIE CIEI","IEIEIEIEIEI":"2025-11-22 00:00:00","IEIEIEIEIEIEIEI":"2022-01-12","EIEIEIEIEIEIEIE":"2025-11-22","IEIEIEIEIEIE":10I,"EIEIEIE":"EIEI IEIEIEI IEI IEIEIEIEI"I
I can't understand where I'm making a mistake. The letters are completely different.
If I do it like this:
ch = (char)((char)ch - (char)KeyChar);
It returns:
?"??????":201,"???????":????,"????":?"??????":1,"??????":"?'???? ???????","?????????????????":"MIMA? ??F??A?E G????","??????????????????":"???????#??????????.???","??????????????????":"???????? C???","???????????":"2025-11-22 00:00:00","???????????????":"2022-01-12","???????????????":"2025-11-22","????????????":10?,"???????":"???? ??????? ??? ?????????"?
Can anyone help to solve this problem?
The line:
key = Encoding.UTF8.GetString(data);
is incorrect; you want to deal with the bytes directly, not convert them to a UTF-8 string.
In fact, the decoded Base64 is not even valid UTF-8.
Working Python port:
#!/usr/bin/env python3
import base64
a = "rVOlpZOlp6RUa2RhY11UpKeUlZalpFRrpqOnll5TlpKmklRrrVOempWQm5VUa2NdVJKnpZqgpFNsU4FYnaWTnlKKk6Knk6GnVF1UoKSYk5+bq5Olm6CgkKCSn5ZUa1R+e35ziVKEgXeGiHODd1F5g4GGglNeU6GjmZKgmqySppqhn5GWn5KbnVRrVKSnoaKgpKVynpuek6mZo6Gmol+VoJ9TXlOho5mSoJqskqaaoZ+RoZqgoJZUa1SFk6SanJefplF1mqaqVF1UnZuUkZaqoZujl5VUa1RjYmNnXmNiX2NkUWJhbGFia2JhVF1UnZuUl5+llpGUpJaTpZeVVGtUY2JjZF5iYl9iZFNeU56alZagpJeQl6mimqSWllNsU2RhZGZfYmNeZGNUXVSVl6eblJeQnpqfmqZTbGJirl5Tn5alpJOYl1NsU4ugp6NSnZuUl5+lllKcl6pSkpWlm6eTpZeVVK4="
s = base64.b64decode(a)
key = b'12'
result = ''
for i in range(len(s)):
    char = s[i]
    pos = (i % len(key)) - 1
    if pos < 0:
        pos += len(key)
    keychar = key[pos]
    result += chr(char - keychar)
print(result)
UPDATE: C#
using System;
using System.Text;

class Untitled
{
    static void Main(string[] args)
    {
string key = "rVOlpZOlp6RUa2RhY11UpKeUlZalpFRrpqOnll5TlpKmklRrrVOempWQm5VUa2NdVJKnpZqgpFNsU4FYnaWTnlKKk6Knk6GnVF1UoKSYk5+bq5Olm6CgkKCSn5ZUa1R+e35ziVKEgXeGiHODd1F5g4GGglNeU6GjmZKgmqySppqhn5GWn5KbnVRrVKSnoaKgpKVynpuek6mZo6Gmol+VoJ9TXlOho5mSoJqskqaaoZ+RoZqgoJZUa1SFk6SanJefplF1mqaqVF1UnZuUkZaqoZujl5VUa1RjYmNnXmNiX2NkUWJhbGFia2JhVF1UnZuUl5+llpGUpJaTpZeVVGtUY2JjZF5iYl9iZFNeU56alZagpJeQl6mimqSWllNsU2RhZGZfYmNeZGNUXVSVl6eblJeQnpqfmqZTbGJirl5Tn5alpJOYl1NsU4ugp6NSnZuUl5+lllKcl6pSkpWlm6eTpZeVVK4=";
        Console.WriteLine(DecryptIt(key));
    }

    static string DecryptIt(string key)
    {
        /// <summary>
        /// Decrypt the key using a custom algorithm
        /// </summary>
        byte[] data = Convert.FromBase64String(key);
        byte[] keyLength = new byte[] { 0x31, 0x32 }; // "12"
        byte[] result = new byte[data.Length];
        for (int i = 0; i < data.Length; i++)
        {
            int res = (i % keyLength.Length) - 1;
            int res2 = res < 0 ? keyLength.Length + res : res;
            byte ch = data[i];
            byte KeyChar = keyLength[res2];
            ch = (byte)(ch - KeyChar);
            result[i] = ch;
        }
        return Encoding.UTF8.GetString(result);
    }
}
Ideone

Convert a hex string to base64

byte[] ba = Encoding.Default.GetBytes(input);
var hexString = BitConverter.ToString(ba);
hexString = hexString.Replace("-", "");
Console.WriteLine("Or: " + hexString + " in hexadecimal");
So I got this, now how would I convert hexString to a base64 string?
I tried this, got the error:
Cannot convert from string to byte[]
If that solution works for anyone else, what am I doing wrong?
edit:
var plainTextBytes = System.Text.Encoding.UTF8.GetBytes(plainText);
return System.Convert.ToBase64String(plainTextBytes);
I tried this, but it returns "Cannot implicitly convert type 'byte[]' to 'string'" on the first line, then "Argument 1: cannot convert from 'string' to 'byte[]'".
You first need to convert your hex string to a byte array, which you can then convert to Base64.
To convert from your hex string to Base64, you can use:
public static string HexString2B64String(this string input)
{
    return System.Convert.ToBase64String(input.HexStringToHex());
}
Where HexStringToHex is:
public static byte[] HexStringToHex(this string inputHex)
{
    var resultantArray = new byte[inputHex.Length / 2];
    for (var i = 0; i < resultantArray.Length; i++)
    {
        resultantArray[i] = System.Convert.ToByte(inputHex.Substring(i * 2, 2), 16);
    }
    return resultantArray;
}
Since .NET 5 it can be done using the standard library only:
string HexStringToBase64String(string hexString)
{
    // hex string is converted to a byte array
    byte[] stringBytes = System.Convert.FromHexString(hexString);
    // byte array is converted to a Base64 string
    string res = System.Convert.ToBase64String(stringBytes);
    return res;
}
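A quick usage example (my own input, not from the question): the hex string "48656c6c6f" is the ASCII bytes of "Hello", so the result should be its Base64 form:
// "48656c6c6f" -> bytes of "Hello" -> Base64
Console.WriteLine(HexStringToBase64String("48656c6c6f")); // SGVsbG8=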
There are also good examples in the docs.
public string HexToBase64(string strInput)
{
    try
    {
        var bytes = new byte[strInput.Length / 2];
        for (var i = 0; i < bytes.Length; i++)
        {
            bytes[i] = Convert.ToByte(strInput.Substring(i * 2, 2), 16);
        }
        return Convert.ToBase64String(bytes);
    }
    catch (Exception)
    {
        return "-1";
    }
}
For the opposite direction, see: https://stackoverflow.com/a/61224900/3988122

Convert hex code to text in C# - hex string different from the other page

I am working on network conversion (unicode) code, but the results are not what I want.
For reference, this is what I want to achieve: http://www.unit-conversion.info/texttools/hexadecimal/
E.g.
Input "E5BC B5E6 9F8F E6A6 86", received "張柏榆" <-----this is what i need
But I use the following reference code
public static string ConvertStringToHex(String input, System.Text.Encoding encoding)
{
    Byte[] stringBytes = encoding.GetBytes(input);
    StringBuilder sbBytes = new StringBuilder(stringBytes.Length * 2);
    foreach (byte b in stringBytes)
    {
        sbBytes.AppendFormat("{0:X2}", b);
    }
    return sbBytes.ToString();
}
I get hex string "355F CF67 8669"
It does not convert the hex code into "張柏榆".
public static string ConvertHexToString(String hexInput, System.Text.Encoding encoding)
{
    int numberChars = hexInput.Length;
    byte[] bytes = new byte[numberChars / 2];
    for (int i = 0; i < numberChars; i += 2)
    {
        bytes[i / 2] = Convert.ToByte(hexInput.Substring(i, 2), 16);
    }
    return encoding.GetString(bytes);
}
Any advice would be appreciated.
I tried your function, and it did give an error while trying to convert. Weirdly, when I tried with the string "E5BCB5E69F8FE6A686" (your string without the spaces), it worked.
You could modify your code to strip out the spaces automatically; I also added a line to remove any "-" signs (in case they are included):
public static string ConvertHexToString(String hexInput, System.Text.Encoding encoding)
{
    hexInput = hexInput.Replace(" ", "").Replace("-", ""); // edited to not declare a new string, as suggested by Clonkex in the comments
    int numberChars = hexInput.Length;
    byte[] bytes = new byte[numberChars / 2];
    for (int i = 0; i < numberChars; i += 2)
    {
        bytes[i / 2] = Convert.ToByte(hexInput.Substring(i, 2), 16);
    }
    return encoding.GetString(bytes);
}
You only need to use System.Text.Encoding.UTF8:
string temp = ConvertStringToHex("張柏榆", System.Text.Encoding.UTF8);
string temp1 = ConvertHexToString(temp, System.Text.Encoding.UTF8);
You can use this. I hope it will work for you.
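To tie this back to the example from the question (using the space-stripping ConvertHexToString above), the hex string decodes as UTF-8 to the expected text:
// The hex from the question, with spaces removed, is the UTF-8 encoding of "張柏榆".
string name = ConvertHexToString("E5BCB5E69F8FE6A686", System.Text.Encoding.UTF8);
Console.WriteLine(name); // 張柏榆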

Android base64 hash mismatch with server side hash using C# script

I am creating a Base64 hash using HMAC SHA256 in my Android application and sending it to the server to match against the server-side hash.
Following this tutorial.
Working Android code:
public String getHash(String data, String key)
{
    try
    {
        String secret = key;
        String message = data;
        Mac sha256_HMAC = Mac.getInstance("HmacMD5");
        SecretKeySpec secret_key = new SecretKeySpec(secret.getBytes(), "HmacMD5");
        sha256_HMAC.init(secret_key);
        String hash = Base64.encodeBase64String(sha256_HMAC.doFinal(message.getBytes()));
        System.out.println(hash);
        return hash;
    }
    catch (Exception e)
    {
        System.out.println("Error");
        return null; // added so the method compiles when the HMAC fails
    }
}
The server code is in C# and is as below:
using System.Security.Cryptography;

namespace Test
{
    public class MyHmac
    {
        private string CreateToken(string message, string secret)
        {
            secret = secret ?? "";
            var encoding = new System.Text.ASCIIEncoding();
            byte[] keyByte = encoding.GetBytes(secret);
            byte[] messageBytes = encoding.GetBytes(message);
            using (var hmacsha256 = new HMACSHA256(keyByte))
            {
                byte[] hashmessage = hmacsha256.ComputeHash(messageBytes);
                return Convert.ToBase64String(hashmessage);
            }
        }
    }
}
But the hash generated on the Android side does not match the server side. Below is Objective-C code which generates the same hash as the C# code.
Objective-C code:
#import "AppDelegate.h"
#import <CommonCrypto/CommonHMAC.h>
#implementation AppDelegate
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
NSString* key = #"secret";
NSString* data = #"Message";
const char *cKey = [key cStringUsingEncoding:NSASCIIStringEncoding];
const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
NSData *hash = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
NSLog(#"%#", hash);
NSString* s = [AppDelegate base64forData:hash];
NSLog(s);
}
+ (NSString*)base64forData:(NSData*)theData
{
const uint8_t* input = (const uint8_t*)[theData bytes];
NSInteger length = [theData length];
static char table[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";
NSMutableData* data = [NSMutableData dataWithLength:((length + 2) / 3) * 4];
uint8_t* output = (uint8_t*)data.mutableBytes;
NSInteger i;
for (i=0; i < length; i += 3) {
NSInteger value = 0;
NSInteger j;
for (j = i; j < (i + 3); j++) {
value <<= 8;
if (j < length) { value |= (0xFF & input[j]);
}
}
NSInteger theIndex = (i / 3) * 4; output[theIndex + 0] = table[(value >> 18) & 0x3F];
output[theIndex + 1] = table[(value >> 12) & 0x3F];
output[theIndex + 2] = (i + 1) < length ? table[(value >> 6) & 0x3F] : '=';
output[theIndex + 3] = (i + 2) < length ? table[(value >> 0) & 0x3F] : '=';
}
return [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
}
#end
Please help me solve this issue.
Thanks in advance.
I have solved this issue by changing HmacSHA256 to HmacMD5, and it gives the same hash value as the C# code.
I have updated my question with the working code; check it.
I suspect this is an encoding issue.
In one sample you specify the string should be encoded using ASCII when converting the string to a byte array. In the other sample you do not specify an encoding.
If the default encoding is anything other than ASCII that means the byte arrays will be different, leading to different hash results.
On Android, secret.getBytes() may return UTF-16 bytes; check the length of the result. In general, separate such calls into separate statements for easier debugging.
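For a quick comparison, here is a minimal C# console sketch with the encoding made explicit (key "secret", message "Message", as in the Objective-C sample below); it should print the same Base64 value shown in that sample's output:
using System;
using System.Security.Cryptography;
using System.Text;

class HmacCheck
{
    static void Main()
    {
        // Same key/message as the Objective-C sample; the encoding is spelled out on purpose.
        byte[] key = Encoding.ASCII.GetBytes("secret");
        byte[] message = Encoding.ASCII.GetBytes("Message");
        using (var hmac = new HMACSHA256(key))
        {
            string token = Convert.ToBase64String(hmac.ComputeHash(message));
            Console.WriteLine(token); // expected: qnR8UCqJggD55PohusaBNviGoOJ67HC6Btry4qXLVZc=
        }
    }
}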
Not the answer, rather a demonstration of a simpler Obj-C implementation that provides the hash and Base64 values:
NSString* key = #"secret";
NSString* data = #"Message";
NSData *keyData = [key dataUsingEncoding:NSASCIIStringEncoding];
NSData *dataData = [data dataUsingEncoding:NSASCIIStringEncoding];
NSMutableData *hash = [NSMutableData dataWithLength:CC_SHA256_DIGEST_LENGTH];
CCHmac(kCCHmacAlgSHA256, keyData.bytes, keyData.length , dataData.bytes, dataData.length, hash.mutableBytes);
NSLog(#"hash: %#", hash);
NSString* s = [hash base64EncodedStringWithOptions:0];
NSLog(#"s: %#", s);
Output:
hash: <aa747c50 2a898200 f9e4fa21 bac68136 f886a0e2 7aec70ba 06daf2e2 a5cb5597>
s: qnR8UCqJggD55PohusaBNviGoOJ67HC6Btry4qXLVZc=

Binary To Corresponding ASCII String Conversion

Hi, I was able to convert an ASCII string to binary using a BinaryWriter, e.g. 10101011. Now I need to convert the binary back to an ASCII string. Any idea how to do it?
This should do the trick... or at least get you started...
public Byte[] GetBytesFromBinaryString(String binary)
{
    var list = new List<Byte>();
    for (int i = 0; i < binary.Length; i += 8)
    {
        String t = binary.Substring(i, 8);
        list.Add(Convert.ToByte(t, 2));
    }
    return list.ToArray();
}
Once the binary string has been converted to a byte array, finish off with
Encoding.ASCII.GetString(data);
So...
var data = GetBytesFromBinaryString("010000010100001001000011");
var text = Encoding.ASCII.GetString(data);
If you have ASCII characters only, you could use Encoding.ASCII.GetBytes and Encoding.ASCII.GetString.
var text = "Test";
var bytes = Encoding.ASCII.GetBytes(text);
var newText = Encoding.ASCII.GetString(bytes);
Here is the complete code for your answer:
FileStream iFile = new FileStream(@"c:\test\binary.dat", FileMode.Open);
long lengthInBytes = iFile.Length;
BinaryReader bin = new BinaryReader(iFile);
byte[] byteArray = bin.ReadBytes((int)lengthInBytes);
System.Text.Encoding encEncoder = System.Text.ASCIIEncoding.ASCII;
string str = encEncoder.GetString(byteArray);
Take this as a simple example:
public void ByteToString()
{
    Byte[] arrByte = { 0, 1, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0 };
    string x = Convert.ToBase64String(arrByte);
}
This linked answer has interesting details about this kind of conversion:
binary file to string
Sometimes, instead of using the built-in tools, it's better to use "custom" code. Try this function:
public string BinaryToString(string binary)
{
    if (string.IsNullOrEmpty(binary))
        throw new ArgumentNullException("binary");
    if ((binary.Length % 8) != 0)
        throw new ArgumentException("Binary string invalid (must divide by 8)", "binary");

    StringBuilder builder = new StringBuilder();
    for (int i = 0; i < binary.Length; i += 8)
    {
        string section = binary.Substring(i, 8);
        int ascii = 0;
        try
        {
            ascii = Convert.ToInt32(section, 2);
        }
        catch
        {
            throw new ArgumentException("Binary string contains invalid section: " + section, "binary");
        }
        builder.Append((char)ascii);
    }
    return builder.ToString();
}
Tested with 010000010100001001000011 it returned ABC using the "raw" ASCII values.
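For reference, a minimal usage sketch with the same input/output pair mentioned above:
string text = BinaryToString("010000010100001001000011");
Console.WriteLine(text); // ABC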
