Android base64 hash mismatch with server side hash using C# script - c#

I am creating a Base64-encoded hash using HMAC SHA256 in my Android application and sending it to the server to be matched against the server-side hash.
I am following this tutorial.
Working Android code:
public String getHash(String data, String key)
{
    try
    {
        String secret = key;
        String message = data;
        Mac sha256_HMAC = Mac.getInstance("HmacMD5");
        SecretKeySpec secret_key = new SecretKeySpec(secret.getBytes(), "HmacMD5");
        sha256_HMAC.init(secret_key);
        String hash = Base64.encodeBase64String(sha256_HMAC.doFinal(message.getBytes()));
        System.out.println(hash);
        return hash;
    }
    catch (Exception e)
    {
        System.out.println("Error");
        return null; // every path must return a value for this to compile
    }
}
The server code is a C# script, as shown below:
using System.Security.Cryptography;
namespace Test
{
    public class MyHmac
    {
        private string CreateToken(string message, string secret)
        {
            secret = secret ?? "";
            var encoding = new System.Text.ASCIIEncoding();
            byte[] keyByte = encoding.GetBytes(secret);
            byte[] messageBytes = encoding.GetBytes(message);
            using (var hmacsha256 = new HMACSHA256(keyByte))
            {
                byte[] hashmessage = hmacsha256.ComputeHash(messageBytes);
                return Convert.ToBase64String(hashmessage);
            }
        }
    }
}
But the hash generated on the Android side does not match the server side. Below is Objective-C code that generates the same hash as the C# code.
Objective-C code:
#import "AppDelegate.h"
#import <CommonCrypto/CommonHMAC.h>
#implementation AppDelegate
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
NSString* key = #"secret";
NSString* data = #"Message";
const char *cKey = [key cStringUsingEncoding:NSASCIIStringEncoding];
const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
unsigned char cHMAC[CC_SHA256_DIGEST_LENGTH];
CCHmac(kCCHmacAlgSHA256, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
NSData *hash = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
NSLog(#"%#", hash);
NSString* s = [AppDelegate base64forData:hash];
NSLog(s);
}
+ (NSString*)base64forData:(NSData*)theData
{
    const uint8_t* input = (const uint8_t*)[theData bytes];
    NSInteger length = [theData length];
    static char table[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";
    NSMutableData* data = [NSMutableData dataWithLength:((length + 2) / 3) * 4];
    uint8_t* output = (uint8_t*)data.mutableBytes;
    NSInteger i;
    for (i = 0; i < length; i += 3) {
        NSInteger value = 0;
        NSInteger j;
        for (j = i; j < (i + 3); j++) {
            value <<= 8;
            if (j < length) {
                value |= (0xFF & input[j]);
            }
        }
        NSInteger theIndex = (i / 3) * 4;
        output[theIndex + 0] = table[(value >> 18) & 0x3F];
        output[theIndex + 1] = table[(value >> 12) & 0x3F];
        output[theIndex + 2] = (i + 1) < length ? table[(value >> 6) & 0x3F] : '=';
        output[theIndex + 3] = (i + 2) < length ? table[(value >> 0) & 0x3F] : '=';
    }
    return [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
}
@end
Please help me solve this issue. Thanks in advance.
I have solved this issue by changing HmacSHA256 to HmacMD5, and it now gives the same hash value as the C# code.
I have updated my question with the working code; check it.

I suspect this is an encoding issue.
In one sample you specify the string should be encoded using ASCII when converting the string to a byte array. In the other sample you do not specify an encoding.
If the default encoding is anything other than ASCII that means the byte arrays will be different, leading to different hash results.
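To make the point concrete, here is a minimal C# sketch (my addition, not from the original post) showing that the same inputs HMAC'd under two different encodings produce different hashes; the Android side must use the same charset as the server:

using System;
using System.Security.Cryptography;
using System.Text;

class EncodingMismatchDemo
{
    static string Hmac(Encoding enc, string message, string secret)
    {
        using (var hmac = new HMACSHA256(enc.GetBytes(secret)))
        {
            return Convert.ToBase64String(hmac.ComputeHash(enc.GetBytes(message)));
        }
    }

    static void Main()
    {
        // ASCII matches the server code from the question.
        Console.WriteLine(Hmac(Encoding.ASCII, "Message", "secret"));
        // UTF-16 ("Unicode") produces different key/message bytes, so a different hash.
        Console.WriteLine(Hmac(Encoding.Unicode, "Message", "secret"));
    }
}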

On Android, secret.getBytes() may return UTF-16 bytes depending on the default charset; check the length of the result. In general, separate such calls into individual statements for easier debugging.
Not the answer, but rather a demonstration of a simpler Obj-C implementation that provides the hash and Base64 values:
NSString* key = @"secret";
NSString* data = @"Message";
NSData *keyData = [key dataUsingEncoding:NSASCIIStringEncoding];
NSData *dataData = [data dataUsingEncoding:NSASCIIStringEncoding];
NSMutableData *hash = [NSMutableData dataWithLength:CC_SHA256_DIGEST_LENGTH];
CCHmac(kCCHmacAlgSHA256, keyData.bytes, keyData.length, dataData.bytes, dataData.length, hash.mutableBytes);
NSLog(@"hash: %@", hash);
NSString* s = [hash base64EncodedStringWithOptions:0];
NSLog(@"s: %@", s);
Output:
hash: <aa747c50 2a898200 f9e4fa21 bac68136 f886a0e2 7aec70ba 06daf2e2 a5cb5597>
s: qnR8UCqJggD55PohusaBNviGoOJ67HC6Btry4qXLVZc=
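For comparison, here is a C# sketch of the server-side computation from the question for the same inputs; with key "secret" and message "Message" it should produce the Base64 value shown above:

using System;
using System.Security.Cryptography;
using System.Text;

class ServerSideCheck
{
    static void Main()
    {
        var encoding = new ASCIIEncoding();
        using (var hmacsha256 = new HMACSHA256(encoding.GetBytes("secret")))
        {
            byte[] hash = hmacsha256.ComputeHash(encoding.GetBytes("Message"));
            // Expected: qnR8UCqJggD55PohusaBNviGoOJ67HC6Btry4qXLVZc=
            Console.WriteLine(Convert.ToBase64String(hash));
        }
    }
}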

Related

Convert php decryption function to C# responds differently result

I created a service that encrypts and stores keys using PHP. The service returns an encrypted response to the client. The client program then needs to decrypt the encrypted data. But the decrypt function I wrote in PHP gives a different result in C#.
So my PHP function:
<?php
$key = 12;
$string = "rVOlpZOlp6RUa2RhY11UpKeUlZalpFRrpqOnll5TlpKmklRrrVOempWQm5VUa2NdVJKnpZqgpFNsU4FYnaWTnlKKk6Knk6GnVF1UoKSYk5+bq5Olm6CgkKCSn5ZUa1R+e35ziVKEgXeGiHODd1F5g4GGglNeU6GjmZKgmqySppqhn5GWn5KbnVRrVKSnoaKgpKVynpuek6mZo6Gmol+VoJ9TXlOho5mSoJqskqaaoZ+RoZqgoJZUa1SFk6SanJefplF1mqaqVF1UnZuUkZaqoZujl5VUa1RjYmNnXmNiX2NkUWJhbGFia2JhVF1UnZuUl5+llpGUpJaTpZeVVGtUY2JjZF5iYl9iZFNeU56alZagpJeQl6mimqSWllNsU2RhZGZfYmNeZGNUXVSVl6eblJeQnpqfmqZTbGJirl5Tn5alpJOYl1NsU4ugp6NSnZuUl5+lllKcl6pSkpWlm6eTpZeVVK4=";
$result = '';
$string = base64_decode($string);
for ($i = 0, $k = strlen($string); $i < $k; $i++) {
    $char = substr($string, $i, 1);
    $keyChar = substr($key, ($i % strlen($key)) - 1, 1);
    $char = chr(ord($char) - ord($keyChar));
    $result .= $char;
}
echo $result;
?>
It returns:
{"status":201,"success":true,"data":{"lic_id":1,"author":"Author Name","organization_name":"XXXXX XXXXXX GROUP","organization_email":"support#xxxxxx.com","organization_phone":"City/City","lic_expired":"2025-11-22 00:00:00","license_created":"2022-01-12","license_expired":"2025-11-22","device_limit":10},"message":"Your license key activated"}
Now I have converted the above function to C#. The result is strange.
My C# code:
static void Main(string[] args)
{
string key = "rVOlpZOlp6RUa2RhY11UpKeUlZalpFRrpqOnll5TlpKmklRrrVOempWQm5VUa2NdVJKnpZqgpFNsU4FYnaWTnlKKk6Knk6GnVF1UoKSYk5+bq5Olm6CgkKCSn5ZUa1R+e35ziVKEgXeGiHODd1F5g4GGglNeU6GjmZKgmqySppqhn5GWn5KbnVRrVKSnoaKgpKVynpuek6mZo6Gmol+VoJ9TXlOho5mSoJqskqaaoZ+RoZqgoJZUa1SFk6SanJefplF1mqaqVF1UnZuUkZaqoZujl5VUa1RjYmNnXmNiX2NkUWJhbGFia2JhVF1UnZuUl5+llpGUpJaTpZeVVGtUY2JjZF5iYl9iZFNeU56alZagpJeQl6mimqSWllNsU2RhZGZfYmNeZGNUXVSVl6eblJeQnpqfmqZTbGJirl5Tn5alpJOYl1NsU4ugp6NSnZuUl5+lllKcl6pSkpWlm6eTpZeVVK4=";
Console.WriteLine(DecryptIt(key));
}
static string DecryptIt(string key)
{
    /// <summary>
    /// Decrypt key using custom algorithm
    /// </summary>
    string keyLength = "12";
    string result = "";
    byte[] data = Convert.FromBase64String(key);
    key = Encoding.UTF8.GetString(data);
    for (int i = 0; i < key.Length; i++)
    {
        int res = (i % keyLength.Length) - 1;
        int res2 = res < 0 ? keyLength.Length + res : res;
        //Console.WriteLine(res2.ToString());
        //Console.WriteLine(key.Length);
        char ch = key.Substring(i, 1).ToCharArray()[0];
        char KeyChar = keyLength.Substring(res2, 1).ToCharArray()[0];
        ch = (char)((byte)ch - KeyChar);
        result += ch.ToString();
    }
    return result;
}
}
It returns:
E"EIEIEI":201,"IEIEIEI":EIEI,"EIEI":E"EIEIEI":1,"IEIEIE":"E'EIEI IEIEIEI","IEIEIEIEIEIEIEIEI":"MIMAI IEFEIAIE GIEIE","EIEIEIEIEIEIEIEIEI":"IEIEIEI#IEIEIEIEIE.EIE","EIEIEIEIEIEIEIEIEI":"IEIEIEIE CIEI","IEIEIEIEIEI":"2025-11-22 00:00:00","IEIEIEIEIEIEIEI":"2022-01-12","EIEIEIEIEIEIEIE":"2025-11-22","IEIEIEIEIEIE":10I,"EIEIEIE":"EIEI IEIEIEI IEI IEIEIEIEI"I
I can't understand where I'm making a mistake. The letters are completely different.
If I do it like this:
ch = (char)((char)ch - (char)KeyChar);
It returns:
?"??????":201,"???????":????,"????":?"??????":1,"??????":"?'???? ???????","?????????????????":"MIMA? ??F??A?E G????","??????????????????":"???????#??????????.???","??????????????????":"???????? C???","???????????":"2025-11-22 00:00:00","???????????????":"2022-01-12","???????????????":"2025-11-22","????????????":10?,"???????":"???? ??????? ??? ?????????"?
Can anyone help to solve this problem?
The line:
key = Encoding.UTF8.GetString(data);
is incorrect, you want to deal with bytes directly, not converting it to a utf-8 string.
In fact, the decoded base64 is not even a valid UTF-8 string.
Working Python port:
#!/usr/bin/env python3
import base64
a = "rVOlpZOlp6RUa2RhY11UpKeUlZalpFRrpqOnll5TlpKmklRrrVOempWQm5VUa2NdVJKnpZqgpFNsU4FYnaWTnlKKk6Knk6GnVF1UoKSYk5+bq5Olm6CgkKCSn5ZUa1R+e35ziVKEgXeGiHODd1F5g4GGglNeU6GjmZKgmqySppqhn5GWn5KbnVRrVKSnoaKgpKVynpuek6mZo6Gmol+VoJ9TXlOho5mSoJqskqaaoZ+RoZqgoJZUa1SFk6SanJefplF1mqaqVF1UnZuUkZaqoZujl5VUa1RjYmNnXmNiX2NkUWJhbGFia2JhVF1UnZuUl5+llpGUpJaTpZeVVGtUY2JjZF5iYl9iZFNeU56alZagpJeQl6mimqSWllNsU2RhZGZfYmNeZGNUXVSVl6eblJeQnpqfmqZTbGJirl5Tn5alpJOYl1NsU4ugp6NSnZuUl5+lllKcl6pSkpWlm6eTpZeVVK4="
s = base64.b64decode(a)
key = b'12'
result = ''
for i in range(len(s)):
    char = s[i]
    pos = (i % len(key)) - 1
    if pos < 0:
        pos += len(key)
    keychar = key[pos]
    result += chr(char - keychar)
print(result)
UPDATE: C#
using System;
using System.Text;
class Untitled
{
static void Main(string[] args)
{
string key = "rVOlpZOlp6RUa2RhY11UpKeUlZalpFRrpqOnll5TlpKmklRrrVOempWQm5VUa2NdVJKnpZqgpFNsU4FYnaWTnlKKk6Knk6GnVF1UoKSYk5+bq5Olm6CgkKCSn5ZUa1R+e35ziVKEgXeGiHODd1F5g4GGglNeU6GjmZKgmqySppqhn5GWn5KbnVRrVKSnoaKgpKVynpuek6mZo6Gmol+VoJ9TXlOho5mSoJqskqaaoZ+RoZqgoJZUa1SFk6SanJefplF1mqaqVF1UnZuUkZaqoZujl5VUa1RjYmNnXmNiX2NkUWJhbGFia2JhVF1UnZuUl5+llpGUpJaTpZeVVGtUY2JjZF5iYl9iZFNeU56alZagpJeQl6mimqSWllNsU2RhZGZfYmNeZGNUXVSVl6eblJeQnpqfmqZTbGJirl5Tn5alpJOYl1NsU4ugp6NSnZuUl5+lllKcl6pSkpWlm6eTpZeVVK4=";
Console.WriteLine(DecryptIt(key));
}
static string DecryptIt(string key)
{
    /// <summary>
    /// Decrypt key using custom algorithm
    /// </summary>
    byte[] data = Convert.FromBase64String(key);
    byte[] keyLength = new byte[] { 0x31, 0x32 }; // "12"
    byte[] result = new byte[data.Length];
    for (int i = 0; i < data.Length; i++)
    {
        int res = (i % keyLength.Length) - 1;
        int res2 = res < 0 ? keyLength.Length + res : res;
        //Console.WriteLine(res2.ToString());
        //Console.WriteLine(key.Length);
        byte ch = data[i];
        byte KeyChar = keyLength[res2];
        ch = (byte)(ch - KeyChar);
        result[i] = ch;
    }
    return Encoding.UTF8.GetString(result);
}
}
Ideone

SHA1 hash difference between c# windows store app and Objective-c iOS

My Windows Store app on the Surface Pro uses SHA1 to send a hashed password to the server. I wanted to do the same thing for the iOS app, but I get different results and I just don't know why.
I tried different encodings (NSASCII, UTF8, Unicode) when converting NSStrings to cStrings, but to no avail.
c# Windows Store App - Surface pro
/// <summary>
/// Literal copy of the values from web config machinekey
/// </summary>
const String validationKey = "6DB51F17C529AD3CABEC50B3C89CB21F4F1422F58A5B42D0E8DB8CB5CDA146511891C1BAF47F8D29401E3400267682B202B7DA146511891C1BAF47F8D29401E3";
public static string HmacSha1(string baseString)
{
    var crypt = MacAlgorithmProvider.OpenAlgorithm("HMAC_SHA1");
    var buffer = CryptographicBuffer.CreateFromByteArray(Encoding.Unicode.GetBytes(baseString));
    var keyBuffer = Windows.Security.Cryptography.CryptographicBuffer.CreateFromByteArray(HexToByte(validationKey));
    var key = crypt.CreateKey(keyBuffer);
    var sigBuffer = CryptographicEngine.Sign(key, buffer);
    string signature = CryptographicBuffer.EncodeToBase64String(sigBuffer);
    return signature;
}
//
// HexToByte
// Converts a hexadecimal string to a byte array. Used to convert encryption
// key values from the configuration.
//
private static byte[] HexToByte(string hexString)
{
    byte[] returnBytes = new byte[hexString.Length / 2];
    for (int i = 0; i < returnBytes.Length; i++)
        returnBytes[i] = Convert.ToByte(hexString.Substring(i * 2, 2), 16);
    return returnBytes;
}
Objective-c iOS
//Hash password
NSString *secret = @"6DB51F17C529AD3CABEC50B3C89CB21F4F1422F58A5B42D0E8DB8CB5CDA146511891C1BAF47F8D29401E3400267682B202B7DA146511891C1BAF47F8D29401E3";
NSString *data = password;
const char *cKey = [secret cStringUsingEncoding:NSASCIIStringEncoding];
const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
NSString *signature = [HMAC base64EncodedStringWithOptions:0];
NSString *clientPassword = signature;
So some genius helped me figure it out. In the C# code the password is encoded with Unicode, as can be seen in this line:
Encoding.Unicode.GetBytes(baseString)
That means I should have changed the NSASCII encoding line in Objective-C to:
const char *cPassword = [password cStringUsingEncoding:NSUnicodeStringEncoding];
Now comes something tricky. Unicode encodes a string like 'Welkom123' like this: W\0e\0l\0k\0o\0m\01\02\03\0. That means I can't use strlen(cData) to determine the length of cData, because it will stop counting when it encounters \0, so the length has been wrong the whole time. I had to replace strlen(cData) with (password.length * 2).
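A quick C# check of this point (my addition, not part of the original answer): UTF-16 encodes each ASCII character as two bytes, the second of which is zero, which is exactly the NUL that a C strlen() stops at:

using System;
using System.Text;

class Utf16LengthDemo
{
    static void Main()
    {
        byte[] bytes = Encoding.Unicode.GetBytes("Welkom123");
        Console.WriteLine(bytes.Length); // 18, i.e. "Welkom123".Length * 2
        Console.WriteLine(bytes[0]);     // 87  ('W')
        Console.WriteLine(bytes[1]);     // 0   -- the NUL that makes strlen() stop counting
    }
}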
But even then the outcome was still not right. The key had to become hexadecimal data, just like in the C# code. Then the NSData should be used to create the CCHmac. This code turns the NSString into NSData with the hexadecimal values:
int len = (int)([inputString length] / 2); // Target length
unsigned char *buf = malloc(len);
unsigned char *whole_byte = buf;
char byte_chars[3] = {'\0','\0','\0'};
int i;
for (i = 0; i < [inputString length] / 2; i++) {
    byte_chars[0] = [inputString characterAtIndex:i*2];
    byte_chars[1] = [inputString characterAtIndex:i*2+1];
    *whole_byte = strtol(byte_chars, NULL, 16);
    whole_byte++;
}
NSData *data = [NSData dataWithBytes:buf length:len];
free(buf);
return data;
In the end the total code would look like this:
-(NSDictionary *)authenticateWithUsername:(NSString *)userName andPassword:(NSString *)password andSyncUrl:(NSString *)syncUrl
{
    NSString *key = @"6DB51F17C529AD3CABEC50B3C89CB21F4F1422F58A5B42D0E8DB8CB5CDA146511891C1BAF47F8D29401E3400267682B202B7DA146511891C1BAF47F8D29401E3";
    NSData *data = [self stringToHexData:key];
    const char *cPassword = [password cStringUsingEncoding:NSUnicodeStringEncoding];
    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, data.bytes, data.length, cPassword, (password.length * 2), cHMAC);
    NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
    NSString *hashedPassword = [HMAC base64EncodedStringWithOptions:0];
    NSString *clientPassword = hashedPassword;
    // some code here left out for readability's sake
    return dict;
}
- (NSData *)stringToHexData:(NSString *)inputString
{
    int len = (int)([inputString length] / 2); // Target length
    unsigned char *buf = malloc(len);
    unsigned char *whole_byte = buf;
    char byte_chars[3] = {'\0','\0','\0'};
    int i;
    for (i = 0; i < [inputString length] / 2; i++) {
        byte_chars[0] = [inputString characterAtIndex:i*2];
        byte_chars[1] = [inputString characterAtIndex:i*2+1];
        *whole_byte = strtol(byte_chars, NULL, 16);
        whole_byte++;
    }
    NSData *data = [NSData dataWithBytes:buf length:len];
    free(buf);
    return data;
}

integer to utf8 string not working c#

I'm converting a UTF-8 string to an integer, and the other way around.
If I enter 卐 as a string, it converts to 21328.
But when I try to convert 21328 back to a string I get "PS".
I tried:
int dec = Convert.ToInt32(decimal1.Text, 10);
byte[] bajti = new byte[4];
bajti[0] = (byte)(dec >> 24);
bajti[1] = (byte)(dec >> 16);
bajti[2] = (byte)(dec >> 8);
bajti[3] = (byte)dec;
znak1.Text = Encoding.UTF8.GetString(bajti);
I have also tried converting using BitConverter and got same result.
I thought it could be a problem with the TextBox, so I tried writing it down in Notepad, but got the same result...
You can also try the following code:
// Conversion from String to Int32
string text = "§";
byte[] textBytes = Encoding.UTF8.GetBytes(text);
byte[] numberBytes = new byte[sizeof(int)];
Array.Copy(textBytes, numberBytes, textBytes.Length);
int number = BitConverter.ToInt32(numberBytes, 0);
//Conversion from Int32 to String
numberBytes = BitConverter.GetBytes(number);
text = Encoding.UTF8.GetString(numberBytes);
PS: The code will work, but some characters take up fewer than 4 bytes when converted; therefore, when converting back to a string from an Int32 (4 bytes), trailing \0 characters may appear (they are not rendered, because they represent the null character).
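If those trailing \0 characters are unwanted, one option (my suggestion, not from the answer above) is to trim them after decoding:

using System;
using System.Text;

class TrimNulDemo
{
    static void Main()
    {
        byte[] textBytes = Encoding.UTF8.GetBytes("§");
        byte[] numberBytes = new byte[sizeof(int)];
        Array.Copy(textBytes, numberBytes, textBytes.Length);
        int number = BitConverter.ToInt32(numberBytes, 0);

        // Round-trip, then strip the trailing NULs left by the unused bytes of the Int32.
        string text = Encoding.UTF8.GetString(BitConverter.GetBytes(number)).TrimEnd('\0');
        Console.WriteLine(text); // §
    }
}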
Try this:
// HexToBytes: a helper from the asker's context that parses a hex string into a byte array
byte[] bajti = HexToBytes(hex1.Text);
char c = 'a';
if (bajti.Length == 1)
{
    c = (char)bajti[0];
}
else if (bajti.Length == 2)
{
    c = (char)((bajti[0] << 8) + bajti[1]);
}
else if (bajti.Length == 3)
{
    c = (char)((bajti[0] << 16) + (bajti[1] << 8) + bajti[2]);
}
else if (bajti.Length == 4)
{
    c = (char)((bajti[0] << 24) + (bajti[1] << 16) + (bajti[2] << 8) + bajti[3]);
}
znak1.Text = c.ToString();
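As a side note (my observation, not from the answers above): 21328 is 0x5350, which is the Unicode code point of 卐, not its UTF-8 byte sequence (E5 8D 90). If the integer you have is a code point, the round trip is much simpler:

using System;

class CodePointDemo
{
    static void Main()
    {
        string s = char.ConvertFromUtf32(21328);   // "卐"
        int back = char.ConvertToUtf32(s, 0);      // 21328
        Console.WriteLine("{0} {1}", s, back);
    }
}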

How to encrypt a string using public key cryptography

I am trying to implement my own RSA encryption engine. Given these RSA algorithm values:
p = 61. // A prime number.
q = 53. // Also a prime number.
n = 3233. // p * q.
totient = 3120. // (p - 1) * (q - 1)
e = 991. // Co-prime to the totient (co-prime to 3120).
d = 1231. // d * e = 1219921, which satisfies 1 + k * totient = 1219921 when k = 391.
I am trying to write a method to encrypt each byte in a string and return an encrypted string:
public string Encrypt(string m, Encoding encoding)
{
    byte[] bytes = encoding.GetBytes(m);
    for (int i = 0; i < bytes.Length; i++)
    {
        bytes[i] = (byte)BigInteger.ModPow(bytes[i], e, n);
    }
    string encryptedString = encoding.GetString(bytes);
    Console.WriteLine("Encrypted {0} as {1}.", m, encryptedString);
    return encryptedString;
}
The obvious issue here is that BigInteger.ModPow(bytes[i], e, n) may be too large to fit into a byte; it can produce values over 8 bits in size. How do you get around this issue while still being able to decrypt an encrypted string of bytes back into a regular string?
Update: Even encrypting from byte[] to byte[], you reach a case where encrypting a byte using the RSA algorithm produces a value beyond the size limit of a byte:
public byte[] Encrypt(string m, Encoding encoding)
{
    byte[] bytes = encoding.GetBytes(m);
    for (int i = 0; i < bytes.Length; i++)
    {
        bytes[i] = (byte)BigInteger.ModPow(bytes[i], e, n);
    }
    return bytes;
}
Update: My issue is that encryption produces more bytes than the initial input string had:
public byte[] Encrypt(string m, Encoding encoding)
{
    byte[] bytes = encoding.GetBytes(m);
    byte[] returnBytes = new byte[0];
    for (int i = 0; i < bytes.Length; i++)
    {
        byte[] result = BigInteger.ModPow(bytes[i], (BigInteger)e, n).ToByteArray();
        int preSize = returnBytes.Length;
        Array.Resize(ref returnBytes, returnBytes.Length + result.Length);
        result.CopyTo(returnBytes, preSize);
    }
    return returnBytes;
}
public string Decrypt(byte[] c, Encoding encoding)
{
    byte[] returnBytes = new byte[0];
    for (int i = 0; i < c.Length; i++)
    {
        byte[] result = BigInteger.ModPow(c[i], d, n).ToByteArray();
        int preSize = returnBytes.Length;
        Array.Resize(ref returnBytes, returnBytes.Length + result.Length);
        result.CopyTo(returnBytes, preSize);
    }
    string decryptedString = encoding.GetString(returnBytes);
    return decryptedString;
}
If you ran this code like this:
byte[] encryptedBytes = engine.Encrypt("Hello, world.", Encoding.UTF8);
Console.WriteLine(engine.Decrypt(encryptedBytes, Encoding.UTF8));
The output would be this:
?♥D
?♥→☻►♦→☻►♦oD♦8? ?♠oj?♠→☻►♦;♂?♠♂♠?♠
Obviously, the output is not the original string, because I can't just decrypt one byte at a time: sometimes two or more bytes of the cipher-text represent the value of one integer that I need to decrypt back to one byte of the original string. So I want to know what the standard mechanism for handling this is.
Your basic code for encrypting and decrypting each byte (the call to ModPow) is working, but you're going about splitting the message up and encrypting each piece inappropriately.
To show that the ModPow part - i.e. the maths - is fine, here's code based on yours, which encrypts a string to a BigInteger[] and back:
using System;
using System.Linq;
using System.Numerics;
using System.Text;

class Test
{
    const int p = 61;
    const int q = 53;
    const int n = 3233;
    const int totient = 3120;
    const int e = 991;
    const int d = 1231;

    static void Main()
    {
        var encrypted = Encrypt("Hello, world.", Encoding.UTF8);
        var decrypted = Decrypt(encrypted, Encoding.UTF8);
        Console.WriteLine(decrypted);
    }

    static BigInteger[] Encrypt(string text, Encoding encoding)
    {
        byte[] bytes = encoding.GetBytes(text);
        return bytes.Select(b => BigInteger.ModPow(b, (BigInteger)e, n))
                    .ToArray();
    }

    static string Decrypt(BigInteger[] encrypted, Encoding encoding)
    {
        byte[] bytes = encrypted.Select(bi => (byte)BigInteger.ModPow(bi, d, n))
                                .ToArray();
        return encoding.GetString(bytes);
    }
}
Next you need to read more about how a byte[] is encrypted into another byte[] using RSA, including all the different padding schemes etc. There's a lot more to it than just calling ModPow on each byte.
But to reiterate, you should not be doing this to end up with a production RSA implementation. The chances of you doing that without any security flaws are very slim indeed. It's fine to do this for academic interest, to learn more about the principles of cryptography, but leave the real implementations to experts. (I'm far from an expert in this field - there's no way I'd start implementing my own encryption...)
Note: I updated this answer. Please scroll down to the update for how it should actually be implemented, because this first way of doing it is not the correct way of doing RSA encryption.
One way I can think to do it is like this (though it may not be standards-compliant), and note that this does not pad:
public byte[] Encrypt(string m, Encoding encoding)
{
    byte[] bytes = encoding.GetBytes(m);
    byte[] returnBytes = new byte[0];
    for (int i = 0; i < bytes.Length; i++)
    {
        byte[] result = BigInteger.ModPow(bytes[i], (BigInteger)e, n).ToByteArray();
        int preSize = returnBytes.Length;
        Array.Resize(ref returnBytes, returnBytes.Length + result.Length + 1);
        (new byte[] { (byte)(result.Length) }).CopyTo(returnBytes, preSize);
        result.CopyTo(returnBytes, preSize + 1);
    }
    return returnBytes;
}
public string Decrypt(byte[] c, Encoding encoding)
{
    byte[] returnBytes = new byte[0];
    for (int i = 0; i < c.Length; i++)
    {
        int dataLength = (int)c[i];
        byte[] result = new byte[dataLength];
        for (int j = 0; j < dataLength; j++)
        {
            i++;
            result[j] = c[i];
        }
        BigInteger integer = new BigInteger(result);
        byte[] integerResult = BigInteger.ModPow(integer, d, n).ToByteArray();
        int preSize = returnBytes.Length;
        Array.Resize(ref returnBytes, returnBytes.Length + integerResult.Length);
        integerResult.CopyTo(returnBytes, preSize);
    }
    string decryptedString = encoding.GetString(returnBytes);
    return decryptedString;
}
This has the potential of being cross-platform, because you have the option of using a different datatype to represent e or n and passing it to a C# back-end service. Here is a test:
string stringToEncrypt = "Mary had a little lamb.";
Console.WriteLine("Encrypting the string: {0}", stringToEncrypt);
byte[] encryptedBytes = engine.Encrypt(stringToEncrypt, Encoding.UTF8);
Console.WriteLine("Encrypted text: {0}", Encoding.UTF8.GetString(encryptedBytes));
Console.WriteLine("Decrypted text: {0}", engine.Decrypt(encryptedBytes, Encoding.UTF8));
Output:
Encrypting the string: Mary had a little lamb.
Encrypted text: ☻6☻1♦☻j☻☻&♀☻g♦☻t☻☻1♦☻? ☻g♦☻1♦☻g♦☻?♥☻?☻☻7☺☻7☺☻?♥☻?♂☻g♦☻?♥☻1♦☻$☺☻
c ☻?☻
Decrypted text: Mary had a little lamb.
Update: Everything I said earlier is completely wrong in the implementation of RSA. Wrong, wrong, wrong! This is the correct way to do RSA encryption:
Convert your string to a BigInteger datatype.
Make sure your integer is smaller than the value of n that you've calculated for your algorithm, otherwise you won't be able to decypher it.
Encrypt the integer. RSA works on integer encryption only. This is clear.
Decrypt it from the encrypted integer.
I can't help but wonder whether the BigInteger class was mostly created for cryptography.
As an example:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Numerics;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;

namespace BytePadder
{
    class Program
    {
        const int p = 61;
        const int q = 53;
        const int n = 3233;
        const int totient = 3120;
        const int e = 991;
        const int d = 1231;

        static void Main(string[] args)
        {
            // ---------------------- RSA Example I ----------------------
            // Shows how an integer gets encrypted and decrypted.
            BigInteger integer = 1000;
            BigInteger encryptedInteger = Encrypt(integer);
            Console.WriteLine("Encrypted Integer: {0}", encryptedInteger);
            BigInteger decryptedInteger = Decrypt(encryptedInteger);
            Console.WriteLine("Decrypted Integer: {0}", decryptedInteger);
            // --------------------- RSA Example II ----------------------
            // Shows how a string gets encrypted and decrypted.
            string unencryptedString = "A";
            BigInteger integer2 = new BigInteger(Encoding.UTF8.GetBytes(unencryptedString));
            Console.WriteLine("String as Integer: {0}", integer2);
            BigInteger encryptedInteger2 = Encrypt(integer2);
            Console.WriteLine("String as Encrypted Integer: {0}", encryptedInteger2);
            BigInteger decryptedInteger2 = Decrypt(encryptedInteger2);
            Console.WriteLine("String as Decrypted Integer: {0}", decryptedInteger2);
            string decryptedIntegerAsString = Encoding.UTF8.GetString(decryptedInteger2.ToByteArray());
            Console.WriteLine("Decrypted Integer as String: {0}", decryptedIntegerAsString);
            Console.ReadLine();
        }

        static BigInteger Encrypt(BigInteger integer)
        {
            if (integer < n)
            {
                return BigInteger.ModPow(integer, e, n);
            }
            throw new Exception("The integer must be less than the value of n in order to be decypherable!");
        }

        static BigInteger Decrypt(BigInteger integer)
        {
            return BigInteger.ModPow(integer, d, n);
        }
    }
}
Example output:
Encrypted Integer: 1989
Decrypted Integer: 1000
String as Integer: 65
String as Encrypted Integer: 1834
String as Decrypted Integer: 65
Decrypted Integer as String: A
If you are looking to use RSA encryption in C#, then you should not be attempting to build your own. For starters, the prime numbers you have chosen are probably too small. P and Q are supposed to be large prime numbers.
You should check out some other question/answers:
how to use RSA to encrypt files (huge data) in C#
RSA Encryption of large data in C#
And other references:
http://msdn.microsoft.com/en-us/library/system.security.cryptography.rsacryptoserviceprovider.encrypt(v=vs.110).aspx
http://msdn.microsoft.com/en-us/library/system.security.cryptography.rsacryptoserviceprovider.aspx
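For reference, here is a minimal sketch of using the framework's RSA implementation instead of a hand-rolled one (assuming .NET Core 2.0+ / .NET 5+; on .NET Framework, the RSACryptoServiceProvider class from the links above plays the same role):

using System;
using System.Security.Cryptography;
using System.Text;

class BuiltInRsaDemo
{
    static void Main()
    {
        using (RSA rsa = RSA.Create(2048)) // real key sizes, unlike p = 61, q = 53
        {
            byte[] plaintext = Encoding.UTF8.GetBytes("Hello, world.");
            // OAEP padding is the part the hand-rolled code above is missing.
            byte[] ciphertext = rsa.Encrypt(plaintext, RSAEncryptionPadding.OaepSHA256);
            byte[] roundTrip = rsa.Decrypt(ciphertext, RSAEncryptionPadding.OaepSHA256);
            Console.WriteLine(Encoding.UTF8.GetString(roundTrip)); // Hello, world.
        }
    }
}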

Convert PHP encryption code to C#

I'm trying to convert this piece of code from PHP to C#. It's part of a Captive Portal. Could somebody explain what it does?
$hexchal = pack ("H32", $challenge);
if ($uamsecret) {
    $newchal = pack ("H*", md5($hexchal . $uamsecret));
} else {
    $newchal = $hexchal;
}
$response = md5("\0" . $password . $newchal);
$newpwd = pack("a32", $password);
$pappassword = implode ("", unpack("H32", ($newpwd ^ $newchal)));
I also ran into the need for PHP's pack/unpack functions in C# but did not find any good resource.
So I thought to do it myself. I have verified the functions' output against the pack/unpack/md5 methods found at onlinephpfunctions.com. Since I wrote the code only to meet my own requirements, it covers only these formats, but it can be extended to others.
Pack
private static string pack(string input)
{
    // only for H32 & H*
    return Encoding.Default.GetString(FromHex(input));
}
public static byte[] FromHex(string hex)
{
    hex = hex.Replace("-", "");
    byte[] raw = new byte[hex.Length / 2];
    for (int i = 0; i < raw.Length; i++)
    {
        raw[i] = Convert.ToByte(hex.Substring(i * 2, 2), 16);
    }
    return raw;
}
MD5
private static string md5(string input)
{
    byte[] asciiBytes = Encoding.Default.GetBytes(input);
    byte[] hashedBytes = MD5CryptoServiceProvider.Create().ComputeHash(asciiBytes);
    string hashedString = BitConverter.ToString(hashedBytes).Replace("-", "").ToLower();
    return hashedString;
}
Unpack
private static string unpack(string p1, string input)
{
    StringBuilder output = new StringBuilder();
    for (int i = 0; i < input.Length; i++)
    {
        // "X2" keeps each byte two hex digits wide, matching PHP's H* output
        string a = Convert.ToInt32(input[i]).ToString("X2");
        output.Append(a);
    }
    return output.ToString();
}
Eduardo,
If you take a look at the pack manual, pack is used to convert a string in hex, octal, or binary notation to its binary representation.
So
$hexchal = pack('H32', $challenge);
would convert a string like 'cca86bc64ec5889345c4c3d8dfc7ade9' to the actual bytes 0xcca... de9.
If $uamsecret exists, do the same thing with the MD5 of $hexchal concatenated with the uamsecret:
if ($uamsecret) {
$newchal = pack ("H*", md5($hexchal . $uamsecret));
} else {
$newchal = $hexchal;
}
$response = md5("\0" . $password . $newchal);
MD5 of '\0' . $password . $newchal.
$newpwd = pack("a32", $password);
Pad the password to 32 bytes.
$pappassword = implode ("", unpack("H32", ($newpwd ^ $newchal)));
Do an XOR of $newpwd and $newchal and convert it to a hexadecimal string. I don't get the implode(); maybe it's there to convert the array returned by unpack() into a string.
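Putting the explanation together, here is a hedged C# sketch of the whole PHP fragment (my port, untested against a real portal; it reuses the FromHex helper from the answer above, assumes the password is ASCII, and the Main arguments are made-up example values — the challenge string is the one from this answer):

using System;
using System.Security.Cryptography;
using System.Text;

class ChilliResponse
{
    static void Main()
    {
        // "mysecret" and "password" are hypothetical placeholders.
        Console.WriteLine(PapPassword("cca86bc64ec5889345c4c3d8dfc7ade9", "mysecret", "password"));
    }

    // Port of: $hexchal = pack("H32", $challenge); ... $pappassword = implode("", unpack("H32", ($newpwd ^ $newchal)));
    static string PapPassword(string challenge, string uamsecret, string password)
    {
        byte[] hexchal = FromHex(challenge);              // pack("H32", $challenge)
        byte[] newchal;
        if (!string.IsNullOrEmpty(uamsecret))
        {
            // pack("H*", md5($hexchal . $uamsecret)) is the raw MD5 of the concatenated bytes.
            byte[] secretBytes = Encoding.ASCII.GetBytes(uamsecret);
            byte[] input = new byte[hexchal.Length + secretBytes.Length];
            hexchal.CopyTo(input, 0);
            secretBytes.CopyTo(input, hexchal.Length);
            using (var md5 = MD5.Create())
                newchal = md5.ComputeHash(input);
        }
        else
        {
            newchal = hexchal;
        }

        // pack("a32", $password): NUL-pad the password to 32 bytes.
        byte[] newpwd = new byte[32];
        Encoding.ASCII.GetBytes(password, 0, Math.Min(password.Length, 32), newpwd, 0);

        // ($newpwd ^ $newchal): PHP xors up to the shorter string (the 16 MD5 bytes),
        // then unpack("H32", ...) renders those 16 bytes as 32 hex digits.
        var sb = new StringBuilder();
        for (int i = 0; i < newchal.Length; i++)
            sb.Append(((byte)(newpwd[i] ^ newchal[i])).ToString("x2"));
        return sb.ToString();
    }

    static byte[] FromHex(string hex)
    {
        byte[] raw = new byte[hex.Length / 2];
        for (int i = 0; i < raw.Length; i++)
            raw[i] = Convert.ToByte(hex.Substring(i * 2, 2), 16);
        return raw;
    }
}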
