ElGamal C# implementation - c#

I want to implement ElGamal encryption for my school work, but when I do the decryption the last step is always 0, because (b/Math.Pow(a,x))%primenumber is always less than 1.
Here is the keys generation:
public void GenerateKey() {
    this.x = 3;
    this.prvocislo = PrimeGen.findPrimes(29).Max(); // prime number
    this.g = this.prvocislo % 12;
    this.y = Convert.ToInt32(Math.Pow(this.g, this.x) % this.prvocislo);
    this.k = 23; // 601
}
Here is encrypt function:
public string Encrypt(string word) {
    List<string> words = new List<string>();
    words = PrimeGen.SplitToArray(word, 2);
    string encrypted = "";
    string sss = PrimeGen.GetStringFromBytes(PrimeGen.GetBytesFromInt(PrimeGen.GetIntFromBytes(PrimeGen.GetBytesFromString("ah")))); // returns "ah", so the conversion works
    foreach (string s in words)
    {
        int a = Convert.ToInt32(Math.Pow(g, k) % prvocislo);
        int b = Convert.ToInt32((Math.Pow(y, k) * PrimeGen.GetIntFromBytes(PrimeGen.GetBytesFromString(s))) % prvocislo);
        string aS = PrimeGen.GetStringFromBytes(PrimeGen.INT2LE(a + posun));
        string bS = PrimeGen.GetStringFromBytes(PrimeGen.INT2LE(b + posun));
        encrypted = encrypted + aS + bS;
    }
    return encrypted;
}
Here is my decrypt function:
public string Decrypt(string ElgamalEncrypted) {
    string decrypted = "";
    for (int i = 0; i < ElgamalEncrypted.Length; i = i + 2) {
        string aS = ElgamalEncrypted.Substring(i, 2);
        string bS = ElgamalEncrypted.Substring(i + 2, 2);
        int a = PrimeGen.GetIntFromBytes(PrimeGen.GetBytesFromString(aS)) - posun;
        int b = PrimeGen.GetIntFromBytes(PrimeGen.GetBytesFromString(bS)) - posun;
        if (b == 0) b = 1;
        if (a == 0) a = 1;
        decrypted = decrypted + PrimeGen.GetStringFromBytes(PrimeGen.GetBytesFromInt(Convert.ToInt32((b / Math.Pow(a, x)) % prvocislo)));
    }
    return decrypted;
}

You're using Math.Pow(base, exponent) % modulus for modular exponentiation. That doesn't work, because floating-point doubles can't exactly represent the large integers cryptography needs. Use System.Numerics.BigInteger.ModPow(base, exponent, modulus) instead.
The division probably doesn't work because it is integer division; instead you should multiply by the modular multiplicative inverse of the denominator.
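For illustration, here is a minimal sketch of the decryption step with BigInteger, assuming p is prime so the inverse of a^x can be computed with Fermat's little theorem as a^(p-1-x) mod p. The class name and the toy parameters below are made up for demonstration, not a drop-in replacement for your code:

using System;
using System.Numerics;

class ElGamalSketch
{
    // m = b * (a^x)^(-1) mod p, with (a^x)^(-1) = a^(p-1-x) mod p for prime p.
    static BigInteger DecryptBlock(BigInteger a, BigInteger b, BigInteger x, BigInteger p)
    {
        BigInteger inverse = BigInteger.ModPow(a, p - 1 - x, p); // modular inverse of a^x
        return (b * inverse) % p;
    }

    static void Main()
    {
        // Toy parameters, far too small to be secure.
        BigInteger p = 467, g = 2, x = 127;          // private key x
        BigInteger y = BigInteger.ModPow(g, x, p);   // public key y = g^x mod p
        BigInteger k = 213, m = 100;                 // ephemeral key k, plaintext block m

        BigInteger a = BigInteger.ModPow(g, k, p);
        BigInteger b = (BigInteger.ModPow(y, k, p) * m) % p;

        Console.WriteLine(DecryptBlock(a, b, x, p)); // prints 100
    }
}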

Related

Simple Byte Encryption Not Working

When the message is decrypted, the characters are one less than the original. Example: 'H' becomes 'G'.
I have tried to debug the code by printing out values, and all goes well until the step that divides by 100000 and multiplies by the date.
Here is the code I used (I didn't include the Main method):
public static string encrypt(string input)
{
    string final;
    string date = DateTime.Now.Date.ToShortDateString().ToString();
    var datetime = int.Parse(date.Replace("/", ""));
    List<int> semi = new List<int>();
    var bytes = Encoding.UTF8.GetBytes(input.ToCharArray());
    for (int i = 0; i < bytes.Length; i++)
    {
        int y = bytes[i] * datetime / 100000;
        semi.Add(y);
        Console.WriteLine(y);
    }
    Console.WriteLine(string.Join("", bytes));
    final = string.Join(":", semi.ToArray()) + ":" + date;
    return final;
}
public static string decrypt(string input)
{
    string final;
    string[] raw = input.Split(':');
    int date = int.Parse(raw[raw.Length - 1].Replace("/", ""));
    var dump = new List<string>(raw);
    dump.RemoveAt(raw.Length - 1);
    string[] stringbytes = dump.ToArray();
    List<byte> bytes = new List<byte>();
    for (int i = 0; i < stringbytes.Length; i++)
    {
        int x = int.Parse(stringbytes[i]);
        Console.WriteLine(x);
        x = x * 100000 / date;
        byte finalbytes = Convert.ToByte(x);
        bytes.Add(finalbytes);
    }
    Console.WriteLine(string.Join("", bytes.ToArray()));
    Console.WriteLine(date);
    var bytearray = bytes.ToArray();
    final = Encoding.UTF8.GetString(bytearray);
    return final;
}
It's likely a rounding error from integer division. When doing integer math it is very possible that ((x * date / 100000) * 100000 / date) != x; in fact, it is only guaranteed to equal x when date % 100000 == 0.
Fix the rounding errors introduced by your integer division and it should fix your problem.
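A quick numeric illustration of the truncation (the date value here is just an example, not your actual format):

int date = 4052016;            // e.g. a short date with the slashes removed
int b = 72;                    // byte value of 'H'
int y = b * date / 100000;     // 291745152 / 100000 = 2917 (the .45152 is discarded)
int back = y * 100000 / date;  // 291700000 / 4052016 = 71, i.e. 'G' instead of 'H'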
P.S. I would also be very hesitant to call this "encryption": there is no secret key, and all the information required to decrypt the message is in the message itself. You are relying only on the algorithm being secret, which is practically impossible with C#. I would rather call what you are doing "encoding", because to decode something that is encoded all you need to know is the algorithm.
You are using the low-precision data type int to store the result of the division. I changed the type to double and it works:
public static string encrypt(string input)
{
    string final;
    string date = DateTime.Now.Date.ToString("MMddyyyy");
    var datetime = int.Parse(date);
    List<double> semi = new List<double>();
    var bytes = Encoding.UTF8.GetBytes(input);
    for (int i = 0; i < bytes.Length; i++)
    {
        double y = bytes[i] * datetime / 100000;
        semi.Add(y);
        Console.WriteLine(y);
    }
    Console.WriteLine(string.Join("", bytes));
    final = string.Join(":", semi.ToArray()) + ":" + date;
    return final;
}
public static string decrypt(string input)
{
    string final;
    string[] raw = input.Split(':');
    int date = int.Parse(raw[raw.Length - 1].Replace("/", ""));
    var dump = new List<string>(raw);
    dump.RemoveAt(raw.Length - 1);
    string[] stringbytes = dump.ToArray();
    List<byte> bytes = new List<byte>();
    for (int i = 0; i < stringbytes.Length; i++)
    {
        var x = double.Parse(stringbytes[i]);
        Console.WriteLine(x);
        x = x * 100000 / date;
        byte finalbytes = Convert.ToByte(x);
        bytes.Add(finalbytes);
    }
    Console.WriteLine(string.Join("", bytes.ToArray()));
    Console.WriteLine(date);
    var bytearray = bytes.ToArray();
    final = Encoding.UTF8.GetString(bytearray);
    return final;
}
Here is a fully working console app http://ideone.com/Rjc13A
I believe this is a number truncation issue. In your decrypt method, the division mathematically produces a value with decimal places, but because it is done on integers the fractional part is cut off.
The following should work:
for (int i = 0; i < stringbytes.Length; i++)
{
    var x = double.Parse(stringbytes[i]);
    Console.WriteLine(x);
    x = Math.Round((x * 100000) / date, 0);
    byte finalbytes = Convert.ToByte(x);
    bytes.Add(finalbytes);
}
Also, as a side note: why are you creating your own encryption algorithm? Could you not use one that already exists?
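For example, the framework's built-in AES support already gives you real encryption with a secret key. A minimal sketch (key and IV handling simplified; in practice persist the key securely and use a fresh random IV per message):

using System;
using System.Security.Cryptography;
using System.Text;

class AesExample
{
    static void Main()
    {
        using (Aes aes = Aes.Create()) // generates a random Key and IV
        {
            byte[] plaintext = Encoding.UTF8.GetBytes("Hello");

            byte[] ciphertext;
            using (var encryptor = aes.CreateEncryptor())
                ciphertext = encryptor.TransformFinalBlock(plaintext, 0, plaintext.Length);

            byte[] roundtrip;
            using (var decryptor = aes.CreateDecryptor())
                roundtrip = decryptor.TransformFinalBlock(ciphertext, 0, ciphertext.Length);

            Console.WriteLine(Encoding.UTF8.GetString(roundtrip)); // Hello
        }
    }
}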

how to convert decimal to Packed decimal/COMP-3

I need to convert some decimals to PD 6.2 (packed decimal), then send them to a mainframe. It's very hard to find any function for this in C#. Please help. Thanks a million.
See if this works
public static void Main()
{
    string lookup = "0123456789";
    int input = 123456789;
    string input_str = input.ToString();
    List<byte> output = new List<byte>();
    int index = 0;
    // odd number of characters
    if (input_str.Length % 2 == 1)
    {
        output.Add((byte)lookup.IndexOf(input_str.Substring(index++, 1)));
    }
    for (int i = index; i < input_str.Length; i += 2)
    {
        output.Add((byte)((lookup.IndexOf(input_str.Substring(i, 1))) << 4 | lookup.IndexOf(input_str.Substring(i + 1, 1))));
    }
}
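Note that COMP-3 also carries a trailing sign nibble (0xC positive, 0xD negative), and PD 6.2 is usually PIC S9(6)V9(2), i.e. 8 digits with two implied decimal places packed into 5 bytes. Here is a rough sketch of that layout under those assumptions; the Comp3 class and ToComp3 helper are made-up names, not an existing API:

using System;
using System.Collections.Generic;

static class Comp3
{
    // Sketch: pack a decimal as PIC S9(6)V9(2) COMP-3 (8 digits + trailing sign nibble), assumed layout.
    public static byte[] ToComp3(decimal value, int totalDigits = 8, int scale = 2)
    {
        bool negative = value < 0;
        // Remove the implied decimal point: 123456.78 -> 12345678.
        long scaled = (long)Math.Abs(decimal.Round(value, scale) * (decimal)Math.Pow(10, scale));
        string digits = scaled.ToString().PadLeft(totalDigits, '0');

        // Append the sign nibble, then left-pad with 0 to a whole number of bytes.
        string nibbles = digits + (negative ? "D" : "C");
        if (nibbles.Length % 2 == 1)
            nibbles = "0" + nibbles;

        var output = new List<byte>();
        for (int i = 0; i < nibbles.Length; i += 2)
            output.Add((byte)(Convert.ToByte(nibbles.Substring(i, 1), 16) << 4
                              | Convert.ToByte(nibbles.Substring(i + 1, 1), 16)));
        return output.ToArray();
    }
}
// Example: Comp3.ToComp3(123456.78m) gives 0x01 0x23 0x45 0x67 0x8C.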

Reverse binary representation of int (only significant bits)

I'm trying to write a program that reverses numbers in binary. For instance, the binary representation of 13 is 1101, and reversing it gives 1011, which corresponds to the number 11, right?
Here's my code:
static void Main(string[] args)
{
    Console.WriteLine("Enter a Number");
    int numb = int.Parse(Console.ReadLine());
    int reverse = 0;
    while (numb > 0)
    {
        int rem = numb % 10;
        reverse = (reverse * 10) + rem;
        numb = numb / 10;
    }
    Console.WriteLine("Reverse number={0}", reverse);
    Console.ReadLine();
}
With this code I only get the decimal digits reversed (13 -> 31)...
The input is a single line with an integer N, 1 ≤ N ≤ 1000000000, and I want the output to be a single line with one integer: the number obtained by reversing the binary representation of N.
Something like this:
// 13 = 1101b
int value = 13;
// 11 = 1011b
int result = Convert.ToInt32(new String(
    Convert.ToString(value, 2)
           .Reverse()
           .ToArray()), 2);
Explanation:
Convert.ToString(value, 2) returns value in binary representation ("1101").
Reverse().ToArray() reverses the string as a sequence of characters ('1','0','1','1') and converts it to a char[] array.
new String(...) constructs the string "1011" from the array of char.
Finally, Convert.ToInt32(..., 2) converts the binary representation back to an int.
You can use Convert.ToString and Convert.ToInt32 methods, where 2 means binary:
int numb = int.Parse(Console.ReadLine());
var reversedString = Convert.ToString(numb, 2).ReverseString();
var result = Convert.ToInt32(reversedString, 2);
...
// Extension method; must be declared in a static class.
public static string ReverseString(this string s)
{
    char[] arr = s.ToCharArray();
    Array.Reverse(arr);
    return new string(arr);
}
A fun exercise would be doing this without using the string conversion.
I have very little experience with bit twiddling, so there is probably a faster and better way of doing this, but this seems to work:
public static IEnumerable<bool> ToBinary(this int n)
{
    for (int i = 0; i < 32; i++)
    {
        yield return (n & (1 << i)) != 0;
    }
}

public static int ToInt(this IEnumerable<bool> b)
{
    var n = 0;
    var counter = 0;
    foreach (var i in b.Trim().Take(32))
    {
        n = n | (i ? 1 : 0) << counter;
        counter++;
    }
    return n;
}

private static IEnumerable<bool> Trim(this IEnumerable<bool> list)
{
    bool trim = true;
    foreach (var i in list)
    {
        if (i)
        {
            trim = false;
        }
        if (!trim)
        {
            yield return i;
        }
    }
}
And now you'd call it like this:
var reversed = n.ToBinary().Reverse().ToInt();

Processing BigInteger issues

Assume the following Diffie-Hellman parameters, which can also be found on this page:
1)P
string givenp = "00e655cc9e04f3bebae76ecca77143ef5c4451876615a9f8b4f712b8f3bdf47ee7f717c09bb5b2b66450831367d9dcf85f9f0528bcd5318fb1dab2f23ce77c48b6b7381eed13e80a14cca6b30b5e37ffe53db15e2d6b727a2efcee51893678d50e9a89166a359e574c4c3ca5e59fae79924fe6f186b36a2ebde9bf09fe4de50453";
BigInteger p = new BigInteger(HexToBytesv2(givenp));
2)G
BigInteger g = new BigInteger(2);
3)Merchant private key
string merchantPrivateKeyHEX = "48887dfd090d175e33beea29e7b38334299289069f9ab492b67807905faa98d96d22d79205bef03f14af093f1797b904734132c34a388fdc79e20497bfa1465fec2aac4fabdf3bb0c9be8685d20f7bfe0346a9abdf7fa89838c3fa9ca6abdb70bea66795ab6699cc154db59490e4159f142f7bddff603c1d3d6c4fff8177e11d";
BigInteger a = new BigInteger(HexToBytesv2(merchantPrivateKeyHEX));
Using the formula publickey = g^a mod p, I should get the public key provided in the initial link, yet when executing
BigInteger A = BigInteger.ModPow(g, a, p);
ToHex(A.ToByteArray())
the result I get is
00f85c41e84446ecfe43c9911df31d3cf60d83642afd496b741363290139badf75f8b8c5c010dda2446dd483dc553b6c2698c16c9d082391677785f81d54bc9c7c45f8b6d5bdb3e49fec7f5522b880c8c753fb7d3ff2c81e47dcb27d52842def40a812dc95cc679575baf237a955ee9944bd0797326f2a0a58c6c087f9b0b9e82c
instead of
00d9abd78c93dfddeb920d57d6513126d8f1118c9237a45101408dbffe6cfd95b011a016e4e0ab8aef0601e836a452b8bb88be7ca71e4f22f97aa65f8358ee69348d1227d65db6e53641d1a6542aa4be4b4adc75fac816af79a8e3f5097f8313e7b725df37eadc8c774e2033dfa99c95ccef333bf402b066198c30481e2a83875c
Any ideas? I must be missing something pretty obvious, but I am not sure what that might be.
P.S. Here are the helper functions being used:
public static byte[] HexToBytesv2(this string hex)
{
    if (hex.Length % 2 == 1)
        hex = '0' + hex;
    byte[] ret = new byte[hex.Length / 2];
    for (int i = 0; i < ret.Length; i++)
        ret[i] = Convert.ToByte(hex.Substring(i * 2, 2), 16);
    return ret;
}

public static string ToHex(byte[] ba)
{
    StringBuilder hex = new StringBuilder(ba.Length * 2);
    foreach (byte b in ba)
        hex.AppendFormat("{0:x2}", b);
    return hex.ToString();
}
It's an endianness problem: the BigInteger(byte[]) constructor expects the bytes in little-endian order, while your hex string is big-endian.
I've adjusted your encoding and decoding and now get the answer you're looking for:
public static byte[] HexToBytesv2(string hex)
{
    if (hex.Length % 2 == 1)
        hex = '0' + hex;
    byte[] ret = new byte[hex.Length / 2];
    for (int i = 0; i < ret.Length; i++)
        ret[i] = Convert.ToByte(hex.Substring(hex.Length - (i + 1) * 2, 2), 16);
    return ret;
}

public static string ToHex(byte[] bytes)
{
    var sb = new StringBuilder();
    foreach (var b in bytes.Reverse())
    {
        sb.AppendFormat("{0:x2}", b);
    }
    return sb.ToString();
}
FYI, I used LINQPad, and the main method is your code from the question (as adjusted), with checks that no data has been lost along the way:
void Main()
{
string givenp = "00e655cc9e04f3bebae76ecca77143ef5c4451876615a9f8b4f712b8f3bdf47ee7f717c09bb5b2b66450831367d9dcf85f9f0528bcd5318fb1dab2f23ce77c48b6b7381eed13e80a14cca6b30b5e37ffe53db15e2d6b727a2efcee51893678d50e9a89166a359e574c4c3ca5e59fae79924fe6f186b36a2ebde9bf09fe4de50453";
BigInteger p = new BigInteger(HexToBytesv2(givenp));
(ToHex(p.ToByteArray()) == "00e655cc9e04f3bebae76ecca77143ef5c4451876615a9f8b4f712b8f3bdf47ee7f717c09bb5b2b66450831367d9dcf85f9f0528bcd5318fb1dab2f23ce77c48b6b7381eed13e80a14cca6b30b5e37ffe53db15e2d6b727a2efcee51893678d50e9a89166a359e574c4c3ca5e59fae79924fe6f186b36a2ebde9bf09fe4de50453").Dump();
BigInteger g = new BigInteger(2);
string merchantPrivateKeyHEX = "48887dfd090d175e33beea29e7b38334299289069f9ab492b67807905faa98d96d22d79205bef03f14af093f1797b904734132c34a388fdc79e20497bfa1465fec2aac4fabdf3bb0c9be8685d20f7bfe0346a9abdf7fa89838c3fa9ca6abdb70bea66795ab6699cc154db59490e4159f142f7bddff603c1d3d6c4fff8177e11d";
BigInteger a = new BigInteger(HexToBytesv2(merchantPrivateKeyHEX));
(ToHex(a.ToByteArray()) == "48887dfd090d175e33beea29e7b38334299289069f9ab492b67807905faa98d96d22d79205bef03f14af093f1797b904734132c34a388fdc79e20497bfa1465fec2aac4fabdf3bb0c9be8685d20f7bfe0346a9abdf7fa89838c3fa9ca6abdb70bea66795ab6699cc154db59490e4159f142f7bddff603c1d3d6c4fff8177e11d").Dump();
BigInteger A = BigInteger.ModPow(g, a, p);
(ToHex(A.ToByteArray()) == "00f85c41e84446ecfe43c9911df31d3cf60d83642afd496b741363290139badf75f8b8c5c010dda2446dd483dc553b6c2698c16c9d082391677785f81d54bc9c7c45f8b6d5bdb3e49fec7f5522b880c8c753fb7d3ff2c81e47dcb27d52842def40a812dc95cc679575baf237a955ee9944bd0797326f2a0a58c6c087f9b0b9e82c").Dump();
(ToHex(A.ToByteArray()) == "00d9abd78c93dfddeb920d57d6513126d8f1118c9237a45101408dbffe6cfd95b011a016e4e0ab8aef0601e836a452b8bb88be7ca71e4f22f97aa65f8358ee69348d1227d65db6e53641d1a6542aa4be4b4adc75fac816af79a8e3f5097f8313e7b725df37eadc8c774e2033dfa99c95ccef333bf402b066198c30481e2a83875c").Dump();
}
Before I swapped the ordering, and while still including the .Concat(new byte[] { 0 }).ToArray() from your original question, the output was:
True
True
True
False
And now it's:
True
True
False
True
The other issue you're seeing is that BigInteger.Parse and the byte[] constructor always treat the top bit of the first nibble or of the last byte, respectively, as the sign bit. So you need to include an extra 0 character or 0 byte, respectively, to keep the value positive.
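A small demonstration of that sign-bit behaviour (toy value, not your keys):

using System;
using System.Globalization;
using System.Numerics;

class SignBitDemo
{
    static void Main()
    {
        // 0xE6 has its top bit set, so without padding BigInteger treats it as negative.
        Console.WriteLine(new BigInteger(new byte[] { 0xE6 }));       // -26
        Console.WriteLine(new BigInteger(new byte[] { 0xE6, 0x00 })); // 230 (trailing zero byte, little-endian)

        // Same with hex parsing: a leading digit >= 8 makes the value negative unless a 0 is prepended.
        Console.WriteLine(BigInteger.Parse("E6", NumberStyles.HexNumber));  // -26
        Console.WriteLine(BigInteger.Parse("0E6", NumberStyles.HexNumber)); // 230
    }
}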
You're doing a number of unnecessary conversions and they're introducing an error somewhere.
If you remove the broken string -> byte[] -> BigInteger -> byte[] -> string steps and let BigInteger parse the hex itself (with using System.Numerics; and using System.Globalization; for NumberStyles), you'll get the expected result:
string givenp = "00e655cc9e04f3bebae76ecca77143ef5c4451876615a9f8b4f712b8f3bdf47ee7f717c09bb5b2b66450831367d9dcf85f9f0528bcd5318fb1dab2f23ce77c48b6b7381eed13e80a14cca6b30b5e37ffe53db15e2d6b727a2efcee51893678d50e9a89166a359e574c4c3ca5e59fae79924fe6f186b36a2ebde9bf09fe4de50453";
var p = BigInteger.Parse(givenp, NumberStyles.HexNumber);
var g = new BigInteger(2);
var merchantPrivateKeyHEX = "48887dfd090d175e33beea29e7b38334299289069f9ab492b67807905faa98d96d22d79205bef03f14af093f1797b904734132c34a388fdc79e20497bfa1465fec2aac4fabdf3bb0c9be8685d20f7bfe0346a9abdf7fa89838c3fa9ca6abdb70bea66795ab6699cc154db59490e4159f142f7bddff603c1d3d6c4fff8177e11d";
var a = BigInteger.Parse(merchantPrivateKeyHEX, NumberStyles.HexNumber);
var publicKey = BigInteger.ModPow(g, a, p);
Console.WriteLine(publicKey.ToString("x")); // displays 0d9abd7...

Convert a bit-string to a char

I'm trying to convert a bit-string to ASCII characters by 8 bits (each 8 bits = 1 ASCII char).
public string BitsToChar(string InpS)
{
    string RetS = "";
    for (int iCounter = 0; iCounter < InpS.Length / 8; iCounter++)
        RetS = System.String.Concat(RetS, (char)Convert.ToByte(InpS.Substring(iCounter * 8, 8)), 2);
    return RetS;
}
It throws a System.OverflowException: "Value was either too large or too small for an unsigned byte."
It's not clear to me how an 8-bit portion of a binary string can be too small or too large for an 8-bit Byte type.
Any ideas? Thank you.
Try something like this:
private static Char ConvertToChar(String value) {
    int result = 0;
    foreach (Char ch in value)
        result = result * 2 + ch - '0';
    return (Char)result;
}

public string BitsToChar(string value) {
    if (String.IsNullOrEmpty(value))
        return value;
    StringBuilder Sb = new StringBuilder();
    for (int i = 0; i < value.Length / 8; ++i)
        Sb.Append(ConvertToChar(value.Substring(8 * i, 8)));
    return Sb.ToString();
}
...
String result = BitsToChar("010000010010000001100010"); // <- "A b"
Do something like this
public string BitsToChar(string InpS)
{
    string RetS = "";
    foreach (char c in InpS)
    {
        RetS = RetS + System.Convert.ToInt32(c);
    }
    return RetS;
}
Or try something like this:
public static string BitsToChar(string bitString)
{
    var retString = new StringBuilder();
    foreach (Match match in Regex.Matches(bitString, "[01]{8}")) // 8 bits per character
    {
        retString.Append((Char)Convert.ToByte(match.Value, 2));
    }
    return retString.ToString();
}
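Called the same way as in the earlier example:

string result = BitsToChar("010000010010000001100010");
Console.WriteLine(result); // "A b"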
