So I'm trying to encrypt data using DES in C#.
I have the following code:
static public string Encrypt(string _dataToEncrypt) {
    SymmetricAlgorithm algorithm = DES.Create();
    ICryptoTransform transform = algorithm.CreateEncryptor(key, iv);
    byte[] inputbuffer = Encoding.Unicode.GetBytes(_dataToEncrypt);
    byte[] outputBuffer = transform.TransformFinalBlock(inputbuffer, 0, inputbuffer.Length);
    return Convert.ToBase64String(outputBuffer);
}

static public string Decrypt(string _dataToDecrypt) {
    SymmetricAlgorithm algorithm = DES.Create();
    ICryptoTransform transform = algorithm.CreateDecryptor(key, iv);
    byte[] inputbuffer = Convert.FromBase64String(_dataToDecrypt); // Here is the problem.
    byte[] outputBuffer = transform.TransformFinalBlock(inputbuffer, 0, inputbuffer.Length);
    return Encoding.Unicode.GetString(outputBuffer);
}
And I'm getting an error: System.FormatException: 'Invalid length for a Base-64 char array or string.'
It works when the string has an even number of characters.
Is it even possible to encrypt/decrypt data with an odd number of characters?
DES, like AES, has no limit on what can be encrypted; the problem is elsewhere.
It looks like a Base64 encoding problem, given the line the error occurs on.
Perhaps trailing "=" characters were stripped from the Base64 string.
Information:
DES is a block-based encryption algorithm, so the input must be an exact multiple of the block size, which is 8 bytes for DES. When the input is not an exact multiple, padding must be added; the easiest way is to let the implementation do that for you by specifying a padding mode, generally PKCS#5 for DES.
For SymmetricAlgorithm, set the Padding property to PKCS7; it is always best to fully specify everything and not rely on defaults:
algorithm.Padding = PaddingMode.PKCS7;
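For reference, a minimal sketch of the question's pair with mode and padding spelled out. It assumes the same 8-byte key and iv fields as the original code, and it does not address the stripped-"=" transport issue:
static public string Encrypt(string dataToEncrypt)
{
    using SymmetricAlgorithm algorithm = DES.Create();
    algorithm.Mode = CipherMode.CBC;           // be explicit instead of relying on defaults
    algorithm.Padding = PaddingMode.PKCS7;     // pads the last block to a multiple of 8 bytes
    using ICryptoTransform transform = algorithm.CreateEncryptor(key, iv);
    byte[] inputBuffer = Encoding.Unicode.GetBytes(dataToEncrypt);
    byte[] outputBuffer = transform.TransformFinalBlock(inputBuffer, 0, inputBuffer.Length);
    return Convert.ToBase64String(outputBuffer);
}

static public string Decrypt(string dataToDecrypt)
{
    using SymmetricAlgorithm algorithm = DES.Create();
    algorithm.Mode = CipherMode.CBC;
    algorithm.Padding = PaddingMode.PKCS7;
    using ICryptoTransform transform = algorithm.CreateDecryptor(key, iv);
    byte[] inputBuffer = Convert.FromBase64String(dataToDecrypt);
    byte[] outputBuffer = transform.TransformFinalBlock(inputBuffer, 0, inputBuffer.Length);
    return Encoding.Unicode.GetString(outputBuffer);
}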
Related
I wanted to write a simple message encrypter to dip my toes into the matter, but I can't make it work. The problem is that whatever input I start with, it sometimes encrypts it, but when I try to decrypt it, it just doesn't return the original string. It would be really helpful if you could tell me what I'm doing wrong or guide me in the right direction.
Complete code
These are the sections in charge of encrypting and decrypting.
void Decrypt()
{
    using var crypt = Aes.Create();
    string[] input = ClipboardService.GetText()?.Split(SEPARATOR) ?? Array.Empty<string>();
    byte[] key = input[0].ToBytes();
    byte[] IV = input[^1].ToBytes();
    byte[] value = string.Join(string.Empty, input[1..^1]).ToBytes();
    crypt.IV = IV;
    crypt.Key = key;
    var decryptedValue = crypt.DecryptCbc(value, IV, PaddingMode.Zeros);
    string decryptedValueInText = decryptedValue.ToUnicodeString();
    ClipboardService.SetText(decryptedValueInText);
    LogInfoMessage($"{decryptedValueInText}: {decryptedValue.Length}");
    crypt.Clear();
}

void Encrypt()
{
    using var crypt = Aes.Create();
    crypt.GenerateKey();
    string value = ClipboardService.GetText() ?? string.Empty;
    var encryptedValue = crypt.EncryptCbc(value.ToBytes(), crypt.IV, PaddingMode.Zeros);
    string encryptedValueInText = $"{crypt.Key.ToUnicodeString()}{SEPARATOR}{encryptedValue.ToUnicodeString()}{SEPARATOR}{crypt.IV.ToUnicodeString()}";
    ClipboardService.SetText(encryptedValueInText);
    LogInfoMessage($"{encryptedValueInText}: {encryptedValue.Length}");
    crypt.Clear();
}
There are two extension methods:
public static string ToUnicodeString(this byte[] bytes) => Encoding.Unicode.GetString(bytes);
public static byte[] ToBytes(this string str) => Encoding.Unicode.GetBytes(str);
Example
The input links were:
https://www.youtube.com/
https://www.youtube.com/watch?v=bSA91XTzeuA
I don't think it matters because the key and IV are autogenerated every time anyway, but still.
Per our discussion...
Using the clipboard to store binary data as Unicode text will fail due to invalid UTF-16 code points. UTF-16 uses multi-word encoding for certain Unicode characters: 32-bit surrogate pairs encode code points from the supplementary planes. There are plenty of primers on the UTF-16 encoding, but basically you have a pair of 16-bit values where the first must be in the range 0xD800-0xDBFF and the second in the range 0xDC00-0xDFFF. Odds are your encrypted data will break this rule.
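A quick way to see this for yourself (an illustrative sketch, not the poster's code): a few bytes that decode to a lone surrogate do not survive a UTF-16 round trip.
using System;
using System.Linq;
using System.Text;

// UTF-16LE bytes: U+D800 (an unpaired high surrogate), then 'A'.
// The decoder substitutes U+FFFD for the lone surrogate, so the original bytes are silently lost.
byte[] raw = { 0x00, 0xD8, 0x41, 0x00 };
string asText = Encoding.Unicode.GetString(raw);
byte[] back = Encoding.Unicode.GetBytes(asText);
Console.WriteLine(raw.SequenceEqual(back));   // False - encrypted bytes round-tripped this way get corrupted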
As noted, if your encrypted binary data must be sent through a text-only transport you should encode the bytes in the encrypted block using Base64 or similar.
I'd also like to stress that writing methods that can be called with parameters rather than directly accessing the clipboard for I/O makes it much simpler to do testing, including round-trip tests on the various parts of the problem. Proving that the codec is working without reference to the clipboard is a good test and separation of concerns helps to more readily identify the source of problems in the future.
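As a sketch of what the Base64 route might look like, reusing the ClipboardService and SEPARATOR names from the question; everything else here is an assumption (including that SEPARATOR is not a character that can appear in Base64), not a drop-in fix:
void Encrypt()
{
    using var crypt = Aes.Create();
    string value = ClipboardService.GetText() ?? string.Empty;
    byte[] cipher = crypt.EncryptCbc(Encoding.Unicode.GetBytes(value), crypt.IV, PaddingMode.PKCS7);
    // Base64 keeps the key, ciphertext and IV clipboard-safe.
    ClipboardService.SetText(string.Join(SEPARATOR,
        Convert.ToBase64String(crypt.Key),
        Convert.ToBase64String(cipher),
        Convert.ToBase64String(crypt.IV)));
}

void Decrypt()
{
    using var crypt = Aes.Create();
    string[] parts = (ClipboardService.GetText() ?? string.Empty).Split(SEPARATOR);
    crypt.Key = Convert.FromBase64String(parts[0]);
    byte[] cipher = Convert.FromBase64String(parts[1]);
    byte[] iv = Convert.FromBase64String(parts[2]);
    byte[] plain = crypt.DecryptCbc(cipher, iv, PaddingMode.PKCS7);
    ClipboardService.SetText(Encoding.Unicode.GetString(plain));
}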
My method AESEncrypt(string text) is returning a byte array.
If I encrypt a message and use the returned byte array as input for AESDecrypt(byte[] text), everything works fine. The problem is that I need to convert it to a string and vice versa, so I tried the following:
byte[] encrypted = enc.AESEncrypt("Testmessage");
string encryptedStr = Convert.ToBase64String(encrypted);
byte[] test = Convert.FromBase64String(encryptedStr);
Console.WriteLine((encrypted == test));
I also tried this with Encoding.ASCII.GetString(), Encoding.UTF8.GetString(),
but encrypted == test returns false every time...
What method do I need to use to convert the AES byte[] to a string and vice versa?
This is the AESEncrypt method:
public byte[] AESEncrypt(string s)
{
    byte[] encrypted;
    using (AesManaged aes = new AesManaged()) {
        ICryptoTransform encryptor = aes.CreateEncryptor(AESKey, AESIV);
        using (MemoryStream ms = new MemoryStream()) {
            using (CryptoStream cs = new CryptoStream(ms, encryptor, CryptoStreamMode.Write)) {
                using (StreamWriter sw = new StreamWriter(cs)) {
                    sw.Write(s);
                }
                encrypted = ms.ToArray();
            }
        }
    }
    return encrypted;
}
An encrypted payload held in a byte array is not directly convertible to a string, or at least not without using an ANSI encoding and both sides (encoding and decoding) agreeing on the string's code page. And if you use any Unicode encoding (UTF-8, UTF-16, ...) you're bound to have bytes that form invalid code points, which can't be decoded to a character.
That's where Base64 comes into play. It is a safe way to represent byte arrays as ASCII strings, a subset implemented by almost every (if not every) encoding. So using that Base64 code is fine.
For the comparison you simply want encrypted.SequenceEqual(test), as explained in Comparing two byte arrays in .NET.
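A minimal illustration, reusing the variables from the question (note the LINQ method is SequenceEqual and needs using System.Linq):
using System.Linq;

byte[] encrypted = enc.AESEncrypt("Testmessage");
string encryptedStr = Convert.ToBase64String(encrypted);
byte[] test = Convert.FromBase64String(encryptedStr);

Console.WriteLine(encrypted == test);              // False: == compares array references
Console.WriteLine(encrypted.SequenceEqual(test));  // True: compares the byte contents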
Base64 is used directly for this. Here is an example:
Encode
public static string Base64Encode(string plainText)
{
    var plainTextBytes = System.Text.Encoding.UTF8.GetBytes(plainText);
    return System.Convert.ToBase64String(plainTextBytes);
}
Decode
public static string Base64Decode(string base64EncodedData)
{
    var base64EncodedBytes = System.Convert.FromBase64String(base64EncodedData);
    return System.Text.Encoding.UTF8.GetString(base64EncodedBytes);
}
Consider byte[] encrypted and byte[] test: when you test for equality with ==, by default the references are compared, not their content. This explains why your test encrypted == test fails.
You are also asking how to convert a byte[] into a string, which is not related to your encrypted == test check at all. In general the various System.Text.Encoding.*.GetString(byteArray) methods perform the conversion, but you need to know which encoding was used to produce the byteArray. This information has to be passed along separately; you might have a specification that says all byte arrays are encoded in UTF-8, or you might pass the encoding along together with the data, but there is no general answer.
I have a client who needs us to use Blowfish ECB encryption with cipherMode 0, output 1. I have tried to solve for this, but I'm getting stuck. How do I fix my code?
Here are the full client instructions:
Algorithm: Blowfish
・ Mode: ECB
・ Padding: PKCS5Padding
*Initial vector is unnecessary because we use ECB mode.
Example
・Encrypt Key: 2fs5uhnjcnpxcpg9
→ Plain Text : 3280:99:20120201123050
→ Cipher Text : daa745f1901364c0bd42b9658db3db96336758cd34b2a576
* Please keep Cipher Text with 16 hexadecimal characters.
* Please generate Cipher Text without “salt”.
I need to write this in C#. Here's what I did, but it doesn't seem to be working:
string member_id = "3280";
string panelType = "99";
string RandomString = "20120201123050";
string encryptionKey = "2fs5uhnjcnpxcpg9";
string cryptstr = member_id + ":" + panelType + ":" + RandomString;
string plainText = cryptstr;
BlowFish b = new BlowFish(encryptionKey);
string cipherText = b.Encrypt_ECB("3280:99:20120201123050");
The result is not daa745f1901364c0bd42b9658db3db96336758cd34b2a576. Where did I go wrong?
You're calling Encrypt_ECB(), so I assume it's Schneier's Blowfish class.
The constructor expects a hexadecimal string if one is passed; you need the overload that takes a byte array:
BlowFish b = new BlowFish(Encoding.UTF8.GetBytes(encryptionKey));
The output is still not correct; let's see what it really should be by decrypting their example:
string clear = b.Decrypt_ECB("daa745f1901364c0bd42b9658db3db96336758cd34b2a576");
gives us:
"3280:99:20120201123050\u0002\u0002"
Which is good, but there are two 0x02 bytes on the end; the N bytes of value 0xN are the PKCS padding. To get a match you need to pad the input yourself:
// input to bytes
List<byte> clearBytes = new List<byte>(Encoding.UTF8.GetBytes("3280:99:20120201123050"));
// how many padding bytes?
int needPaddingBytes = 8 - (clearBytes.Count % 8);
// add them
clearBytes.AddRange(Enumerable.Repeat((byte)needPaddingBytes, needPaddingBytes));
// encrypt
byte[] cipherText = b.Encrypt_ECB(clearBytes.ToArray());
// to hex
string cipherTextHex = BitConverter.ToString(cipherText).Replace("-", "").ToLowerInvariant();
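If you also need to strip that padding after decrypting on your end, a small helper like this would do it (a sketch assuming well-formed PKCS#5 padding on 8-byte blocks):
// Removes PKCS#5/#7 padding: the final byte's value is the number of padding bytes appended.
static byte[] RemovePkcs5Padding(byte[] padded)
{
    int padCount = padded[^1];
    if (padCount < 1 || padCount > 8 || padCount > padded.Length)
        throw new ArgumentException("Invalid PKCS#5 padding.");
    return padded[..^padCount];
}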
I implemented the AES 256-bit algorithm in C#, but I am encrypting 128-bit blocks of plain text, which requires padding. I don't want to pad; I want to use a stream cipher instead of 128-bit blocks and encrypt the stream byte by byte.
CryptLib _crypt = new CryptLib();
//string plainText = "This is the text to be encrypted";
String iv = CryptLib.GenerateRandomIV(16); //16 bytes = 128 bits
string key = CryptLib.getHashSha256("my secret key", 31); //32 bytes = 256 bits
MessageBox.Show(arm);
String cypherText = _crypt.encrypt(string1, key, iv);
Console.WriteLine("iv=" + iv);
Console.WriteLine("key=" + key);
Console.WriteLine("Cypher text=" + cypherText);
MessageBox.Show(cypherText);
textBox1.Text = cypherText;
Console.WriteLine("Plain text =" + _crypt.decrypt(cypherText, key, iv));
MessageBox.Show(_crypt.decrypt(cypherText, key, iv));
String dypher = _crypt.decrypt(cypherText, key, iv);
string outp = string.Empty;
char[] value = dypher.ToCharArray();
If the input data is always an exact multiple of the block size, you can just specify no padding.
If you have data of unknown, non-uniform lengths, padding is the general way to handle that.
Why do you not want to use padding?
Additionally:
It is common to prefix the encrypted data with the IV for use during decryption. The IV does not need to be secret, and with this method it does not need to be shared some other way and can easily be a different random value for each encryption.
Deriving a key from a password (string) with a plain hash function is not considered secure; instead use a key derivation function such as PBKDF2.
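A rough sketch of both points combined; the method names, iteration count, and salt handling here are illustrative assumptions, not a prescribed recipe:
using System;
using System.Security.Cryptography;
using System.Text;

static byte[] EncryptWithPrefixedIv(string plainText, string password, byte[] salt)
{
    byte[] plain = Encoding.UTF8.GetBytes(plainText);

    // Derive a 256-bit key from the password with PBKDF2 rather than a single hash pass.
    using var kdf = new Rfc2898DeriveBytes(password, salt, 100_000, HashAlgorithmName.SHA256);
    using var aes = Aes.Create();
    aes.Key = kdf.GetBytes(32);
    aes.GenerateIV();                                    // a fresh random IV for every message

    using ICryptoTransform enc = aes.CreateEncryptor();
    byte[] cipher = enc.TransformFinalBlock(plain, 0, plain.Length);

    // Prefix the IV to the ciphertext; it is not secret, and the receiver needs it to decrypt.
    byte[] output = new byte[aes.IV.Length + cipher.Length];
    Buffer.BlockCopy(aes.IV, 0, output, 0, aes.IV.Length);
    Buffer.BlockCopy(cipher, 0, output, aes.IV.Length, cipher.Length);
    return output;
}

static string DecryptWithPrefixedIv(byte[] input, string password, byte[] salt)
{
    using var kdf = new Rfc2898DeriveBytes(password, salt, 100_000, HashAlgorithmName.SHA256);
    using var aes = Aes.Create();
    aes.Key = kdf.GetBytes(32);
    aes.IV = input[..16];                                // read back the prefixed 16-byte IV

    using ICryptoTransform dec = aes.CreateDecryptor();
    byte[] plain = dec.TransformFinalBlock(input, 16, input.Length - 16);
    return Encoding.UTF8.GetString(plain);
}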
I am doing MD5 hashing in both Android and C# at the same time, and the results should be the same for the same inputs. Is there any difference in the way it's done in the two languages?
I get different outputs in both cases. Here is the C# code for the MD5 calculation:
//this method hashes the values sent to it using MD5
public static String hashwithmd5(String toHashMD5)
{
    byte[] keyArray;
    MD5CryptoServiceProvider hashmd5 = new MD5CryptoServiceProvider();
    keyArray = hashmd5.ComputeHash(UTF8Encoding.UTF8.GetBytes(toHashMD5));
    hashmd5.Clear();
    return Convert.ToBase64String(keyArray, 0, keyArray.Length);
}
And here is the code for MD5 in Android using Bouncy Castle:
public byte[] Hashing(String toHash) throws Exception {
    byte[] hashBytes = toHash.getBytes("UTF-8");
    EditText et = (EditText) findViewById(R.id.entry);
    org.bouncycastle.crypto.digests.MD5Digest digest = new org.bouncycastle.crypto.digests.MD5Digest();
    digest.reset();
    digest.update(hashBytes, 0, hashBytes.length);
    int length = digest.getDigestSize();
    byte[] md5 = new byte[length];
    digest.doFinal(md5, 0);
    et.setText(md5.toString());
    return md5;
}
The result of MD5 in C# is: XUFAKrxLKna5cZ2REBfFkg==
The result of MD5 in Android is: [B#4053cf40
The C# code converts the hash to Base64; the Java code does not. If you convert both raw hashes to, e.g., hex strings, they'll be the same.
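For instance, on the C# side (an illustrative sketch; the Android side would need the matching hex conversion of its byte[]; the Base64 value below matches the question's C# output, so the original input appears to have been "hello"):
byte[] hash;
using (var md5 = MD5.Create())
{
    hash = md5.ComputeHash(Encoding.UTF8.GetBytes("hello"));
}
// The same 16 bytes in two textual representations:
string asBase64 = Convert.ToBase64String(hash);                                  // "XUFAKrxLKna5cZ2REBfFkg=="
string asHex = BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();  // "5d41402abc4b2a76b9719d911017c592"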
When you use this in Java:
byte[] md5 = new byte[length];
// ...
md5.toString()
you are not getting a representation of the byte values. You get the generic "string representation" of an object. Here, [B#4053cf40 basically means "array of bytes (that's for the '[B') which internally happens to be at address 4053cf40".
Use android.util.Base64 to convert your bytes to a Base64 encoded string.
@erik is correct. MD5 is no longer considered a "secure" hash; use SHA-256.
Erik is absolutely right. MD5 usage is nearing extinction; use any strong SHA variant.
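A minimal C# sketch of that swap, mirroring the hashwithmd5 method above:
public static string HashWithSha256(string toHash)
{
    using var sha256 = SHA256.Create();
    byte[] hash = sha256.ComputeHash(Encoding.UTF8.GetBytes(toHash));
    return Convert.ToBase64String(hash);   // or a hex string, as long as both platforms agree
}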