I'm using the C# .NET MySQL Connector, and when running this query:
INSERT INTO convos (`userid`,`time`,`from`,`content`,`read`,`deleted`, `ip`, `source`, `charname`, `to`) VALUES ('3', '1347396787', 'Chára', '........', '0', '0', '0.0.0.0:0000', 'C', 'óóóíííí', 'óóóíííí');
I get the following error:
Incorrect string value: '\xE1ra' for column 'from' at row 1
As far as I understand my encoding, everything was configured for utf8 with utf8_general_ci: the database, table and columns are all set to utf8, and the data is sent from the client as utf8.
If I use a third-party tool like Workbench to run the insert, or use the mysql command line, it works fine. I don't know if there is a bug in the connector or if I need to be doing something else with the values before inserting.
Any idea?
Thanks
Is there any way in MySQL to convert to the correct type?
I believe you need to alter the column's char set:
Use the statement below for the columns that should be using UTF-8:
ALTER TABLE database.table MODIFY COLUMN col VARCHAR(255)
CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL;
Prefix Unicode strings with N
First, check your convos table and make sure the column data types are nchar or nvarchar. You must also precede all Unicode string constants with the N prefix when you deal with them in SQL Server.
Try:
insertQuery = "INSERT INTO convos (`userid`,`time`,`from`,`content`,`read`,`deleted`, `ip`, `source`, `charname`, `to`) VALUES
(N'3', N'1347396787', N'Chára', N'........', N'0', N'0', N'0.0.0.0:0000', N'C', N'óóóíííí', N'óóóíííí')";
I figured this out. It took a while, but it seems I was setting the charset too often. The database, tables and columns are all in UTF8. When I made a connection I had "CHARSET=UTF8" in the connection string, and I was also running "SET NAMES 'utf8' COLLATE 'utf8_general_ci'" every time I made a connection. I dropped both the CHARSET=UTF8 and the "SET NAMES 'utf8' COLLATE 'utf8_general_ci'" and it's all working now.
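For reference, here is a minimal sketch of what the working setup might look like (placeholder credentials and a trimmed column list, not the poster's actual code): the connection string no longer forces CHARSET=UTF8 and no SET NAMES is issued, leaving the utf8 column definitions and the connector's own negotiation to handle the encoding.

using MySql.Data.MySqlClient;

class InsertExample
{
    static void Main()
    {
        // Placeholder connection string: note there is no CHARSET=UTF8 and no SET NAMES is run.
        var connStr = "SERVER=localhost;DATABASE=mydb;USER=me;PASSWORD=secret;";
        using (var conn = new MySqlConnection(connStr))
        {
            conn.Open();
            using (var cmd = new MySqlCommand(
                "INSERT INTO convos (`userid`, `from`) VALUES (@userid, @from)", conn))
            {
                cmd.Parameters.AddWithValue("@userid", "3");
                cmd.Parameters.AddWithValue("@from", "Chára"); // accented text goes through unchanged
                cmd.ExecuteNonQuery();
            }
        }
    }
}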
Update it to:
INSERT INTO convos (`userid`,`time`,`from`,`content`,`read`,`deleted`, `ip`, `source`, `charname`, `to`) VALUES ('3', '1347396787', 'Chara', '........', '0', '0', '0.0.0.0:0000', 'C', 'óóóíííí', 'óóóíííí');
I think the 'Chára' in your third value is what gives you the error.
For someone who has tried all the suggestions, and nothing has worked (like myself), it is worth checking what MySQL types are your fields mapped to in C#. My text fields were automatically mapped as MySqlDbType.Blob and that was causing the error. I changed the type to MySqlDbType.Text, and I don't see the error any more.
Here is my original response to a similar thread:
https://stackoverflow.com/a/16989466/2199026
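A rough illustration of that fix, assuming MySql.Data and placeholder table/column names; the key line is declaring the parameter as MySqlDbType.Text instead of letting it be inferred as Blob.

using MySql.Data.MySqlClient;

// Sketch only: force the parameter type to Text so the connector does not map the
// string to MySqlDbType.Blob (the mapping that was causing the error here).
static void InsertContent(MySqlConnection connection, string text)
{
    using (var cmd = new MySqlCommand("INSERT INTO convos (`content`) VALUES (@content)", connection))
    {
        var p = cmd.Parameters.Add("@content", MySqlDbType.Text);
        p.Value = text;
        cmd.ExecuteNonQuery();
    }
}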
Configuring MySQL's character set in its server configuration will also solve the Unicode error when inserting into MySQL from Java or C#. (The original answer showed the relevant settings only as a screenshot, which is not reproduced here.)
Related
I am trying to insert the square root symbol (√) into my SQL Server database from an ASP.NET page. The radical symbol gets inserted as the letter v instead.
Where could I be going wrong?
Thanks for your assistance.
Your database column type should be nVarChar to insert Unicode characters.
Also you need to pass values like below:
cmd.Parameters.Add("#ColumnName", SqlDbType.NVarChar, 1024).Value = txtName.Text;
The column in your table should be able to store Unicode characters, so try changing its type to nvarchar.
While inserting, you should also put the N prefix before the value.
Let's say the column name is square_root and the table is test:
insert into test(square_root) values(N'√25 = 5')
I have the following query
UPDATE mytable
SET col1 = ENCRYPTBYPASSPHRASE ('Key', col2)
FROM mytable
when I decrypt it using
SELECT CONVERT(VARCHAR(20), DECRYPTBYPASSPHRASE ('Key', col1))
FROM mytable
The result returned is only the first character, for example if the field contains "Computer" the result is only "C".
col2 is probably nvarchar, not varchar. Try:
SELECT CONVERT(NVARCHAR(20), DECRYPTBYPASSPHRASE ('Key', col1))
FROM mytable
In nvarchar, the code points for standard ASCII letters are the same as in ASCII but padded out with a 0x00 byte.
When you cast that to varchar, the 0x00 is treated as a null character that terminates the string.
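You can see that padding from C#; the interleaved zero bytes are why only the first character survives the varchar conversion (illustration only).

using System;
using System.Text;

class NvarcharBytesDemo
{
    static void Main()
    {
        // UTF-16 (nvarchar) bytes of "Computer": every ASCII byte is followed by 0x00.
        byte[] utf16 = Encoding.Unicode.GetBytes("Computer");
        Console.WriteLine(BitConverter.ToString(utf16));
        // Prints: 43-00-6F-00-6D-00-70-00-75-00-74-00-65-00-72-00
    }
}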
After investigating I ran into several issues, so I will post what I came across so anyone can benefit from it.
If you changed the data type of the SQL column to varbinary, then make sure that when you decrypt the data you convert back using the same old data type. That is, if you had a varchar column containing data and then changed it to varbinary, you must decrypt it as varchar; if you use nvarchar, you will get garbage data.
You must encrypt and decrypt the same way. That is, if you load the passphrase from a stored procedure and use it for encrypting, but the SAME EXACT passphrase is loaded through a function for decryption, you will also get garbage data (I tested this but I do not know why it behaves this way; maybe there is some internal difference between how data is returned from stored procedures and from functions).
Hope this helps anyone out there!
Use CONVERT with the data type and size of the value you are encrypting before updating.
It looks like ENCRYPTBYKEY does not recognize the data properly as per the column schema.
Try as below
ENCRYPTBYKEY(KEY_GUID('<Key Name>'), CONVERT(varchar(20),col1))
I am having a problem with writing a string to a mysql database that contains a utf-8 character. Everything before the character gets written to the table, but the character and everything after it is not there.
I have checked the character sets of the database and the default collation is utf8_general_ci and the default characterset is utf8. The column being written to is type longtext with the collation utf8_general_ci.
I have also tried adding SET NAMES utf8; to the query but this did not change the result.
Here is an example of the code being run:
using (var cmd = new MySqlCommand("insert into tablename (BodyText) values (@p1)", connection as MySqlConnection) { CommandType = CommandType.Text })
{
cmd.Parameters.Add("#p1", BodyText);
cmd.ExecuteNonQuery();
}
The connection string is:
"SERVER=xx;DATABASE=xx;USER=xx;PASSWORD=xx;Pooling=true;Validate Connection=true;CHARSET=UTF8"
And the text being written to the table is "Thank you! I saw it as a 2 😝 more text...", but what ends up in the table is "Thank you! I saw it as a 2 ".
Any help on the matter would be appreciated.
Update: after further research, the problem appears to be that MySQL's base utf8 encoding does not support 4-byte characters (which 😝 is). The solutions are either to update the MySQL database to use utf8mb4, or to remove such characters from strings before writing to the table. The problem with the second solution is that on a large codebase this check would have to be done everywhere text is written to the database. Any suggestions on how to handle this issue would be welcome.
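If stripping is the route you take, here is a hedged sketch of that second option (a hypothetical helper, not a library API): it removes any character outside the Basic Multilingual Plane, i.e. anything that needs 4 bytes in utf8 such as 😝, before the value reaches MySQL.

using System.Text;

static string StripNonBmp(string input)
{
    // Hypothetical helper: drop supplementary-plane characters (surrogate pairs in .NET strings).
    var sb = new StringBuilder(input.Length);
    for (int i = 0; i < input.Length; i++)
    {
        if (char.IsSurrogate(input[i]))
        {
            // Skip the whole surrogate pair (one 4-byte utf8 character).
            if (char.IsHighSurrogate(input[i]) && i + 1 < input.Length && char.IsLowSurrogate(input[i + 1]))
                i++;
            continue;
        }
        sb.Append(input[i]);
    }
    return sb.ToString();
}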
You can probably do this to change a column from utf8 to utf8mb4 without losing or mangling any data:
ALTER TABLE tbl MODIFY COLUMN col ... CHARACTER SET utf8mb4 ...;
(Where the '...' is the stuff currently used to define the column, such as LONGTEXT.) You might want to specify the collation at the same time.
If that fails, this will probably work:
ALTER TABLE Tbl MODIFY COLUMN col LONGBLOB ...;
ALTER TABLE Tbl MODIFY COLUMN col LONGTEXT ... CHARACTER SET utf8mb4 ...;
If col is in any indexes, you might want to DROP INDEX in the first ALTER and ADD INDEX in the second. (This is for efficiency and possibly to avoid index limitations.)
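If the client keeps an explicit charset in its connection string, it may need to change to match the new column charset as well; whether Connector/NET accepts utf8mb4 there depends on the connector version, so treat this as an assumption to verify rather than a guaranteed fix.

// Assumption: a connection string pinned to the new charset (verify against your Connector/NET version).
var connStr = "SERVER=xx;DATABASE=xx;USER=xx;PASSWORD=xx;Pooling=true;Validate Connection=true;CHARSET=utf8mb4";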
How can I insert Arabic characters into a SQL Server database? I tried to insert Arabic data into a table and the Arabic characters in the insert script were inserted as '??????' in the table.
I tried to directly paste the data into the table through SQL Server Management Studio, and the Arabic characters were successfully and accurately inserted.
I looked around for resolutions to this problem, and some threads suggested changing the data type to nvarchar instead of varchar. I tried this as well, but without any luck.
How can we insert Arabic characters into SQL Server database?
For the field to be able to store unicode characters, you have to use the type nvarchar (or other similar like ntext, nchar).
To insert the unicode characters in the database you have to send the text as unicode by using a parameter type like nvarchar / SqlDbType.NVarChar.
(For completeness: if you are creating SQL dynamically (against common advice), you put an N before a string literal to make it unicode. For example: insert into table (name) values (N'Pavan').)
I guess the solution is to first change the field to ntext, then write the value with the N prefix. For example:
insert into eng(Name) values(N'حسن')
If you are trying to load data directly into the database like me, I found a great way to do so by creating a table in Excel and then exporting it as CSV. Then I used a SQLite database browser to import the data correctly into the SQL database. You can then adjust the table properties if needed. Hope this helps.
I am using SQL Server 2008, Visual Web Developer 2012 and .net 4.0. I created a table in SQL Server and added some columns to it. I gave some columns the datatype nchar(10).
Now my problem is that when I insert a string of fewer than 10 characters into a column of type nchar(10) and then fetch the value, it comes back padded with blank spaces to complete the 10-character string.
That means if I insert "a" into a column of type nchar(10),
then when I fetch the value again I get back: "a         "
How can I resolve this issue ?
You can do this to trim the whitespace:
SELECT RTRIM(CAST(col As NVARCHAR(10))) FROM test
Check out SQLFIDDLE
Define the column as nvarchar(10) and it will work fine.
If you don't want to change the data type, then you will need to trim the spaces from the output:
SELECT
RTRIM("a ") AS ColumnName
FROM MyTable
But this means you have to do this every place you are using the column. It is better to use VARCHAR(10) or NVARCHAR(20), where VAR... means variable length. So if your string does not fill all 10 characters, no spaces are added.
Final SQL
SELECT
RTRIM(ColumnName) AS ColumnName
FROM myTable
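If trimming on the SQL side is inconvenient, the padding can also be removed after reading the value in C#; a small sketch with placeholder names:

using System.Data.SqlClient;

static string ReadFirstValue(SqlConnection connection)
{
    // Fetch the nchar(10) value and strip the trailing pad spaces on the client.
    using (var cmd = new SqlCommand("SELECT TOP 1 ColumnName FROM myTable", connection))
    {
        var value = (string)cmd.ExecuteScalar();
        return value.TrimEnd(); // "a         " -> "a"
    }
}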