I'm working with SQL Server 2014. One of the features of my web app is to upload CSV files, and import the data into a table (called TF) in my database (called TMPA).
I'm not sure how best to do this. Here is what I have tried so far:
string excelPath = Server.MapPath("~/Files/") + Path.GetFileName(FileUpload1.PostedFile.FileName);
FileUpload1.SaveAs(excelPath);
SqlConnection con = new SqlConnection(@"Data Source=SAMSUNG-PC\SQLEXPRESS;Initial Catalog=TMPA;Persist Security Info=True");
StreamReader sr = new StreamReader(excelPath);
string line = sr.ReadLine(); // first line = header row, used for the column names
string[] value = line.Split(',');
DataTable dt = new DataTable();
DataRow row;
foreach (string dc in value)
{
dt.Columns.Add(new DataColumn(dc));
}
while (!sr.EndOfStream)
{
value = sr.ReadLine().Split(',');
if (value.Length == dt.Columns.Count)
{
row = dt.NewRow();
row.ItemArray = value;
dt.Rows.Add(row);
}
}
SqlBulkCopy bc = new SqlBulkCopy(con.ConnectionString, SqlBulkCopyOptions.TableLock);
bc.DestinationTableName = "TF";
bc.BatchSize = dt.Rows.Count;
con.Open();
bc.WriteToServer(dt);
bc.Close();
con.Close();
I tried this code, but it wouldn't work.
PS: TF has more columns than the CSV file: some of the columns are computed and should be calculated automatically after each insert.
Here is the layout of my CSV file (4 columns):
IdProduit,Mois,Reel,Budget
IdProduit is a string, Mois is a date, Reel and Budget are floats.
On the other hand, my SQL Server table looks like this :
|IdProduit|Mois|Reel|Budget|ReelPreviousMois|VarReelBudget|VarReelPrvM|...
|---------|----|----|------|----------------|-------------|-----------|-----
All the other columns should either be null or automatically calculated.
Help me!
I fixed it using an open-source .NET library called FileHelpers.
Here's the link: http://www.filehelpers.net/
Here's what I did:
<asp:FileUpload ID="FileUpload1" runat="server" />
<asp:Button ID="Button1" OnClick = "UploadF" runat="server" Text="Importer" />
And here's the code behind:
[DelimitedRecord("|")]
public class TBFtable
{
public string IdProduit;
public DateTime Mois;
public float Reel;
public float Budget;
}
protected void UploadF(object sender, EventArgs e)
{
string excelPath = Server.MapPath("~/Files/") + Path.GetFileName(FileUpload1.PostedFile.FileName);
FileUpload1.SaveAs(excelPath);
SqlServerStorage storage = new SqlServerStorage(typeof(TBFtable),ConfigurationManager.ConnectionStrings["bd"].ConnectionString);
storage.InsertSqlCallback = new InsertSqlHandler(GetInsertSqlCust);
TBFtable[] res = CommonEngine.ReadFile(typeof(TBFtable), excelPath) as TBFtable[];
storage.InsertRecords(res);
ScriptManager.RegisterClientScriptBlock(this, this.GetType(), "alertMessage", "alert('Données enregistrées avec succès !')", true);
}
protected string GetInsertSqlCust(object record)
{
TBFtable obj = (TBFtable) record;
return String.Format("INSERT INTO TF (IdProduit, Mois, Reel, Budget ) " + " VALUES ( '{0}' , '{1}' , '{2}' , '{3}' ); ", obj.IdProduit, obj.Mois,obj.Reel, obj.Budget );
}
You're on the right path. Using SqlBulkCopy will provide the best performance when inserting the data into SQL Server. However, rather than writing your own CSV parser, I would use the stellar one provided in the .NET Framework via the TextFieldParser class in the Microsoft.VisualBasic assembly. Note that SqlBulkCopy can work with a partial column set: if you add explicit column mappings for just the CSV columns, the remaining (computed) columns of the destination table are left untouched; alternatively, you could add the missing columns to your DataTable before sending it to SqlBulkCopy.
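For what it's worth, here is a minimal sketch of that approach. It assumes the four CSV columns from the question, a comma-delimited file with a header row, and a connection string you supply yourself; only the CSV columns are mapped, so the computed columns of TF are left for the server to fill in.
// Requires: System, System.Data, System.Data.SqlClient,
// and a reference to Microsoft.VisualBasic for TextFieldParser.
protected void ImportCsv(string csvPath, string connectionString)
{
    var dt = new DataTable();
    dt.Columns.Add("IdProduit", typeof(string));
    dt.Columns.Add("Mois", typeof(DateTime));
    dt.Columns.Add("Reel", typeof(double));
    dt.Columns.Add("Budget", typeof(double));

    using (var parser = new Microsoft.VisualBasic.FileIO.TextFieldParser(csvPath))
    {
        parser.TextFieldType = Microsoft.VisualBasic.FileIO.FieldType.Delimited;
        parser.SetDelimiters(",");
        parser.HasFieldsEnclosedInQuotes = true;
        parser.ReadFields(); // skip the header row (IdProduit,Mois,Reel,Budget)

        while (!parser.EndOfData)
        {
            string[] fields = parser.ReadFields();
            // Adjust the date/number parsing to the culture used in your file.
            dt.Rows.Add(fields[0], DateTime.Parse(fields[1]),
                        double.Parse(fields[2]), double.Parse(fields[3]));
        }
    }

    using (var bulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
    {
        bulkCopy.DestinationTableName = "TF";
        // Map only the CSV columns; the computed columns of TF are not touched.
        bulkCopy.ColumnMappings.Add("IdProduit", "IdProduit");
        bulkCopy.ColumnMappings.Add("Mois", "Mois");
        bulkCopy.ColumnMappings.Add("Reel", "Reel");
        bulkCopy.ColumnMappings.Add("Budget", "Budget");
        bulkCopy.WriteToServer(dt);
    }
}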
I know this is an old question, but for whoever may be interested.
Unless you're sure that your files will never be massive, you should avoid loading the whole batch into memory and sending it all at once to SQL Server, as is the case with the DataTable approach and (I think) the accepted answer. You risk an OutOfMemoryException on the client side (your file-processing server in this case) or, worse still, on the SQL Server side. You can avoid that by using the SqlBulkCopy class with an implementation of the IDataReader interface.
I wrote a package that I think could be of interest in cases such as yours. The code would look like so:
var dataReader = new CsvDataReader("pathToYourCsv",
new List<TypeCode>(4)
{
TypeCode.String, //IdProduit
TypeCode.DateTime, //Mois
TypeCode.Double, //Reel
TypeCode.Double //Budget
});
this.bulkCopyUtility.BulkCopy("tableName", dataReader);
There are also additional configuration options for more complex scenarios (flexible column mapping, additional static column values which are not present in the csv file, value transformation).
The package is open sourced (project on Github) and should work on .NET Core and .NET Framework.
As a side comment, the SQL Server recovery model can matter when doing massive imports. Whenever possible, use Simple or Bulk-Logged recovery to avoid a huge transaction log.
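For example, a rough sketch of switching the recovery model around a big import (TMPA is the database from the question; this assumes your login has ALTER DATABASE permission and that the normal recovery model is FULL):
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    // Keep the transaction log small while the bulk load runs.
    using (var cmd = new SqlCommand("ALTER DATABASE TMPA SET RECOVERY BULK_LOGGED;", connection))
        cmd.ExecuteNonQuery();

    try
    {
        this.bulkCopyUtility.BulkCopy("tableName", dataReader); // the import shown above
    }
    finally
    {
        // Switch back to the regular recovery model afterwards.
        using (var cmd = new SqlCommand("ALTER DATABASE TMPA SET RECOVERY FULL;", connection))
            cmd.ExecuteNonQuery();
    }
}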
I am looking for advice on the best way to parse a Microsoft Excel file and update/store the data into a given SQL Server database. I'm using ASP.NET MVC, so I plan on having a page/view take in an Excel spreadsheet; from that user-supplied file I will need to use C# to parse the data from the columns and update the database based on matches between the spreadsheet column that contains the key and the key column of the database table. The spreadsheet will always be in the same format, so I only need to handle one format. This seems like a pretty common task; I am just looking for the best way to approach it before getting started. I am using Entity Framework in my current application, but I don't have to use it.
I found this solution which seems like it could be a good option:
public IEnumerable<MyEntity> ReadEntitiesFromFile( string filePath )
{
    var myEntities = new List<MyEntity>();
    using ( var stream = File.Open( filePath, FileMode.Open, FileAccess.Read ) )
    using ( var reader = ExcelReaderFactory.CreateOpenXmlReader( stream ) )
    {
        while ( reader.Read() )
        {
            var myEntity = new MyEntity();
            myEntity.MyProperty1 = reader.GetString(1);
            myEntity.MyProperty2 = reader.GetInt32(2);
            myEntities.Add(myEntity);
        }
    }
    return myEntities;
}
Here is an example of what a file might look like (Clock# is the key column):
So given a file in this format, I want to match the user to the data table record using the Clock # and update that record with each cell's information. Each of the columns in the spreadsheet has a corresponding column in the data table. All help is much appreciated.
You can use the classes in the Microsoft.Office.Interop.Excel namespace, which abstract away most of what the solution you found does by hand. Instead of me rewriting it, you can check out this article: http://www.codeproject.com/Tips/696864/Working-with-Excel-Using-Csharp.
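A rough sketch of the Interop approach is below. It assumes Excel is installed on the machine running the code (often a deal-breaker on a web server), that the first worksheet holds the data, and that Clock# is in the first column; the method and dictionary shape are placeholders.
using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;
using Excel = Microsoft.Office.Interop.Excel;

public Dictionary<string, string[]> ReadRowsByClockNumber(string filePath)
{
    var rows = new Dictionary<string, string[]>();
    var app = new Excel.Application();
    Excel.Workbook workbook = null;
    try
    {
        workbook = app.Workbooks.Open(filePath, ReadOnly: true);
        var sheet = (Excel.Worksheet)workbook.Worksheets[1];
        Excel.Range used = sheet.UsedRange;

        // Row 1 is the header row, so start at row 2.
        for (int r = 2; r <= used.Rows.Count; r++)
        {
            var key = Convert.ToString(((Excel.Range)used.Cells[r, 1]).Value2); // Clock#
            if (string.IsNullOrEmpty(key)) continue;

            var values = new string[used.Columns.Count - 1];
            for (int c = 2; c <= used.Columns.Count; c++)
                values[c - 2] = Convert.ToString(((Excel.Range)used.Cells[r, c]).Value2);

            rows[key] = values; // update your database record keyed on Clock#
        }
    }
    finally
    {
        if (workbook != null) workbook.Close(false);
        app.Quit();
        Marshal.ReleaseComObject(app);
    }
    return rows;
}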
Better yet, why not bypass the middle man? You can use an existing ETL tool, such as Pentaho or Talend, to go straight from Excel to your database. These types of tools offer a lot of customization and are fairly straightforward to use. I've used Pentaho quite a lot for literally what you're describing, and it saved me the headache of writing the code myself. Unless you want or need to write it yourself, I think the latter is the best approach.
Try this:
public DataTable GetDataTableOfExcel(string file_path)
{
    using (OleDbConnection conn = new OleDbConnection())
    {
        DataTable dt = new DataTable();
        string Import_FileName = Server.MapPath(file_path);
        //Import_FileName = System.IO.Path.GetDirectoryName(file_path);
        string fileExtension = Path.GetExtension(Import_FileName);
        if (fileExtension == ".xlsx")
            conn.ConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + Import_FileName + ";" + "Extended Properties='Excel 12.0 Xml;HDR=YES;'";
        using (OleDbCommand comm = new OleDbCommand())
        {
            comm.CommandText = "Select * from [Sheet1$]";
            comm.Connection = conn;
            using (OleDbDataAdapter da = new OleDbDataAdapter())
            {
                da.SelectCommand = comm;
                da.Fill(dt); // Fill opens and closes the connection itself
            }
        }
        return dt;
    }
}
Now your data is in the DataTable, and you can build an insert query from the DataTable's data (or bulk copy it, as shown below).
file_path is the Excel file's full path, including the directory name.
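If you'd rather not build INSERT statements by hand, here is a short sketch of pushing that DataTable straight into SQL Server with SqlBulkCopy; the file path, connection string and destination table name are placeholders, and the destination columns are assumed to match the sheet's header row.
DataTable dt = GetDataTableOfExcel("~/Files/MyUpload.xlsx"); // hypothetical uploaded file

using (var bulkCopy = new SqlBulkCopy(yourConnectionString))
{
    bulkCopy.DestinationTableName = "YourTable";
    // Map by name so the column order in the sheet doesn't matter.
    foreach (DataColumn column in dt.Columns)
        bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
    bulkCopy.WriteToServer(dt);
}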
I'm performing a conversion of DBF + DBT files into SQL. I use the Microsoft.Jet.OLEDB.4.0 connector for accessing the files and SqlConnection for writing the data into MS SQL; to improve performance I use the SqlBulkCopy approach. Most of the files are converted fine, but on some of them the SqlBulkCopy.WriteToServer method throws an exception:
Record is deleted.
The search key was not found in any record.
The copy operation doesn't complete and I'm missing lots of records in the SQL.
Is there a way how to bypass this problem, or am I suppose to give up on SqlBulkCopy and copy row after row?
EDIT:
So I decided to PACK the database, but no luck so far. When I use vfpoledb for reading, it crashes even sooner because of a problem with casting decimals. So I'd like to use PACK first (with vfpoledb) and then the Jet OLE DB reader. Even though the PACK is executed, and I can see that the dbf and dbt files change, reader.GetValues() still throws the same exception.
try
{
string file = @"f:\Elims\dsm\CPAGEMET.DBF";
string tableName = Path.GetFileNameWithoutExtension(file);
var dbfConnectionString = string.Format(@"Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties='dBASE III;DELETED=YES;HDR=NO;IMEX=1'", Path.GetDirectoryName(file));
var packConnString = string.Format(@"Provider=vfpoledb;Data Source={0};Collating Sequence=machine;", file);
OleDbConnection packConnector = new OleDbConnection(packConnString);
packConnector.Open();
OleDbCommand command = new OleDbCommand(string.Format("PACK {0}",tableName), packConnector);
var result = command.ExecuteNonQuery();
packConnector.Close();
OleDbConnection oleConnector = new OleDbConnection(dbfConnectionString);
oleConnector.Open();
string cmd = string.Format("SELECT * FROM [{0}]", tableName);
var oleDbCommand = new OleDbCommand(cmd, oleConnector);
OleDbDataReader dataReader = oleDbCommand.ExecuteReader();
object[] values = new object[dataReader.FieldCount];
int iRow = 0;
while (dataReader.Read())
{
iRow++;
Console.WriteLine("Row " + iRow);
dataReader.GetValues(values);
}
oleConnector.Close();
}
catch (Exception e)
{
Console.WriteLine(e.Message + e.StackTrace);
}
Thanks
It looks like some of the records are marked as deleted. Restore them if you need these records, or delete them permanently (using the PACK command) if you don't.
So after some digging, I came to a final resolution, and I will just summarize my findings here. The "Record is deleted" exception was very misleading, as the problem never actually was a deleted record. There were 3 empty rows that were triggering the exception. Once these got deleted, everything started to work; I didn't even have to pack the database. This applied to the scenario where I used the Microsoft.Jet.OLEDB connector.
I tried to use the vfpoledb connector, but I ran into another problem with decimal numbers. I wrote a fix inspired by this msdn discussion, and not only did everything start to work (deleted and empty rows are successfully skipped), but the import is now also 15x faster than with the Jet connector (again using BulkCopy).
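For anyone who hits the same thing, here is a rough sketch of the kind of workaround that is possible with the Jet provider (not my exact code): materialize the rows one by one, skip the ones that cannot be read, and only then hand the result to SqlBulkCopy. The SQL Server connection string and the destination table are placeholders, and it assumes the error surfaces as an OleDbException; adjust the catch if it does not.
// dbfConnectionString and tableName are the same values as in the snippet above.
var dt = new DataTable();
using (var oleConnector = new OleDbConnection(dbfConnectionString))
{
    oleConnector.Open();
    var cmd = new OleDbCommand(string.Format("SELECT * FROM [{0}]", tableName), oleConnector);
    using (OleDbDataReader reader = cmd.ExecuteReader())
    {
        for (int i = 0; i < reader.FieldCount; i++)
            dt.Columns.Add(reader.GetName(i), reader.GetFieldType(i));

        var values = new object[reader.FieldCount];
        while (reader.Read())
        {
            try
            {
                reader.GetValues(values); // this is the call that threw on the blank rows
                dt.Rows.Add(values);
            }
            catch (OleDbException)
            {
                // Skip the unreadable (blank/deleted) row instead of failing the whole import.
            }
        }
    }
}

using (var bulkCopy = new SqlBulkCopy(sqlConnectionString)) // your MS SQL connection string
{
    bulkCopy.DestinationTableName = tableName; // assumes a table with matching columns exists
    bulkCopy.WriteToServer(dt);
}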
I have a webform with an upload button to upload a CSV file; my code then needs to parse the file and use the parsed data to insert into a SQL table. Is what I'm doing below correct for parsing the data into a List? It's not picking up the filename for the StreamReader. Is this the most effective way to parse the data, or should I parse into a DataTable instead?
protected void UploadBtn_Click(object sender, EventArgs e)
{
if (FileUpload.HasFile)
{
string filename = Path.GetFileName(FileUpload.FileName);
List<string[]> ValuesToUpload = parseData(filename);
//if (!Directory.Exists(ConfigurationManager.AppSettings["temp_dir"].ToString().Trim()))
//{
// Directory.CreateDirectory(ConfigurationManager.AppSettings["temp_dir"].ToString().Trim());
//}
//FileUpload.SaveAs(ConfigurationManager.AppSettings["temp_dir"].ToString().Trim() + filename);
//using (FileStream stream = new FileStream(ConfigurationManager.AppSettings["temp_dir"].ToString().Trim() + filename, FileMode.Open, FileAccess.Read, FileShare.Read))
}
}
public List<string[]> parseData(string filename)
{
    int j = 0;
    List<string[]> members = new List<string[]>();
    try
    {
        using (StreamReader read = new StreamReader(filename))
        {
            while (!read.EndOfStream)
            {
                string line = read.ReadLine();
                string[] values = line.Split(',');
                if (j == 0)
                {
                    // skip the header row
                    j++;
                    continue;
                }
                long memnbr = Convert.ToInt64(values[0]);
                int loannbr = Convert.ToInt32(values[1]);
                int propval = Convert.ToInt32(values[2]);
                members.Add(values);
            }
        }
    }
    catch (Exception)
    {
        // TODO: log/handle the parse error
    }
    return members;
}
Use KBCsv. We are getting 40K rows parsed per second, and 70K+ rows skipped per second. This is the fastest I have seen, and it's also pretty stable. Then generate the SQL manually, as suggested above. If you're doing a data reload and aiming for performance, run it multi-threaded with no transaction (MS SQL only). You can get up to 10K rows per second of import speed, depending on your network bandwidth to the database server.
Do not parse into a DataTable - it is very slow.
Since you're going to insert the data into a SQL table, I'd first create a class that represents the table and create a new object for each record (this is mainly for readability).
Or you could use one of the following approaches (assuming you're using MS SQL Server):
1. The Dynamic Insert Query
StringBuilder strInsertValues = new StringBuilder("VALUES ");

// ... your parsing loop here ...
string[] values = line.Split(',');
strInsertValues.AppendFormat("({0},{1},{2}),", values[0], values[1], values[2]);
// ... end of parsing loop ...

using (SqlConnection cn = new SqlConnection(YOUR_CONNECTION_STRING))
{
    SqlCommand cmd = cn.CreateCommand();
    cmd.CommandType = CommandType.Text;
    // trim the trailing comma from the VALUES list
    cmd.CommandText = "INSERT INTO YourTable(Column1, Column2, Column3) "
        + strInsertValues.ToString().Substring(0, strInsertValues.Length - 1);
    cn.Open();
    cmd.ExecuteNonQuery();
}
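As an aside, the concatenated VALUES string above breaks on values containing quotes or commas and is open to SQL injection. If you do stay with plain INSERTs, a parameterized sketch along these lines is safer; the table and column names are placeholders, and the parameter types follow the memnbr/loannbr/propval conversions in the question.
using (SqlConnection cn = new SqlConnection(YOUR_CONNECTION_STRING))
using (SqlCommand cmd = cn.CreateCommand())
{
    cmd.CommandType = CommandType.Text;
    cmd.CommandText = "INSERT INTO YourTable (Column1, Column2, Column3) VALUES (@memnbr, @loannbr, @propval)";
    cmd.Parameters.Add("@memnbr", SqlDbType.BigInt);
    cmd.Parameters.Add("@loannbr", SqlDbType.Int);
    cmd.Parameters.Add("@propval", SqlDbType.Int);
    cn.Open();

    foreach (string[] values in ValuesToUpload) // the list returned by parseData
    {
        cmd.Parameters["@memnbr"].Value = Convert.ToInt64(values[0]);
        cmd.Parameters["@loannbr"].Value = Convert.ToInt32(values[1]);
        cmd.Parameters["@propval"].Value = Convert.ToInt32(values[2]);
        cmd.ExecuteNonQuery();
    }
}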
2. Use BulkCopy (Recommended)
Create a DataTable that represents your CSV values
Add a new record for each line parsed
Create column mappings for your DataTable and the SQL table
Use the SqlBulkCopy object to insert your data
Ref to SqlBulkCopy: http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx
Not really an answer, but too long to post as a comment...
As it looks like you're throwing away your parsed values (memnbr, etc...), you could significantly reduce your csv parsing code to:
return
File
.ReadLines(filename)
.Skip(1)
.Select(line => line.Split(','))
.ToList();
The below code sample will bulk insert CSV data into a staging table that has matching columns and then will execute a Stored Procedure to normalize the data on the server.
This is significantly more efficient than manually parsing the data and inserting it line by line. A few months ago I used similar code to submit 1,500,000+ records to our database and normalize the data in a matter of seconds.
var sqlConnection = new SqlConnection(DbConnectionStringInternal);
// Bulk-import our unnormalized data from the .csv file into a staging table
var inputFileConnectionString = String.Format("Driver={{Microsoft Text Driver (*.txt; *.csv)}};Extensions=csv;Readonly=True;Dbq={0}", Path.GetDirectoryName(csvFilePath));
using (var inputFileConnection = new OdbcConnection(inputFileConnectionString))
{
inputFileConnection.Open();
var selectCommandText = String.Format("SELECT * FROM {0}", Path.GetFileName(csvFilePath));
var selectCommand = new OdbcCommand(selectCommandText, inputFileConnection);
var inputDataReader = selectCommand.ExecuteReader(CommandBehavior.CloseConnection);
var sqlBulkCopy = new SqlBulkCopy(sqlConnection) { DestinationTableName = "Data_Staging" };
if (sqlConnection.State != ConnectionState.Open)
sqlConnection.Open();
sqlBulkCopy.WriteToServer(inputDataReader);
}
// Run a stored-procedure to normalize the data in the staging table, then efficiently move it across to the "real" tables.
var addDataFromStagingTable = "EXEC SP_AddDataFromStagingTable";
if (sqlConnection.State != ConnectionState.Open)
sqlConnection.Open();
using (var addToStagingTableCommand = new SqlCommand(addDataFromStagingTable, sqlConnection) { CommandTimeout = 60 * 20 })
addToStagingTableCommand.ExecuteNonQuery();
sqlConnection.Close();
I'm trying to reindex a table in a simple database that I created using SQLite.NET and VS2008. I need to reindex tables after every DELETE command and here is the code snippet I have written (it does not work):
SQLiteCommand currentCommand;
String tempString = "REINDEX tf_questions";
//String tempString = "REINDEX [main].tf_questions";
//String tempString = "REINDEX main.tf_questions";
currentCommand = new SQLiteCommand(myConnection);
currentCommand.CommandText = tempString;
currentCommand.ExecuteNonQuery();
When run within my program, the code produces no errors, but it also doesn't reindex the "tf_questions" table. In the above example, you will also see the other query strings I've tried that also don't work.
Please help,
Thanks
I figured out a workaround for my problem. Consider the following code:
SQLiteCommand currentCommand;
String tempString;
String currentTableName = "tf_questions";
DataTable currentDataTable;
SQLiteDataAdapter myDataAdapter;
currentDataTable = new DataTable();
myDataAdapter = new SQLiteDataAdapter("SELECT * FROM " + currentTableName, myConnection);
myDataAdapter.Fill(currentDataTable);
//"tf_questions" is the name of the table
//"question_id" is the name of the primary key column in "tf_questions" (set to auto inc.)
//"myConnection" is and already open SQLiteConnection pointing to the db file
for (int i = currentDataTable.Rows.Count-1; i >=0 ; i--)
{
currentCommand = new SQLiteCommand(myConnection);
tempString = "UPDATE "+ currentTableName +"\nSET question_id=\'"+(i+1)+"\'\nWHERE (question_id=\'" +
currentDataTable.Rows[i][currentDataTable.Columns.IndexOf("question_id")]+"\')";
currentCommand.CommandText = tempString;
if( currentCommand.ExecuteNonQuery() < 1 )
{
throw new Exception("There was an error executing the REINDEX portion of the code...");
}
}
This code works, though I would have liked to have used the built-in SQL REINDEX command.
If you are doing this for the sake of performance, consider this answer:
REINDEX does not help. REINDEX is only needed if you change a collating sequence.
After a lot of INSERTs and DELETEs, you can sometimes get slightly better performance by doing a VACUUM. VACUUM improves locality of reference slightly.
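For completeness, running VACUUM from SQLite.NET is just another non-query command; here is a quick sketch against the already-open connection from the question:
// VACUUM rebuilds the database file, which can improve locality of reference
// after many INSERTs and DELETEs (see the quote above).
using (var vacuumCommand = new SQLiteCommand("VACUUM;", myConnection))
{
    vacuumCommand.ExecuteNonQuery();
}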
I am trying to save Unicode data (Greek) in an Oracle database (10g). I have created a simple table with an ID column (NUMBER) and a UNICODESTRING column (NVARCHAR2(100)).
I understand that NVARCHAR2 always uses UTF-16 encoding, so it should be fine for all (human) languages.
Then I try to insert a string into the database. I have hardcoded the string ("How are you?" in Greek) in the code. Then I try to get it back from the database and display it.
class Program
{
static string connectionString = "<my connection string>";
static void Main (string[] args) {
string textBefore = "Τι κάνεις;";
DeleteAll ();
SaveToDatabase (textBefore);
string textAfter = GetFromDatabase ();
string beforeData = String.Format ("Before: {0}, ({1})", textBefore, ToHex (textBefore));
string afterData = String.Format ("After: {0}, ({1})", textAfter, ToHex (textAfter));
Console.WriteLine (beforeData);
Console.WriteLine (afterData);
MessageBox.Show (beforeData);
MessageBox.Show (afterData);
Console.ReadLine ();
}
static void DeleteAll () {
using (var oraConnection = new OracleConnection (connectionString)) {
oraConnection.Open ();
var command = oraConnection.CreateCommand ();
command.CommandText = "delete from UNICODEDATA";
command.ExecuteNonQuery ();
}
}
static void SaveToDatabase (string stringToSave) {
using (var oraConnection = new OracleConnection (connectionString)) {
oraConnection.Open ();
var command = oraConnection.CreateCommand ();
command.CommandText = "INSERT into UNICODEDATA (ID, UNICODESTRING) Values (11, :UnicodeString)";
command.Parameters.Add (":UnicodeString", stringToSave);
command.ExecuteNonQuery ();
}
}
static string GetFromDatabase () {
using (var oraConnection = new OracleConnection (connectionString)) {
oraConnection.Open ();
var command = oraConnection.CreateCommand ();
command.CommandText = "Select * from UNICODEDATA";
var erpReader = command.ExecuteReader ();
string s = String.Empty;
while (erpReader.Read ()) {
string text = erpReader.GetString (1);
s += text + ", ";
}
return s;
}
}
static string ToHex (string input) {
string bytes = String.Empty;
foreach (var c in input)
bytes += ((int)c).ToString ("X4") + " ";
return bytes;
}
}
Here are the different outputs (screenshots omitted): the text before sending to the database shown in a message box, the text after getting it back from the database shown in a message box, and the console output.
Please can you suggest what I might be doing wrong here?
I can see five potential areas for problems:
How are you actually getting the text into your .NET application? If it's hardcoded in a string literal, are you sure that the compiler is assuming the right encoding for your source file?
There could be a problem in how you're sending it to the database.
There could be a problem with how it's being stored in the database.
There could be a problem with how you're fetching it in the database.
There could be a problem with how you're displaying it again afterwards.
Now areas 2-4 sound like they're less likely to be an issue than 1 and 5. How are you displaying the text afterwards? Are you actually fetching it out of the database in .NET, or are you using Toad or something similar to try to see it?
If you're writing it out again from .NET, I suggest you skip the database entirely - if you just display the string itself, what do you see?
I have an article you might find useful on debugging Unicode problems. In particular, concentrate on every place where the encoding could be going wrong, and make sure that whenever you "display" a string you dump out the exact Unicode characters (as integers) so you can check those rather than just whatever your current font wants to display.
EDIT: Okay, so the database is involved somewhere in the problem.
I strongly suggest that you remove anything like ASP and HTML out of the equation. Write a simple console app that does nothing but insert the string and fetch it again. Make it dump the individual Unicode characters (as integers) before and after. Then try to see what's in the database (e.g. using Toad). I don't know the Oracle functions to convert strings into sequences of individual Unicode characters and then convert those characters into integers, but that would quite possibly be the next thing I'd try.
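For example, Oracle's ASCIISTR function renders non-ASCII characters as \XXXX escapes, so a quick check along these lines (a sketch reusing the connection pattern from the question) shows the code points that are actually stored:
using (var oraConnection = new OracleConnection(connectionString))
{
    oraConnection.Open();
    var command = oraConnection.CreateCommand();
    // ASCIISTR('Τι κάνεις;') comes back as something like '\03A4\03B9 ...',
    // i.e. the UTF-16 code points actually stored in the column.
    command.CommandText = "SELECT ASCIISTR(UNICODESTRING) FROM UNICODEDATA";
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine(reader.GetString(0));
    }
}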
EDIT: Two more suggestions (good to see the console app, btw).
Specify the data type for the parameter, instead of just giving it an object. For instance:
command.Parameters.Add (":UnicodeString",
OracleType.NVarChar).Value = stringToSave;
Consider using Oracle's own driver instead of the one built into .NET. You may wish to do this anyway, as it's generally reckoned to be faster and more reliable, I believe.
You can determine what character set your database uses for NCHAR with the query:
SQL> SELECT VALUE
2 FROM nls_database_parameters
3 WHERE parameter = 'NLS_NCHAR_CHARACTERSET';
VALUE
------------
AL16UTF16
To check whether your database configuration is correct, you could run the following in SQL*Plus:
SQL> CREATE TABLE unicodedata (ID NUMBER, unicodestring NVARCHAR2(100));
Table created
SQL> INSERT INTO unicodedata VALUES (11, 'Τι κάνεις;');
1 row inserted
SQL> SELECT * FROM unicodedata;
ID UNICODESTRING
---------- ---------------------------------------------------
11 Τι κάνεις;
One more thing worth noting.
If you are using the Oracle client and would like to include Unicode characters in the CommandText, you should add the following line at the start of your application:
System.Environment.SetEnvironmentVariable("ORA_NCHAR_LITERAL_REPLACE", "TRUE");
This will allow you, in case you need it, to use the following syntax:
command.CommandText = "INSERT into UNICODEDATA (ID, UNICODESTRING) Values (11, N'Τι κάνεις;')";
After some investigation, here we go:
string input = "•";
char s = input[0];
//table kuuku with column kuku(nvarchar2(100))
string connString = "your connection";
//CLEAN TABLE
using (System.Data.OracleClient.OracleConnection cn = new System.Data.OracleClient.OracleConnection(connString))
{
cn.Open();
System.Data.OracleClient.OracleCommand cmd = new System.Data.OracleClient.OracleCommand("delete from kuku ", cn);
cmd.ExecuteNonQuery();
cn.Close();
}
//INSERT WITH PARAMETER BINDING - UNICODE SAVED
using (System.Data.OracleClient.OracleConnection cn = new System.Data.OracleClient.OracleConnection(connString))
{
cn.Open();
System.Data.OracleClient.OracleCommand cmd = new System.Data.OracleClient.OracleCommand("insert into kuku (kuku) values(:UnicodeString)", cn);
cmd.Parameters.Add(":UnicodeString", System.Data.OracleClient.OracleType.NVarChar).Value = input + " OK" ;
cmd.ExecuteNonQuery();
cn.Close();
}
//INSERT WITHOUT PARAMETER BINDING - UNICODE NOT SAVED
using (System.Data.OracleClient.OracleConnection cn = new System.Data.OracleClient.OracleConnection(connString))
{
cn.Open();
System.Data.OracleClient.OracleCommand cmd = new System.Data.OracleClient.OracleCommand("insert into kuku (kuku) values('" +input+" WRONG')", cn);
cmd.ExecuteNonQuery();
cn.Close();
}
//FETCH RESULT
using (System.Data.OracleClient.OracleConnection cn = new System.Data.OracleClient.OracleConnection(connString))
{
cn.Open();
System.Data.OracleClient.OracleCommand cmd = new System.Data.OracleClient.OracleCommand("select kuku from kuku", cn);
System.Data.OracleClient.OracleDataReader dr = cmd.ExecuteReader();
if(dr.Read())
{
string output = (string) dr[0];
char sa = output[0];
}
cn.Close();
}
On reading records, try
Encoding utf = Encoding.Default;
var utfBytes = odatareader.GetOracleString(0).GetNonUnicodeBytes();//OracleDataReader
Console.WriteLine(utf.GetString(utfBytes));
Solution: set NLS_LANG!
Details:
I just had the same problem, and actually had exactly the same situation as described in Sergey Bazarnik's investigation. Using bind variables it works, and without them it doesn't.
The SOLUTION is to set NLS_LANG in the proper place. Since I have a Windows server, I set it in the Windows registry under
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\ORACLE\KEY_OraClient11g_home1
Please note that the registry location may differ, so the easiest way is to search the registry for the "ORACLE_HOME" string. Other systems like Linux and Unix set this in a different way (export NLS_LANG ...).
In my case I put "NLS_LANG"="CROATIAN_CROATIA.UTF8". Since I didn't have that variable set at all, it fell back to the default value.
After changing the registry you should restart the process.
In my case I restarted IIS.
As for why it works with bind variables: with binding, the conversion happens on the server side, while without it the conversion happens on the client side. So even though the database can insert the proper values, the client applies unwanted "corrections" before that happens, because NLS_LANG defaults to a simpler code page and the client thinks it should convert. Instead of doing anything useful, that conversion creates the problem, which (as shown in the investigation) is hard to understand.
In case you have multiple Oracle versions, be sure to correct all versions in the registry (in my case Oracle 10 had a valid setting, but Oracle 11 had no NLS_LANG set at all).
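If editing the registry isn't an option (for example on a shared host), setting the variable for the current process before the first connection is opened may also work, depending on the Oracle client in use; treat this as an untested assumption and verify it in your environment:
// Hypothetical alternative to the registry edit: scope NLS_LANG to this process.
// It must run before the first Oracle connection is opened.
System.Environment.SetEnvironmentVariable("NLS_LANG", "CROATIAN_CROATIA.UTF8");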