I am trying to insert data from ASP.NET into SQL Server and retrieve it from SQL Server back into ASP.NET.
The insert part is done, but I am having problems retrieving the data. I am using this code, but it throws an error:
SqlConnection con = new SqlConnection(myconnstrng);
con.Open();
SqlCommand cmd = new SqlCommand("selection", con);
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.AddWithValue("@id", parameter);
SqlDataAdapter da = new SqlDataAdapter(cmd);
DataSet dsa = new DataSet();
da.Fill(dsa);
if (dsa.Tables[0].Rows.Count > 0)
{
    MemoryStream ms = new MemoryStream((byte[])dsa.Tables[0].Rows[0]["Data"]);
    string strBase64 = Convert.ToBase64String(ms);
    ImageButton2.ImageUrl = "data:Image/png;base64," + strBase64;
}
and the error I get is:
Cannot convert from 'System.IO.MemoryStream' to 'byte[]'
I am new to programming, and I would appreciate it if someone could help me with this problem.
Thanks to everyone!
For the particular line you are stuck on: you don't need a MemoryStream at all. You can pass the byte[] value from the DataTable straight to Convert.ToBase64String.
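Applied to the code in the question (same DataSet, row, and column names as above), the two offending lines collapse to:

string strBase64 = Convert.ToBase64String((byte[])dsa.Tables[0].Rows[0]["Data"]);
ImageButton2.ImageUrl = "data:image/png;base64," + strBase64;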
But you can save yourself some bother with these tips:
ALWAYS dispose the connection, command and adapter/reader correctly, by putting them in using blocks
For a single result, you can skip the table and adapter, and just use (byte[])cmd.ExecuteScalar() (see the sketch after this list).
If you have more than one row to process (as opposed to just displaying them in a grid view), you may again find it easier to skip the DataTable and read the values directly:
using (var reader = cmd.ExecuteReader())
{
    while (reader.Read())
        // GetBytes needs a buffer and offsets; to pull the whole value, cast the field instead
        DoSomethingWithResult(reader.IsDBNull(0) ? null : (byte[])reader[0]);
}
Generally, DoSomethingWithResult should not do very heavy processing, or you will hold the connection and reader open for too long. If it does, store the results in memory and process them afterwards.
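Putting the first two tips together, here is a minimal sketch reusing the connection string, procedure name, and parameter from the question, and assuming the procedure returns the Data column as the first column of its first row:

using (var con = new SqlConnection(myconnstrng))
using (var cmd = new SqlCommand("selection", con))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@id", parameter);
    con.Open();
    // ExecuteScalar returns the first column of the first row
    // (null if there are no rows; DBNull would need an extra check)
    byte[] data = (byte[])cmd.ExecuteScalar();
    if (data != null)
        ImageButton2.ImageUrl = "data:image/png;base64," + Convert.ToBase64String(data);
}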
I have a textbox that autocompletes from values in a SQL Server database. I also created a stored procedure, which is very simple:
Stored procedure code
My code is this:
public AutoCompleteStringCollection AutoCompleteFlight(TextBox flight)
{
    using (SqlConnection connection = new SqlConnection(ConnectionLoader.ConnectionString("Threshold")))
    {
        AutoCompleteStringCollection flightCollection = new AutoCompleteStringCollection();
        connection.Open();
        SqlCommand flights = new SqlCommand("AutoComplete_Flight", connection);
        flights.CommandType = CommandType.StoredProcedure;
        SqlDataReader readFlights = flights.ExecuteReader();
        while (readFlights.Read())
        {
            flightCollection.Add(readFlights["Flight_Number"].ToString());
        }
        return flight.AutoCompleteCustomSource = flightCollection;
    }
}
Is there a point to having this stored procedure, since it's such a simple query? Or am I doing this wrong, since it still has to use the data reader and insert the results into the collection?
My previous code before the stored procedure was:
using (SqlConnection connection = new SqlConnection(ConnectionLoader.ConnectionString("Threshold")))
{
    AutoCompleteStringCollection flightCollection = new AutoCompleteStringCollection();
    connection.Open();
    SqlCommand flights = new SqlCommand("SELECT DISTINCT Flight_Number FROM Ramp_Board", connection);
    SqlDataReader readFlights = flights.ExecuteReader();
    while (readFlights.Read())
    {
        flightCollection.Add(readFlights["Flight_Number"].ToString());
    }
    return flight.AutoCompleteCustomSource = flightCollection;
}
Is the second piece of code better or are they both wrong, and there is a way better way of doing this?
"Better way" is a little undefined.
If you are looking for a performance answer about stored procedure vs. not, I'm not sure it matters much with that small a data set and such a simple query. Stored procedures shine when there are complex operations to perform that can limit the back and forth with the server, or limit the amount of data returned. In your case, the server-side effort is the same either way, and the amount of data returned is also the same. @Niel points out that procedures can be updated server side without changing your deployed code. That is another useful feature of stored procedures, though you probably will not need it for this scenario.
If you are looking for an alternate code answer, you could use a DataAdapter instead of a DataReader. There are many articles on this site about the performance of the two, and most of them agree that they are more or less the same. The only exception is if you don't plan on reading all of the rows. In your case you are reading the whole table, so they are effectively the same.
SqlCommand sqlCmd = new SqlCommand("SELECT * FROM SomeTable", connection);
SqlDataAdapter sqlDA = new SqlDataAdapter();
sqlDA.SelectCommand = sqlCmd;
DataTable table = new DataTable();
// Fill table from SQL using the command and connection
sqlDA.Fill(table);
// Fill autoComplete from table (AsEnumerable requires a reference to System.Data.DataSetExtensions)
autoComplete.AddRange(table.AsEnumerable().Select(dr => dr["ColumnName"].ToString()).ToArray());
If you decide to use this kind of LINQ statement, it is best to set the column to not allow nulls, or to add a WHERE clause that filters out nulls. I'm not sure how, or if, AutoCompleteStringCollection handles nulls.
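For example, a variant of the same projection that drops null values first ("ColumnName" is a placeholder, as above):

autoComplete.AddRange(table.AsEnumerable()
    .Where(dr => !dr.IsNull("ColumnName"))
    .Select(dr => dr["ColumnName"].ToString())
    .ToArray());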
I am attempting to run a query using a parameter in C#, but no rows are being returned. I am pulling the SQL from a file and putting it into the command text. When the query (a SELECT statement) is run, no results come back. I have confirmed that the row is in my database and that the query is correct (after replacing the parameter) by running it manually.
conn.Open();
//create the command
var command = conn.CreateCommand();
//Read sql from file
FileInfo file = new FileInfo("SQL/GetPage.sql");
string script = file.OpenText().ReadToEnd();
command.CommandText = script;
command.Parameters.AddWithValue("?PageID", PageName);
command.Prepare();
MySqlDataReader rdr = command.ExecuteReader();
rdr.Read();
SQL:
SELECT * FROM `Page` WHERE PageID = '?PageID'
I have tried both with and without the Prepare call, and I have no clue why it is not working. Also, I am only expecting one result at most (PageID is unique), which is why it isn't in a loop. I also know my connection is good, because I hardcoded the query without the WHERE clause and it worked fine.
Please let me know if anyone has any suggestions.
Thanks
Read() just advances the DataReader to the next record (this is why it is usually called in a loop). You need to extract the data from the current record yourself:
while (rdr.Read())
{
    // pull typed values out of the current record by column ordinal
    int i = rdr.GetInt32(0);
    string s = rdr.GetString(1);
}
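If you prefer column names over ordinals, the same loop can look like this (the column name here is a made-up placeholder, not from the question's schema):

while (rdr.Read())
{
    int ord = rdr.GetOrdinal("SomeColumn");
    // IsDBNull guards against a null column value before reading it as a string
    string value = rdr.IsDBNull(ord) ? null : rdr.GetString(ord);
}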
I'm currently trying to use C# to read through a SQL database. To do so, I run a SELECT statement over OleDb and fill a DataSet through a data adapter, then iterate through each row and calculate things.
First of all, I feel like there's a better/more efficient way of doing this, because I never actually write back to the database; I only calculate based on what I'm selecting.
Anyway, past a certain point I get out-of-memory errors and/or an error from Ssms.exe saying "a new guard page for the stack cannot be created."
From the other questions I've seen, I need to use DataReader but I can't seem to get it to work the same way as the data adapter (which I suppose isn't that surprising).
The code I have now:
OleDbConnection myConn = new OleDbConnection(@"SQLDB connection string here");
OleDbCommand cmd = new OleDbCommand();
cmd.CommandText = <select statement here>;
cmd.Connection = myConn;
cmd.CommandTimeout = 0;
OleDbDataAdapter da = new OleDbDataAdapter(cmd);
DataSet ds = new DataSet();
da.Fill(ds);
myConn.Close();
foreach (DataTable table in ds.Tables)
{
    foreach (DataRow dr in table.Rows)
    {
        // do stuff
    }
}
I guess my question is twofold, like I said above: one, would a DataReader solve my problem and allow me to iterate through the data; and two, how do I adapt the first code snippet above to support that?
Also, since I've seen it asked elsewhere: the application is built as x64.
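For reference, a minimal sketch of the snippet above reworked around ExecuteReader, which streams one row at a time instead of buffering everything in a DataSet (the connection string and select statement are placeholders, as in the question):

using (OleDbConnection myConn = new OleDbConnection(@"SQLDB connection string here"))
using (OleDbCommand cmd = new OleDbCommand("<select statement here>", myConn))
{
    cmd.CommandTimeout = 0;
    myConn.Open();
    using (OleDbDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // do stuff with the current row, e.g. reader.GetValue(0)
        }
    }
}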
I have been writing database software for a few years now and know that there are multiple ways to access data. Personally, I do everything manually when pulling data, using a data reader object. This has started to get on my nerves when working with tables that have a large number of columns; it becomes very inefficient to have to write 30 lines of this kind of code at a time:
if (reader[count] != DBNull.Value)
    someObject = reader.GetString(count++);
else
    count++;
It is bad enough that the queries themselves take a long time to type out.
I was thinking of using a DataTable to retrieve my records, since that only takes a few lines, and then writing a few helper methods that iterate through the rows and return an observable collection of objects. Is there another, simpler or more concise, way to go about doing this?
I just use the DataTable:
using (SqlConnection conn = new SqlConnection("<connection string here>"))
{
    conn.Open();
    DataTable dt = new DataTable();
    using (SqlDataAdapter adapter = new SqlDataAdapter("select * from mytable", conn))
    {
        adapter.Fill(dt);
    }
}
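From there, a sketch of the helper idea from the question: project each row into an object and collect the results. SomeObject and its column names are made up for illustration; AsEnumerable/Field need a reference to System.Data.DataSetExtensions, and ObservableCollection lives in System.Collections.ObjectModel.

// Hypothetical DTO for illustration
class SomeObject { public string Name { get; set; } public int Count { get; set; } }

// Field<T> maps DBNull to null for reference and nullable types,
// which replaces the manual DBNull.Value checks
var results = new ObservableCollection<SomeObject>(
    dt.AsEnumerable().Select(row => new SomeObject
    {
        Name = row.Field<string>("Name"),
        Count = row.Field<int?>("Count") ?? 0
    }));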
I am using the following code (Variant DataReader):
public DataTable dtFromDataReader(List<String> lstStrings)
{
    OleDBConn_.Open();
    using (OleDbCommand cmd = new OleDbCommand())
    {
        DataTable dt = new DataTable();
        OleDbDataReader reader = null;
        cmd.Connection = OleDBConn_;
        cmd.CommandText = "SELECT * from TableX where SUID=?";
        foreach (String aString in lstStrings)
        {
            cmd.Parameters.AddWithValue("?", aString);
            reader = cmd.ExecuteReader();
            if (reader != null)
                dt.Load(reader);
            cmd.Parameters.Clear();
        }
        return dt;
    }
}
and compare it to (Variant DataAdapter):
public DataTable dtFromDataAdapter(List<String> lstStrings)
{
    DataTable dt = new DataTable();
    foreach (string aString in lstStrings)
    {
        string sOledb_statement = String.Concat("SELECT * FROM TableX where SUID='", aString, "'");
        using (OleDbDataAdapter oleDbAdapter = new OleDbDataAdapter(sOledb_statement, OleDBConn_))
        {
            GetOleDbRows = oleDbAdapter.Fill(dt);
        }
    }
    return dt;
}
When I connect to an offline database (Microsoft Access), my reading times are (~1.5k retrieved items):
DataReader 420 ms
DataAdapter 5613 ms
When reading from an Oracle server (~30k retrieved items):
DataReader 323845 ms
DataAdapter 204153 ms
(Several tests; the times do not change much.)
Even changing the order of the commands (DataAdapter before DataReader) didn't change much (I thought there might have been some precaching).
I thought DataTable.Load should be somewhat faster than DataAdapter.Fill?
And I still believe, even though I see the results, that it should be faster. Where am I losing my time? (There are no unhandled exceptions.)
Your comparison isn't really adapter vs. DataReader with the way you have the code set up; you are really comparing the Adapter.Fill and DataTable.Load methods.
The DataReader would normally be faster on a per-record basis, because you traverse the records one at a time and can react as you read each one.
Since you are returning a DataTable in both instances, the Adapter.Fill method would probably be the optimal choice to use. It was designed to do just that.
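A minimal sketch of that recommendation applied to your second variant, reusing the question's names (including the OleDBConn_ connection field) and keeping the query parameterized rather than concatenated:

public DataTable dtFromDataAdapter(List<String> lstStrings)
{
    DataTable dt = new DataTable();
    using (OleDbCommand cmd = new OleDbCommand("SELECT * FROM TableX where SUID=?", OleDBConn_))
    using (OleDbDataAdapter oleDbAdapter = new OleDbDataAdapter(cmd))
    {
        foreach (string aString in lstStrings)
        {
            cmd.Parameters.Clear();
            cmd.Parameters.AddWithValue("?", aString);
            oleDbAdapter.Fill(dt);  // each Fill appends the matching rows to the same table
        }
    }
    return dt;
}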