Below is my code to query a 24-column database table using LINQ:
var list = (from x in db.graph_update
            where x.serial == serial_number
                  && x.plotdate >= date_from && x.plotdate <= date_to
            select new {
                x.plotdate,
                x.Temperature
            }).ToList();
Because the database contains a large amount of data, the query takes a long time to run and usually fails with an error:
500 - The request timed out.
The web server failed to respond within the specified time.
How do I keep this long-running connection alive so it doesn't time out, so I can retrieve all of the data?
Whatever the reason your query takes that long, you can profile it to find the maximum query time and set the command timeout accordingly:
Entity Framework 6:
this.context.Database.CommandTimeout = 180;
Entity Framework 5:
((IObjectContextAdapter)this.context).ObjectContext.CommandTimeout = 180;
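Applied to the question's query, a minimal sketch (EF6, assuming `db` is the context from the question) sets the timeout just before materializing the results:
// Raise the command timeout (in seconds) before the query runs.
db.Database.CommandTimeout = 180;
var list = (from x in db.graph_update
            where x.serial == serial_number
                  && x.plotdate >= date_from && x.plotdate <= date_to
            select new { x.plotdate, x.Temperature }).ToList();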
Environment:
EntityFramework 6.2.0;
MS SQL Server 2012;
In MS Access, I use the following query (req_GroupsStud_CurGroup):
SELECT req_GroupsStud_Stud.*, req_GroupsStud_Stud.id_group
FROM req_GroupsStud_Stud
WHERE (((req_GroupsStud_Stud.id_group)=[Forms]![frm_00_00_MainForm]![id_group_Frm]));
From the form, the parameter [id_group_Frm] is passed to the query using the expression [Forms]![frm_00_00_MainForm]![id_group_Frm].
Questions:
How do I pass the parameter id_group_Frm to the query req_GroupsStud_CurGroup in MS SQL Server?
Or are there other tools in MS SQL Server and Entity Framework to get a similar data set?
Update_1:
Requirements for the query req_GroupsStud_CurGroup:
1. if changes are made to a record in the query results, these changes should be reflected in the original source;
2. if a record is added through the query, that record appears in the source;
3. the query is planned to be used inside another query.
I managed to implement these requirements in MS Access.
I do not understand how to do the same using Entity Framework and MS SQL Server.
I tried the following methods:
- method_1
cntDB.req_GroupsStud_Stud.Load();
System.Data.SqlClient.SqlParameter id_group_Param =
    new System.Data.SqlClient.SqlParameter("@id_group_frm", id_group);
var sp_Get_Stud_var = cntDB.Database
    .SqlQuery<req_GroupsStud_Stud>("sp_Get_Stud @id_group_frm", id_group_Param)
    .ToList();
bs_Grid_2.DataSource = sp_Get_Stud_var;
dataGridView2.DataSource = bs_Grid_2;
- method_2
cntDB.sp_Get_Stud(id_group);
I cannot fully understand method_2.
I think these methods will not meet the requirements above.
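For the three requirements, a plain LINQ query over the DbSet may be closer to the Access behavior than a stored procedure, because the returned entities are change-tracked. A minimal sketch, assuming the `cntDB` context and the grid bindings from method_1:
// requires using System.Data.Entity; and using System.Linq;
cntDB.req_GroupsStud_Stud
     .Where(s => s.id_group == id_group)
     .Load();                                // load only the filtered rows into the context
bs_Grid_2.DataSource = cntDB.req_GroupsStud_Stud.Local.ToBindingList();
dataGridView2.DataSource = bs_Grid_2;
// ... after the user edits or adds rows in the grid:
cntDB.SaveChanges();                         // requirements 1 and 2: changes flow back to the source
For requirement 3, the same Where clause stays an IQueryable, so it can be composed into other LINQ queries before execution.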
We recently upgraded from SQL Server 2014 to SQL Server 2016, and since then a repository has been throwing an EntityCommandExecutionException with the InnerException {"Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."} roughly 15 seconds after executing the repository method.
I can manually increase the timeout length in the model to work around this, but it seems rather odd, since this repository was working fine prior to the SQL Server upgrade. The underlying database/model, the size of the data we're retrieving, and the server settings are also unchanged. Is there something about using EF 6 and .NET 4.6.1 with SQL Server 2016 that I'm not taking into account?
Below is the code with a hardcoded connection string to see if there's anything in there that needs to be modified for SQL Server 2016 specifically.
Model Constructor (w/ timeout amount increased as a workaround)
public OperationsModel(string connectionString, bool lazyLoading = false)
    : base(connectionString)
{
    Database.CommandTimeout = 500;
    Configuration.LazyLoadingEnabled = lazyLoading;
    Configuration.ProxyCreationEnabled = lazyLoading;
}
Method
public void Example()
{
    // X'ed out the server and database, but those are correct.
    // Note: providerName belongs in the config file, not in the connection string itself.
    using (var uow = new ModelManager(new OperationsModel(
        "data source=XXXXX;initial catalog=XXXXX;integrated security=True;MultipleActiveResultSets=True;App=EntityFramework")))
    using (var repo = new MyRepository(uow))
    {
        var test = repo.GetAll().ToList(); // Exception gets thrown here
    }
}
I can't make SqlConnection.RetrieveStatistics work; it always returns a hashtable of 18 elements that are all zeros.
What I want to do: in my test environment, I want to add a header to each HTTP response with the number of SQL commands that were executed during the request processing.
My app is only (REST) web services, and I have one SQL connection per request (one NHibernate session per request).
After the request is processed, I do:
var connection = statelessSession.NHSession.Connection as ReliableSqlDbConnection;
if (connection != null)
    queries += Convert.ToInt32(connection.ReliableConnection.Current.RetrieveStatistics()["Prepares"]);
The number is always 0, and when I break on this line, I see that all the values are zeros.
After the session is opened, I do:
var ret = module.OpenSession();
var connection = ret.Connection as ReliableSqlDbConnection;
if (connection != null)
{
    connection.ReliableConnection.Current.StatisticsEnabled = true;
    ret.CreateSQLQuery("SET STATISTICS IO ON; SET STATISTICS TIME ON;").ExecuteUpdate();
    connection.ReliableConnection.Current.ResetStatistics();
}
return ret;
I use https://github.com/MRCollective/NHibernate.SqlAzure because in production I'm using SQL Azure, but locally I have an instance of SQL Server 2014 Developer Edition.
For my database, I have:
Auto create statistics: true
Auto update statistics: true
Did I miss something?
The cause of this was that NHibernate opens a new connection for each request; this is why I was losing the statistics each time.
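In other words, the statistics have to be enabled on the very SqlConnection that executes the request's commands, and read back before that connection is closed or replaced. A minimal sketch against a plain SqlConnection (assuming `sqlConnection` is the underlying connection of the current request's session):
// requires using System; and using System.Data.SqlClient;
sqlConnection.StatisticsEnabled = true;            // enable before the commands run
// ... execute the request's queries on this same connection ...
var stats = sqlConnection.RetrieveStatistics();    // the 18 counters; values are Int64
var prepares = Convert.ToInt64(stats["Prepares"]);
var selects = Convert.ToInt64(stats["SelectCount"]);
sqlConnection.ResetStatistics();                   // start fresh for the next request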
When I execute my QueryRequest object, I get a totalRows of around 110,000, while the response contains only around 38,000 rows. So I am not receiving the entire result set and must do paging.
But I see that my QueryRequest object has no startIndex property.
How can I receive the entire result set?
I am using a Premium version of Google Analytics. Does Google still return 10 MB of data with each request?
UPDATE: I don't think my question is a duplicate. What I meant was: how can I get a specific page of results when my QueryRequest has no startIndex property?
JobsResource j = null;
QueryRequest qr = null;
...
qr.Query = "SELECT examplecolumns FROM myTable";
QueryResponse response = j.Query(qr, projectId).Execute();
Call the getQueryResults() method to fetch the rest of the results.
https://cloud.google.com/bigquery/docs/reference/v2/jobs/getQueryResults
You cannot return the entire result set in a single response because Google has general quota limits for all of their APIs:
General Quota Limits (All APIs)
The following quota limits are shared between the Management API, Core Reporting API, MCF Reporting API, Metadata API, and Real Time Reporting API.
50,000 requests per project per day – can be increased
10 queries per second (QPS) per IP.
In the Developers Console this quota is referred to as the per-user limit. By default, it is set to 1 query per second (QPS) and can be adjusted to a maximum value of 10. If the per-user limit is set to a value larger than 10 QPS, the Google Analytics quota policy will still take effect and limit per-user requests to 10 QPS.
If your application makes all API requests from a single IP address (i.e. on behalf of your users) you should consider using the userIP or quotaUser parameters with each request to get full QPS quota for each user. See the query parameters summary for details.
For more information, have a look at this link: Configuration and Reporting API Limits and Quotas.
You can also find more information on the subject here: Querying Data.
The following additional limits apply for querying data.
Maximum tables per query: 1,000
Maximum query length: 256 KB
The query / getQueryResults methods push some of the waiting for job completion into the server. Clients may be notified of job completion sooner when using this mechanism, and will receive the first page of query results in that response, avoiding one additional round trip to fetch the data.
The general mechanism for using these APIs, in pseudocode, is:
response = query(...)
while (!response.jobComplete) {
    response = getQueryResults(response.jobReference);
}
moreData = false
do {
    // consume response.rows
    moreData = response.pageToken != null
    if (moreData) {
        response = getQueryResults(response.jobReference, response.pageToken)
    }
} while (moreData)
Note that some type-safe languages will make this more difficult to code, as the first response in that do loop may be either a QueryResponse or a GetQueryResultsResponse, depending on whether the query job finished within the initial timeout on the query() call or within the while (!response.jobComplete) polling loop.
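A hedged C# translation of that pseudocode using the Google.Apis.Bigquery.v2 client, copying each response into locals to sidestep the two response types (assumptions: `service` is an authenticated BigqueryService, and `qr`/`projectId` are set up as in the question):
// requires using Google.Apis.Bigquery.v2; and using Google.Apis.Bigquery.v2.Data;
var initial = service.Jobs.Query(qr, projectId).Execute();
var jobRef = initial.JobReference;

var rows = initial.Rows;
var pageToken = initial.PageToken;
var complete = initial.JobComplete == true;

// Poll until the job completes; the completed response carries the first page.
while (!complete)
{
    var poll = service.Jobs.GetQueryResults(jobRef.ProjectId, jobRef.JobId).Execute();
    complete = poll.JobComplete == true;
    rows = poll.Rows;
    pageToken = poll.PageToken;
}

// Page through the remaining results with the page token.
while (true)
{
    if (rows != null)
        foreach (var row in rows)
        {
            // consume row.F (the list of cell values for this row)
        }
    if (pageToken == null)
        break;
    var next = service.Jobs.GetQueryResults(jobRef.ProjectId, jobRef.JobId);
    next.PageToken = pageToken;
    var nextPage = next.Execute();
    rows = nextPage.Rows;
    pageToken = nextPage.PageToken;
}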
I have an ASP.NET web page which interacts with a SQL Server database, grabs some data, and then returns an XML response (which I feed into FreeSWITCH using xml_curl).
Because FreeSWITCH (FS from now on) does not store cookies, each request creates a new session.
When the number of requests gets too high (about 97 to 100), the SqlConnection.Open() method times out against the SQL Server instance, which then results in HTTP Error 500.
To test my assumption, I created a small script using PHP and cURL which makes repeated requests to my ASP.NET page. If I store cookies (and thus reuse sessions) in the PHP script, I can make 10,000 successful requests in about 314 seconds.
But without sessions, I get stuck at about 97~100 requests, and then I get HTTP Error 500.
Is there any way to overcome this problem?
==Edit==
Here is how I interact with the database:
String connectionString = WebConfigurationManager.ConnectionStrings["SqlServerConnection"].ConnectionString;
SqlConnection connection = new SqlConnection(connectionString);
SqlCommand command = connection.CreateCommand();
command.CommandType = CommandType.Text;
command.CommandText = "Select * from dbo.Template where Name = '" + name + "'";
Template template = new Template();
connection.Open();
SqlDataReader reader = command.ExecuteReader();
if (reader.HasRows)
{
    reader.Read();
    template.Name = reader["Name"].ToString();
    template.XMLContent = reader["XMLContent"].ToString();
}
else
{
    template.Name = "";
    template.XMLContent = "";
}
reader.Close();
connection.Close();
return template;
And the Template table has these fields:
ID => int, identity, primary key
Name => nvarchar(255), unique
XMLContent => ntext
It appears you are using a connection pool. By default these pools allow a maximum of 100 connections to your SQL Server and queue any additional requests. The queue has a timeout (15 seconds by default) which can be extended if you want requests to wait in the queue longer; this means requests might be backing up on your server. You can also increase the pool's maximum size if your SQL Server can handle it.
You can increase these settings by adding parameters to your connection string, for example:
Connect Timeout=60
Max Pool Size=150
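A full connection string with those parameters might look like this (server and database names are placeholders):
Data Source=XXXXX;Initial Catalog=XXXXX;Integrated Security=True;Connect Timeout=60;Max Pool Size=150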
Some steps to improve this code:
- If you do not need session state, disable it for this page so no cookie is created.
- Cache the result by name: if a request comes in for the same name, serve it from the cache instead of opening the database.
- Use a static variable so the connection string is read only once.
- Wrap the connection in a try/catch or using block, to be sure you close the connection in case of failure (see the sketch after this list).
- Consider a mutex/lock to avoid too many requests running together.
- Use parameters in your SQL call.
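A minimal sketch of the using and parameter suggestions applied to the question's method (same names as the question):
// requires using System.Data; using System.Data.SqlClient; and using System.Web.Configuration;
Template template = new Template { Name = "", XMLContent = "" };
string connectionString = WebConfigurationManager.ConnectionStrings["SqlServerConnection"].ConnectionString;
using (var connection = new SqlConnection(connectionString))
using (var command = connection.CreateCommand())
{
    command.CommandType = CommandType.Text;
    // Parameterized query: no string concatenation, no SQL injection.
    command.CommandText = "SELECT Name, XMLContent FROM dbo.Template WHERE Name = @name";
    command.Parameters.AddWithValue("@name", name);
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        if (reader.Read())
        {
            template.Name = reader["Name"].ToString();
            template.XMLContent = reader["XMLContent"].ToString();
        }
    }
} // the using blocks return the connection to the pool even on failure
return template;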
In addition to @Aristos' suggestions:
Use async pages!
Example and "benchmark":
Some time ago I asked nearly the same question here on SO.
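A hedged sketch of the async-pages idea for WebForms (assumptions: the @ Page directive sets Async="true", and the page class name is hypothetical). Awaiting the database call frees the request thread while SQL Server works, so queued requests are less likely to hit the pool timeout:
// requires using System; System.Data.SqlClient; System.Threading.Tasks;
// System.Web.Configuration; and System.Web.UI;
public partial class TemplatePage : Page
{
    // Read the connection string only once, as suggested above.
    private static readonly string ConnectionString =
        WebConfigurationManager.ConnectionStrings["SqlServerConnection"].ConnectionString;

    protected void Page_Load(object sender, EventArgs e)
    {
        RegisterAsyncTask(new PageAsyncTask(LoadTemplateAsync));
    }

    private async Task LoadTemplateAsync()
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "SELECT Name, XMLContent FROM dbo.Template WHERE Name = @name", connection))
        {
            command.Parameters.AddWithValue("@name", Request.QueryString["name"]);
            await connection.OpenAsync();
            using (var reader = await command.ExecuteReaderAsync())
            {
                // ... read the row and write the XML response ...
            }
        }
    }
}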