I have made a chart and added the points at runtime. The data is plotted like this:
Chart1.ChartAreas(0).AxisX.IntervalType = DateTimeIntervalType.Weeks
Chart1.Series("PH").Points.AddXY(DateTime.Parse(dr.Item("readtime")).ToString("MM-dd"), dr.Item("ph"))
The plotted data holds 7 days of data.
As shown below, the dates are repeated on the x axis.
How can I get the x axis to show only 11-08, 11-13, and 11-14 in this case?
Here is some sample data:
When you get the data from the database, use a query like this:
SELECT SUM(ph),SUM(tmp),SUM(orp),SUM(sal),SUM(ec),SUM(tds),SUM(do),readtime
FROM <your tables>
GROUP BY readtime
The result set returned from this query is what you are looking for.
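For the plotting side, here is a rough C# sketch (the VB.NET version is a direct translation). It assumes you alias the sums back to their column names (e.g. SUM(ph) AS ph) and load the result into a DataTable; the key point is to pass real DateTime x values instead of pre-formatted "MM-dd" strings, so the axis spaces and labels one point per date.
// Requires: using System; using System.Data;
//           using System.Windows.Forms.DataVisualization.Charting;
void PlotGrouped(Chart chart, DataTable grouped)
{
    Series series = chart.Series["PH"];
    series.Points.Clear();
    series.XValueType = ChartValueType.Date;

    foreach (DataRow row in grouped.Rows)
    {
        series.Points.AddXY(
            Convert.ToDateTime(row["readtime"]),   // DateTime, not a "MM-dd" string
            Convert.ToDouble(row["ph"]));          // aggregated reading
    }

    chart.ChartAreas[0].AxisX.IntervalType = DateTimeIntervalType.Days;
    chart.ChartAreas[0].AxisX.LabelStyle.Format = "MM-dd";  // labels like 11-08, 11-13
}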
The data that you show is obviously invalid.
You have several zero readings occurring at exactly the same time - midnight.
I suspect these are not even real readings.
You need to exclude them (and you should probably find out why this is happening - maybe a parsing error?)
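If you can't fix it at the source, one option is to skip those rows before plotting. A rough C# sketch follows; "readings" stands in for however you read the rows back, and the all-zero-at-midnight rule is just my assumption from your sample.
// Requires: using System; using System.Data;
foreach (DataRow row in readings.Rows)
{
    DateTime readTime = Convert.ToDateTime(row["readtime"]);
    double ph = Convert.ToDouble(row["ph"]);

    // Assumed rule: a 0 reading stamped exactly at midnight is not a real reading.
    if (ph == 0 && readTime.TimeOfDay == TimeSpan.Zero)
        continue;

    Chart1.Series["PH"].Points.AddXY(readTime, ph);
}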
This is my first post on here. I'm attempting to create a 'simple' charting program (Windows Forms based, in C#). I say simple because I'm currently only playing around with 2 series maximum and a few transformations (percent changes, actual changes, moving averages and moving sums). It will get more complicated than this, but having the limited functionality first might help me get a better handle on how all of this works in C#.
I've been searching on and off for a couple of days now but have had no luck so far with my specific situation. I'll try to be as detailed as possible. The user retrieves the time series data from a SQL Server. This part of the program is behaving as expected. I'm creating 2 queries (one for each series) to retrieve the data separately. I do the transformations in the SQL query. Each comes via a SQL data adapter and is then placed into a data table. The series may be of different frequencies and the dates may not overlap (i.e. sometimes stock prices will be daily, while exports will be monthly or GDP quarterly). The exports number may come in on the first of every month, while stock prices may be missing a value on this date if it was a weekend. I suspect this part is important for my issue.
Nonetheless, I double checked this step and everything works as expected (the values are all coming in correctly at the right dates).
To add those series to a chart, I merge the two data tables like so (I believe this is where my issues are coming from):
myTable = myTables1.Copy();
myTables1.PrimaryKey = new DataColumn[] { myTables1.Columns["dates"] };
myTables2.PrimaryKey = new DataColumn[] { myTables2.Columns["dates"] };
myTable.Merge(myTables2, true);
The DataGridView of the merged data table (shown here) looks good to me.
Then I create the chart. I've tried two different methods here. The first is to create each series in a 2-step loop (one pass per series), looping through each row in the table and adding an x and y value for each series (a simplified sketch of that loop version follows the code below). I've also tried setting the chart's data source to the table, creating the 2 series, and setting the X and Y value members to the names of the columns in the table.
Like so:
mychart.DataSource = myTable;
Then in a loop for each series:
mychart.Series.Add(seriesname);
mychart.Series[seriesname].XValueMember = "dates";
mychart.Series[seriesname].YValueMembers = seriesname;
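For reference, the loop version (the first method) looks roughly like this, simplified and with placeholder series names:
// Simplified sketch of the first method: loop over the merged table and
// add an x/y point per row for each series.
foreach (string seriesname in new[] { "series1", "series2" })
{
    mychart.Series.Add(seriesname);
    mychart.Series[seriesname].XValueType = ChartValueType.Date;

    foreach (DataRow row in myTable.Rows)
    {
        // Rows where this series has no value (a DBNull from the merge) are the
        // dates where the trouble shows up.
        if (row[seriesname] == DBNull.Value)
            continue;

        mychart.Series[seriesname].Points.AddXY(
            (DateTime)row["dates"],
            Convert.ToDouble(row[seriesname]));
    }
}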
Regardless, my second series is always a bit off. Either there are straight lines going across, or it misses some values (and by adding a tooltip I can tell that the problems occur at the dates where one series has a value while the other does not).
I'm not looking for help with syntax (just ideas). So my question is: is there a standard or preferred way of getting and plotting series with different frequencies (or which may have different x values)? Alternatively, is there a good source/documentation where I can read up on this? I would add that I have a similar program in Visual Basic which uses one SQL query regardless of how many series there are. That works in terms of the chart looking as I'd expect it to, but it makes transformations much more complicated given the randomness of the null or empty values in the final table.
I'm rather new to Parse and Cloud Code, and I'm having trouble writing a certain query script.
I have a table of Salespeople, who have two integers: dailySold and dailyQuota.
The dailySold is reset to 0 each day, and the dailyQuota is defined by upper management.
Now, I'd like to make queries that pull out batches of users - say, all users whose dailySold is below their dailyQuota. In MySQL it would just look like this:
select * from salespeople where dailySold < dailyQuota
But in Parse / CloudCode I have been unable to find something like this. Currently, I'm loading all the entries, and going through them one by one, populating a large array clientside. This feels like the absolutely wrong way of doing it.
And the query.WhereNotEqualTo() function (and its siblings) seem only to be able to compare against static values.
Does anyone know how to put together a query to optimize this? I need it to go through thousands of records, and it's often only 10-20 results I'm interested in. If nothing else, I'll have to make a Cloud Code function that iterates for me server-side, but I still feel like there is some function I should be able to use to make a leaner query.
You can't compare two columns in a query. You can only compare a key with a provided object. If the dailyQuota is set by upper management, I'm assuming this is the same for all salespeople, or for groups of people. I'd suggest first making a query for the daily quota and then either use
whereKey:matchesKey:inQuery
or just fetch the dailyQuota first and then use that value in the second query.
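For example, if the quota really is one shared value, the fetch-then-query route might look roughly like this with the Parse .NET SDK (a sketch only; the "QuotaSettings" class and field names are my assumptions):
// Requires: using System.Collections.Generic; using System.Threading.Tasks; using Parse;
public static async Task<IEnumerable<ParseObject>> GetUnderQuotaAsync()
{
    // 1. Fetch the shared quota (assumed to live on a single "QuotaSettings" object).
    ParseObject settings = await ParseObject.GetQuery("QuotaSettings").FirstAsync();
    int dailyQuota = settings.Get<int>("dailyQuota");

    // 2. Query for salespeople whose dailySold is below that value.
    return await ParseObject.GetQuery("Salespeople")
        .WhereLessThan("dailySold", dailyQuota)
        .FindAsync();
}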
I am trying to figure out the best way to match items in a DataGridView to items in an Access database. (Think Quicken transaction matching.)
I import an Excel sheet into a DataGridView; from there it checks the Access DB looking for a match. If a match is found, then "match" is reported in a column; if not, "unmatched" is reported.
I have tried counting the rows from a SQL query (if the count = 1 then the match is yes), but for some reason that goofs up sometimes.
So I am looking for the best way to do this.
Thanks - please let me know if you need any additional info.
There isn't a simple answer to this, and it depends on what your data looks like, and what you consider a "match" to be. As a very basic answer, this is one way to attack the problem. How far you take it is up to you...
Create an algorithm that takes all fields for a row and generates a "key" for it. For example, if there are two fields [First] and [Last], then perhaps the key would be "Bubba|Gump".
Apply that algorithm to both sets of data (the datagrid records and the access db records).
Compare the two sets of keys to determine what's identical/missing/added.
It's not foolproof but with some additional sophistication it'll take you surprisingly far.
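A minimal C# sketch of that idea; the [First]/[Last] fields and the "Status" column are placeholders for your own schema:
using System.Collections.Generic;
using System.Data;
using System.Linq;
using System.Windows.Forms;

static class Matcher
{
    // Build a normalized composite key from a row's fields, e.g. "BUBBA|GUMP".
    static string MakeKey(params object[] fields) =>
        string.Join("|", fields.Select(f => (f ?? "").ToString().Trim().ToUpperInvariant()));

    public static void FlagMatches(DataGridView grid, DataTable accessRows)
    {
        // Keys present in the Access data.
        var accessKeys = new HashSet<string>(
            accessRows.Rows.Cast<DataRow>()
                      .Select(r => MakeKey(r["First"], r["Last"])));

        // Mark each grid row as matched or unmatched.
        foreach (DataGridViewRow row in grid.Rows)
        {
            if (row.IsNewRow) continue;
            string key = MakeKey(row.Cells["First"].Value, row.Cells["Last"].Value);
            row.Cells["Status"].Value = accessKeys.Contains(key) ? "match" : "unmatched";
        }
    }
}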
I have an operation (that I can't change) that starts threads which call our Oracle database to see whether certain hotels have availability on a certain date.
If a date/hotel combination has availability, that thread returns information about the date/hotel in the form of a DataTable that is merged into a Main DataTable of results. Yes, I know ... I inherited this.
So I am trying to re-write this operation. I still must query Oracle in threads to get the availability information, but I want to display the data as it is returned (in chunks of 5, 10? I'm flexible), instead of having the user sit in front of the screen for up to 4 minutes before a complete result is spat out into a GridView.
How do I do this directly from an .aspx page so I can make a web service call and populate a grid (JqGrid?) with the results?
If I haven't provided enough information or described what I am trying to achieve, please let me know and I will elaborate.
Oracle provides a field on each row called "rowid"
(http://www.adp-gmbh.ch/ora/concepts/rowid.html)
The first time you send the query, send in an int (x) defining the highest row number you want. Have the service return the total number of rows and the first x rows.
Then, the second time you send the query, get the next x rows; rinse and repeat.
Basically, you need to send an ajax query for rows x through y each time until you have them all loaded.
I would recommend paging as well, since users typically don't want to see hundreds of results at a time.
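A rough sketch of what that endpoint could look like (ODP.NET and an ASP.NET page method shown here; the class, columns, and connection string name are all placeholders):
// Requires: using System.Collections.Generic; using System.Configuration;
//           using Oracle.DataAccess.Client; using System.Web.Services;
[WebMethod]
public static List<Dictionary<string, object>> GetAvailabilityPage(int minRow, int maxRow)
{
    // Classic Oracle ROWNUM wrapper: return only rows (minRow, maxRow].
    const string sql = @"
        SELECT * FROM (
            SELECT a.*, ROWNUM rn FROM (
                SELECT hotel_id, stay_date, rooms_left    -- your real availability query
                FROM   availability
                ORDER  BY hotel_id, stay_date
            ) a
            WHERE ROWNUM <= :maxRow
        )
        WHERE rn > :minRow";

    var rows = new List<Dictionary<string, object>>();
    using (var conn = new OracleConnection(
               ConfigurationManager.ConnectionStrings["Oracle"].ConnectionString))
    using (var cmd = new OracleCommand(sql, conn))
    {
        cmd.Parameters.Add(":maxRow", maxRow);   // ODP.NET binds by position by default
        cmd.Parameters.Add(":minRow", minRow);
        conn.Open();
        using (OracleDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                var row = new Dictionary<string, object>();
                for (int i = 0; i < reader.FieldCount; i++)
                    row[reader.GetName(i)] = reader.GetValue(i);
                rows.Add(row);
            }
        }
    }
    return rows;   // the ajax side appends these to the grid and asks for the next page
}
Each ajax call bumps minRow/maxRow until the grid has everything, or until the user stops paging.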
I use the following columns stored in a SQL table called tb_player:
Date of Birth (Date), Times Played (Integer), Versions (Integer)
to calculate a "playvalue" (integer) in the following formula:
playvalue = (Today - Date of Birth) * Times Played * Versions
I display up to 100 of these records with the associated playvalue on a webpage at any time.
My question is: what is the most efficient way of calculating this playvalue, given that it will change only once a day due to (today - date of birth) changing? The other values (times played and versions) remain the same.
Is there a better way than calculating this on the fly each time for the 100 records? If so, is it more efficient to do the calculation in a stored proc or in VB.NET/C#?
In a property/method on the object, in C#/VB.NET (your .NET code).
The time to execute a simple property like this is nothing compared to the time of an out-of-process call to a database (to fetch the rows in the first place) or the transport time of a web page; you'll never notice it if it's just used for UI display. Plus it runs on your easily-scaled-out hardware (the app server), doesn't involve a huge daily update, is only executed for rows that are actually displayed, and only if you actually query this property/method.
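A minimal sketch of that, with property names assumed from the columns you listed and the age difference measured in days:
// Requires: using System;
public class Player
{
    public DateTime DateOfBirth { get; set; }
    public int TimesPlayed { get; set; }
    public int Versions { get; set; }

    // playvalue = (Today - Date of Birth) * Times Played * Versions
    public int PlayValue
    {
        get { return (int)(DateTime.Today - DateOfBirth).TotalDays * TimesPlayed * Versions; }
    }
}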
Are you finding that this is actually causing a performance problem? I don't imagine it would be very bad, since the calculation is pretty straightforward math.
However, if you are actually concerned about it, my approach would be to basically set up a "playvalue cache" column in the tb_player table. This column will store the calculated "playvalue" for each player for the current day. Set up a cronjob or scheduled task to run at midnight every day and update this column with the new day's value.
Then everything else can simply select this column instead of doing the calculation, and you only have to do the calculation once a day.
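A sketch of what the nightly job could run (T-SQL shown; the cached playvalue column and the bracketed column names are my assumptions):
// Requires: using System.Configuration; using System.Data.SqlClient;
// One set-based UPDATE refreshes the cached column for every player.
static void Main()
{
    const string sql = @"
        UPDATE tb_player
        SET    playvalue = DATEDIFF(DAY, [Date of Birth], GETDATE())
                           * [Times Played] * [Versions];";

    using (var conn = new SqlConnection(
               ConfigurationManager.ConnectionStrings["Players"].ConnectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery();   // run once a day from the scheduled task / cron job
    }
}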