I'm trying to cache my user list, so that when 200 users are online there aren't 200 database queries every 10 seconds.
I have this code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace Business
{
    public class UserList
    {
        private static object locker = new object();

        public static List<DAL.OnlineList> userList;
        public static DateTime date;
    }
}
-
public static string GetOnlineList(HttpContext con)
{
    List<DAL.OnlineList> onlineList;
    // refresh if the cache has never been filled or is older than 30 seconds
    // (note: the locker field above is never used, so two requests can refresh at the same time)
    if (Business.UserList.date == DateTime.MinValue || Business.UserList.date < DateTime.Now.AddSeconds(-30))
    {
        Business.UserList.date = DateTime.Now;
        onlineList = DAL.UserDAL.GetAllOnlineUser().OrderBy(x => x.Username).ToList();
        Business.UserList.userList = onlineList;
    }
    else
    {
        onlineList = Business.UserList.userList;
    }
    // Before:
    // List<DAL.OnlineList> onlineList = DAL.UserDAL.GetAllOnlineUser().OrderBy(x => x.Username).ToList();
    // (building and returning the string response from onlineList is omitted in the original post)
}
The method GetOnlineList is called every 10 seconds from a WebMethod / PageMethod JavaScript call.
So before it was: 200 users, one call every 10 seconds each = 200 x 6 = 1200 db queries per minute.
So is my code right now? The first user will load the list from the database and store it, and it will be refreshed every 30 seconds, correct?
I think that the condition in your code snippet needs an adjustment
if (Business.UserList.date == DateTime.MinValue ||
Business.UserList.date > DateTime.Now.AddSeconds(-30))
You can always use the built-in caching mechanism that ASP.NET provides. You can read about it here.
Basically, you have two options: caching objects with sliding expiration or with absolute expiration.
With sliding expiration, an object remains in the cache as long as you keep retrieving it within the expiration timespan you set. For example, if you set a timespan of 2 minutes and you retrieve the object every minute, it will remain in the cache forever.
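For illustration, here is a minimal sliding-expiration sketch (the cache key and the two-minute window are only examples; HttpRuntime.Cache is the same cache as HttpContext.Current.Cache but also works outside a request):
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Caching;

// Each read resets the two-minute countdown; the item is evicted only
// after two minutes pass with no reads.
var users = HttpRuntime.Cache["dal_users_sliding"] as List<DAL.OnlineList>;
if (users == null)
{
    users = DAL.UserDAL.GetAllOnlineUser().OrderBy(x => x.Username).ToList();
    HttpRuntime.Cache.Insert("dal_users_sliding", users, null,
        Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(2));
}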
With absolute expiration, an object stays in the cache based on the timespan regardless of how many times it has been retrieved.
In your example, you have absolute-expiration logic. Here is an example of how to use it:
public List<DAL.OnlineList> Users
{
    get
    {
        const string cacheKey = "dal_users";
        // the Cache indexer returns object, so cast back to the list type
        var users = HttpContext.Current.Cache[cacheKey] as List<DAL.OnlineList>;
        if (users == null)
        {
            users = DAL.UserDAL.GetAllOnlineUser()
                               .OrderBy(x => x.Username).ToList();
            // absolute expiration: evicted 30 seconds after being added,
            // no matter how often it is read in between
            HttpContext.Current.Cache.Add(cacheKey, users, null,
                DateTime.Now.AddSeconds(30),
                System.Web.Caching.Cache.NoSlidingExpiration,
                System.Web.Caching.CacheItemPriority.Default, null);
        }
        return users;
    }
}
I have an Azure Function that is time triggered. The Azure Function starts at every occasion when it is somewhere 00:00 am (local time). What I would like to achieve is to find the time zone strings (e.g. Europe/London) for the time zones where it is currently 00:00 am when the Azure Function is running.
I.e., I provide a UTC value and it gives me all time zone IDs where it is currently 00:00 am local time.
How can I achieve that using NodaTime?
A slightly simpler version than yours, if you always want to check for midnight:
static List<string> GetTimeZonesAtMidnight(Instant instant) =>
// Extension method in NodaTime.Extensions.DateTimeZoneProviderExtensions
DateTimeZoneProviders.Tzdb.GetAllZones()
.Where(zone => instant.InZone(zone).TimeOfDay == LocalTime.Midnight)
.Select(zone => zone.Id)
.ToList();
If you need to check for non-midnight values, pass in a LocalTime:
static List<string> GetTimeZonesAtMidnight(Instant instant, LocalTime timeOfDay) =>
// Extension method in NodaTime.Extensions.DateTimeZoneProviderExtensions
DateTimeZoneProviders.Tzdb.GetAllZones()
.Where(zone => instant.InZone(zone).TimeOfDay == timeOfDay)
.Select(zone => zone.Id)
.ToList();
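A possible call site (assuming NodaTime 2.x, where the clock API is SystemClock.Instance.GetCurrentInstant(); on 1.x, Instant.FromDateTimeUtc(DateTime.UtcNow) as in the prototype below works too):
var now = SystemClock.Instance.GetCurrentInstant();
foreach (var id in GetTimeZonesAtMidnight(now))
{
    Console.WriteLine(id);
}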
My first approach (prototype) looks as follows:
using System;
using System.Collections.Generic;
using NodaTime;

namespace TimeZones
{
    class Program
    {
        static void Main(string[] args)
        {
            Instant utcDateTime = Instant.FromDateTimeUtc(DateTime.UtcNow);
            Console.WriteLine(utcDateTime);
            List<string> zoneIds = GetTimeZonesWithCondition(utcDateTime, 0, 0);
            Console.ReadLine();
        }

        static List<string> GetTimeZonesWithCondition(Instant utcDateTime, int hourComparison, int minuteComparison)
        {
            List<string> zoneIdsCheck = new List<string>();
            IDateTimeZoneProvider timeZoneProvider = DateTimeZoneProviders.Tzdb;
            foreach (var id in timeZoneProvider.Ids)
            {
                var zone = timeZoneProvider[id];
                var zoneDateTime = utcDateTime.InZone(zone);
                int hourZone = zoneDateTime.Hour;
                int minuteZone = zoneDateTime.Minute;
                if (hourZone == hourComparison && minuteZone == minuteComparison)
                {
                    zoneIdsCheck.Add(zone.ToString());
                    Console.WriteLine($"{zone} / {zoneDateTime}");
                }
            }
            return zoneIdsCheck;
        }
    }
}
If someone has a better solution please let me know.
I'm trying to build a standalone application that creates a custom report for Encompass360 without needing to put certain fields into the reporting database.
So far I have only found one way to do it, but it is extremely slow (much slower than a normal report within Encompass when retrieving data outside of the reporting database). It takes almost 2 minutes to pull the data for 5 loans doing this:
int count = 5;
StringList fields = new StringList();
fields.Add("Fields.317");
fields.Add("Fields.3238");
fields.Add("Fields.313");
fields.Add("Fields.319");
fields.Add("Fields.2");

// lstLoans.Items contains the string location of the loans (i.e. "My Pipeline\Dave#6")
foreach (LoanIdentity loanID in lstLoans.Items)
{
    string[] loanIdentifier = loanID.ToString().Split('\\');
    Loan loan = Globals.Session.Loans.Folders[loanIdentifier[0]].OpenLoan(loanIdentifier[1]);
    bool fundingPlus = true; // if milestone == funding || shipping || suspended || completion;
    if (!fundingPlus)
        continue;

    bool oneIsChecked = false;
    LogMilestoneEvents msEvents = loan.Log.MilestoneEvents;
    MilestoneEvent ms = null; // better way to do this probably
    if (checkBox4.Checked)
    {
        ms = msEvents.GetEventForMilestone("Completion");
        if (ms.Completed)
            oneIsChecked = true;
    }
    else if (checkBox3.Checked)
    {
        ms = msEvents.GetEventForMilestone("Suspended");
        if (ms.Completed)
            oneIsChecked = true;
    }
    else if (checkBox2.Checked)
    {
        ms = msEvents.GetEventForMilestone("Shipping");
        if (ms.Completed)
            oneIsChecked = true;
    }
    else if (checkBox1.Checked)
    {
        ms = msEvents.GetEventForMilestone("Funding");
        if (ms.Completed)
            oneIsChecked = true;
    }
    if (!oneIsChecked)
        continue;

    string LO = loan.Fields["317"].FormattedValue;
    string LOid = loan.Fields["3238"].FormattedValue;
    string city = loan.Fields["313"].FormattedValue;
    string address = loan.Fields["319"].FormattedValue;
    string loanAmount = loan.Fields["2"].FormattedValue;
    if (loanAmount == "")
    {
        Console.WriteLine(LO);
        continue;
    }
    int numLoans = 1;
    addLoanFieldToListView(LO, numLoans, city, address, loanAmount);
    if (--count == 0)
        break;
}
I haven't been able to figure out how to use any of the pipeline methods to retrieve data outside the reporting database, but when all of the fields I'm looking for are in the reporting database, it takes barely a couple of seconds to retrieve the contents of hundreds of loans using these tools:
session.Reports.SelectReportingFieldsForLoans(loanGUIDs, fields);
session.Loans.QueryPipeline(selectedDate, PipelineSortOrder.None);
session.Loans.OpenPipeline(PipelineSortOrder.None);
What would really help is a simple example of retrieving data outside of the reporting database with the Encompass SDK that doesn't take longer than it should.
Note: I am aware I can add the missing fields to the reporting database, so this is not the answer I am looking for.
Note #2: Encompass360 doesn't have its own tag; if somebody knows of better tags for the subject at hand, please add them.
I use the SelectFields method on Loans to retrieve loan field data that is not in the reporting database in Encompass. It is very performant compared to opening loans one by one, but the results are returned as strings, so it requires some parsing to get the values into their native types. Below is the example from the documentation for using this method.
using System;
using System.IO;
using EllieMae.Encompass.Client;
using EllieMae.Encompass.BusinessObjects;
using EllieMae.Encompass.Query;
using EllieMae.Encompass.Collections;
using EllieMae.Encompass.BusinessObjects.Loans;

class LoanReader
{
    public static void Main()
    {
        // Open the session to the remote server
        Session session = new Session();
        session.Start("myserver", "mary", "maryspwd");

        // Build the query criterion for all loans that were opened this year
        DateFieldCriterion dateCri = new DateFieldCriterion();
        dateCri.FieldName = "Loan.DateFileOpened";
        dateCri.Value = DateTime.Now;
        dateCri.Precision = DateFieldMatchPrecision.Year;

        // Perform the query to get the IDs of the loans
        LoanIdentityList ids = session.Loans.Query(dateCri);

        // Create a list of the specific fields we want to print from each loan.
        // In this case, we'll select the Loan Amount and Interest Rate.
        StringList fieldIds = new StringList();
        fieldIds.Add("2"); // Loan Amount
        fieldIds.Add("3"); // Rate

        // For each loan, select the desired fields
        foreach (LoanIdentity id in ids)
        {
            // Select the field values for the current loan
            StringList fieldValues = session.Loans.SelectFields(id.Guid, fieldIds);

            // Print out the returned values
            Console.WriteLine("Fields for loan " + id.ToString());
            Console.WriteLine("Amount: " + fieldValues[0]);
            Console.WriteLine("Rate: " + fieldValues[1]);
        }

        // End the session to gracefully disconnect from the server
        session.End();
    }
}
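Since the values come back as strings, some defensive parsing is needed to get native types. A minimal sketch (the helper name is mine, not from the SDK):
using System;
using System.Globalization;

static decimal? ParseDecimalField(string raw)
{
    // Field values may be empty or carry currency formatting for some loans,
    // so use TryParse rather than decimal.Parse
    decimal value;
    return decimal.TryParse(raw, NumberStyles.Any, CultureInfo.InvariantCulture, out value)
        ? value
        : (decimal?)null;
}

// e.g. decimal? amount = ParseDecimalField(fieldValues[0]);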
You will benefit greatly from adding these fields to the reporting DB and using an RDB query instead. Internally, Encompass has to open and parse loan files when you read fields outside the RDB, which is a slow process, whereas an RDB read is just a SELECT on those fields, which is very fast. This tool lets you quickly check which fields are in the RDB, so you can plan your query as well as any RDB updates: https://www.encompdev.com/Products/FieldExplorer
You query the RDB via Session.Loans.QueryPipeline(), very similarly to your use of the loan Query above. Here's a good example of source code (in VB): https://www.encompdev.com/Products/AlertCounterFieldPlugin
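As a rough sketch of the shape of such a pipeline query (inferred from the calls shown earlier and the documentation example above, not verified against the SDK docs):
// Query the pipeline (reporting DB) instead of opening each loan file
DateFieldCriterion dateCri = new DateFieldCriterion();
dateCri.FieldName = "Loan.DateFileOpened"; // must be a reporting-database field
dateCri.Value = DateTime.Now;
dateCri.Precision = DateFieldMatchPrecision.Year;
var pipelineResults = session.Loans.QueryPipeline(dateCri, PipelineSortOrder.None);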
I have a console application that, generally speaking, must sweep the entire database to do its work.
To do so, I am using Tasks like this:
static void Main(string[] args)
{
    var dateStart = DateTime.Now.AddDays(-35);
    var dateEnd = DateTime.Now;
    var taskList = new List<Task>();
    while (dateStart > dateEnd ? dateStart >= dateEnd : dateStart <= dateEnd)
    {
        var d = dateStart.Date;

        var dispositivesBll = new DispositivesBll();
        taskList.Add(Task.Run(() =>
        {
            dispositivesBll.Foo(d);
        }).ContinueWith(x => dispositivesBll.Dispose())
          .ContinueWith(x => GC.Collect()));

        var dispositivesBllNew = new DispositivesBll();
        taskList.Add(Task.Run(() =>
        {
            dispositivesBllNew.Boo(d);
        }).ContinueWith(x => dispositivesBllNew.Dispose())
          .ContinueWith(x => GC.Collect()));

        if (taskList.Count >= 2 * 5)
        {
            Task.WaitAll(taskList.ToArray());
            taskList.Clear();
        }
        dateStart = dateStart > dateEnd ? dateStart.AddDays(-1) : dateStart.AddDays(1);
    }
    Task.WaitAll(taskList.ToArray());
}
So basically I want to run 10 days at once, as you may have noticed at if (taskList.Count >= 2 * 5), but the problem is that my Foo and Boo methods each open multiple connections to one Oracle database.
public class DispositivesBll : IDisposable
{
    private readonly OracleDal _oracleDal = new OracleDal();

    // these fields were not declared in the original post; types are assumed here
    private List<MyObject> _listSuccess;
    private List<MyObject> _listFailure;
    private Dictionary<int, MyObject> _hash;

    public void Foo(DateTime data)
    {
        var t1 = Task.Run(() => { _listSuccess = _oracleDal.GetSuccessList(); });
        var t2 = Task.Run(() => { _listFailure = _oracleDal.GetFailureList(); });
        t1.Wait();
        t2.Wait();

        var mergeList = new List<MyObject>();
        foreach (var success in _listSuccess)
        {
            // Some logic to insert objects into mergeList
        }

        if (mergeList.Any())
            Task.Run(() => _oracleDal.MergeList(mergeList)).Wait();
    }

    public void Dispose()
    {
        if (_hash != null)
            _hash.Clear();
        _hash = null;
    }
}
And my merge method:
public void MergeList(List<MyObject> mergeList)
{
    using (var conn = new OracleConnection(Connection.ConnectionString))
    {
        if (conn.State != ConnectionState.Open)
            conn.Open();
        using (var oCommand = conn.CreateCommand())
        {
            oCommand.CommandType = CommandType.Text;
            // note: the original post wrote string.Format(#"..."), which does not
            // compile; the argument for {0} (the day string) and the :xOk binding
            // were also omitted in the post
            oCommand.CommandText = string.Format(@"
                MERGE INTO MyTable dgn
                USING (select id from another_table where field = :xPe) d
                   ON (TO_CHAR(dateHappen, 'DDMMYYYY') = {0} and id = :xId)
                 WHEN MATCHED THEN
                   UPDATE SET OK = :xOk, dateHappen = SYSDATE
                 WHEN NOT MATCHED THEN
                   INSERT (fields....)
                   VALUES (values...)");
            oCommand.BindByName = true;
            oCommand.ArrayBindCount = mergeList.Count;
            oCommand.Parameters.Add(":xId", OracleDbType.Int32,
                mergeList.Select(c => Convert.ToInt32(c.Id)).ToArray(), ParameterDirection.Input);
            oCommand.Parameters.Add(":xPe", OracleDbType.Varchar2,
                mergeList.Select(c => Convert.ToString(c.Xpe)).ToArray(), ParameterDirection.Input);
            oCommand.ExecuteNonQuery();
        }
    }
}
The problem is: each "day" takes about 2 hours to process, and we have a daily database backup that stops the database for about 10 minutes, which locks up my process.
So what do I do? I stop the process manually and start it again, skipping the dates already executed. But if I have 20 connections open, they stay open, so I have to kill those sessions every time. Is there a way to force all connections to dispose?
EDIT:
MyTable has 50 million rows, composed of ID | STATE | DATE; basically I have to cross those STATEs with their DATEs.
So the big delay is on the database side. It's a known issue that we have to refactor the entire database model, and we are going to do that soon.
But anyway, despite the processing time, if I could just manage (or force) the killing of the connections, it would be fine.
Any ideas?
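One avenue for the connection part is ODP.NET's pooling API, which can release pooled server sessions explicitly (a sketch; verify the behavior against your ODP.NET version):
using Oracle.DataAccess.Client; // or Oracle.ManagedDataAccess.Client

// ODP.NET keeps closed connections in its pool, so server sessions can
// outlive Dispose(). Clearing the pools closes them instead.
OracleConnection.ClearAllPools();
Alternatively, Pooling=false in the connection string avoids keeping sessions around at all, at the cost of reconnect overhead on every open.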
Okay, in order to kill a session you would need to involve a DBA, which, if you do it several times a week, would not make for a good friendship! As I and others pointed out, a database should not need to be brought down for a backup (except for a cold backup), or even for an export, although with these long merge processes a consistent export would take quite a while. So the first step is to (nicely) educate the DBAs on keeping the database in archivelog mode and having RMAN manage online backups.
Second, you need to improve the merge criteria so that the merging columns are indexed. It could well be that dateHappen is indexed, but since you are invoking a function on it in the merge criteria, that index cannot be used unless a function-based index is created. I am referring to the
TO_CHAR(dateHappen, 'DDMMYYYY') = {0}
specifically; and in general,
USING (select id from another_table where field = :xpe) d
ON ( TO_CHAR(dateHappen, 'DDMMYYYY') = {0} and id = :xId)
You should check whether id and field on another_table are indexed, and create a function-based index on dateHappen:
create index i_date_string on whatever_table (TO_CHAR(dateHappen, 'DDMMYYYY'))
For your database tuning, just invoke the merge statement by itself in a SQL tool to try different approaches; then you won't have to worry about killing merges half-way, which causes a lot of work in the database rolling back transactions, etc.
Update:
OK, I will answer the question instead of providing my solution. :)
You can create a profile and assign it to a user to limit sessions to a specific CONNECT_TIME or IDLE_TIME, or both. Then, if the user exceeds the limit, according to http://docs.oracle.com/database/121/SQLRF/statements_6012.htm#SQLRF01310:
If a user exceeds the CONNECT_TIME or IDLE_TIME session resource limit, then the database rolls back the current transaction and ends the session. When the user process next issues a call, the database returns an error.
So you can do this:
create profile MERGE_PROF limit idle_time 5 connect_time 86400;
alter user BATCH_MERGER_USER profile MERGE_PROF;
Then, once your session is created: if the session sits idle for 5 minutes, Oracle will kill it, and if it runs solidly for 24 hours (issuing commands less than 5 minutes apart for a full day), Oracle will kill it as well.
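One caveat: per the Oracle docs, profile resource limits such as CONNECT_TIME and IDLE_TIME are enforced only when the RESOURCE_LIMIT initialization parameter is TRUE (it defaults to FALSE in older releases):
ALTER SYSTEM SET RESOURCE_LIMIT = TRUE;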
I have three collections. First, a collection of days. Next, a collection of time spans in each day. These time spans are the same for each day. Next, I have a collection of sessions.
There are 4 days. There are 6 time spans. There are 30 sessions.
I need to iterate through each day, assigning all of the time spans to each day the same way for each day. However, I need to assign the sessions to time blocks in sequence. For example, day 1 gets all 6 time spans, but only the first 6 sessions, 1-6. Day 2 gets the same time spans, but gets the next 6 sessions, 7-12.
How can I do this within the same method?
Here's what I have so far, but I'm having trouble wrapping my head around the paged iteration part.
var timeSlots = TimeSlotDataAccess.GetItems(codeCampId);
// count the slots that can have sessions assigned (Where() alone would
// return a collection, not the count the name suggests)
var assignableSlotCount = timeSlots.Count(t => !t.SpanAllTracks);

// determine how many days the event lasts for
agenda.NumberOfDays = (int)(agenda.CodeCamp.EndDate - agenda.CodeCamp.BeginDate).TotalDays;

// iterate through each day
agenda.EventDays = new List<EventDayInfo>(agenda.NumberOfDays);
var dayCount = 0;
while (dayCount <= agenda.NumberOfDays)
{
    var eventDate = agenda.CodeCamp.BeginDate.AddDays(dayCount);
    var eventDay = new EventDayInfo()
    {
        Index = dayCount,
        Day = eventDate.Day,
        Month = eventDate.Month,
        Year = eventDate.Year,
        TimeStamp = eventDate
    };

    // iterate through each timeslot
    foreach (var timeSlot in timeSlots)
    {
        var slot = new AgendaTimeSlotInfo(timeSlot);

        // iterate through each session
        // first day gets the first set of assignableSlotCount sessions, then the
        // next iteration gets the next set of that count, etc.
        slot.Sessions = SessionDataAccess.GetItemsByTimeSlotId(slot.TimeSlotId, codeCampId).ToList();

        // iterate through each speaker
        foreach (var session in slot.Sessions)
        {
            session.Speakers = SpeakerDataAccess.GetSpeakersForCollection(session.SessionId, codeCampId);
        }
    }
    agenda.EventDays.Add(eventDay);
    dayCount++;
}
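The heart of the paging can be sketched with Skip and Take (variable names assumed):
// day index dayCount (0-based), page size = number of assignable slots per day
var sessionsForDay = allSessions
    .Skip(dayCount * assignableSlotCount)
    .Take(assignableSlotCount)
    .ToList();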
I ended up using LINQ in a new method based upon the GetItemsByTimeSlotId() method. The new signature and an example of getting a matching subset of that collection are below.
Here's how I'm calling it:
slot.Sessions = SessionDataAccess.GetItemsByTimeSlotIdByPage(slot.TimeSlotId,
codeCampId, dayCount + 1, timeSlotCount).ToList();
Here's what it looks like:
public IEnumerable<SessionInfo> GetItemsByTimeSlotIdByPage(int timeSlotId, int codeCampId, int pageNumber, int pageSize)
{
    var items = repo.GetItems(codeCampId)
        .Where(t => t.TimeSlotId == timeSlotId)
        .ToList();

    // Select() alone is lazy and its result was being discarded in my first
    // version; mutate the items directly so RegistrantCount is actually set
    foreach (var item in items)
    {
        item.RegistrantCount = GetRegistrantCount(item.SessionId);
    }

    // this is the important part: pageNumber is 1-based, so page 1 takes
    // sessions 1..pageSize, page 2 takes the next pageSize, and so on
    var resultSet = items.Skip(pageSize * (pageNumber - 1)).Take(pageSize).ToList();

    foreach (var item in resultSet)
    {
        item.Speakers = speakerRepo.GetSpeakersForCollection(item.SessionId, item.CodeCampId);
    }
    return resultSet;
}
I have added a custom attribute lastLogonTime with syntax UTC Coded Time. I extended the UserPrincipal class to GET/SET that custom attribute.
...
[DirectoryProperty("lastLogonTime")]
public DateTime? LastLogonTime
{
    get
    {
        object[] result = this.ExtensionGet("lastLogonTime");
        if (result != null && result.Count() > 0)
            return (DateTime?)result[0];
        return null;
    }
    set
    {
        this.ExtensionSet("lastLogonTime", value);
    }
}
...
I have also extended AdvancedFilters to be able to search by this custom attribute.
MyUserPrincipalSearch searchFilter;

new public MyUserPrincipalSearch AdvancedSearchFilter
{
    get
    {
        if (null == searchFilter)
            searchFilter = new MyUserPrincipalSearch(this);
        return searchFilter;
    }
}

public class MyUserPrincipalSearch : AdvancedFilters
{
    public MyUserPrincipalSearch(Principal p) : base(p) { }

    public void LastLogonTime(DateTime? lastLogonTime, MatchType mt)
    {
        this.AdvancedFilterSet("lastLogonTime", lastLogonTime.Value, typeof(DateTime?), mt);
    }
}
Now I would like to search for all users whose lastLogonTime is more than a day old.
using (PrincipalContext ctx = ADLDSUtility.Users)
{
    MyUserPrincipal filter = new MyUserPrincipal(ctx);
    filter.AdvancedSearchFilter.LastLogonTime(DateTime.Now - new TimeSpan(1, 0, 0, 0, 0), MatchType.LessThan);

    PrincipalSearcher ps = new PrincipalSearcher(filter);
    foreach (MyUserPrincipal p in ps.FindAll())
    {
        //my custom code
    }
}
The above search is not returning any results, even though I have test users who have not logged in for the last couple of days.
I have tried MatchType.GreaterThan and MatchType.Equals; none of them return any results either, yet there are users who match those criteria.
The only filter that does work is
filter.AdvancedSearchFilter.LastLogonTime(DateTime.Now , MatchType.NotEquals);
But this basically returns all users. Any ideas why my search is not returning any results?
My goal is to find all users whose last logon time is more than X days ago.
I'm open to other solutions as long as I get those users.
P.S. I do know a way around this: loop through all users, get their lastLogonTime, and then do a comparison. But that's overkill for what I'm doing.
After spending some time on this issue, I found the problem.
As I mentioned in my post, the custom attribute lastLogonTime has the syntax UTC Coded Time. However, the date and time is not actually stored as a DateTime; it's stored as a string in this format:
yyyyMMddHHmmss.0Z
I ended up solving the issue by modifying AdvancedSearchFilter.LastLogonTime to search using the formatted string:
public void LastLogonTime(DateTime? lastLogonTime, MatchType mt)
{
    const string lastLogonTimeFormat = "yyyyMMddHHmmss.0Z";
    this.AdvancedFilterSet("lastLogonTime",
        lastLogonTime.Value.ToUniversalTime().ToString(lastLogonTimeFormat),
        typeof(string), mt);
}
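With that change, the search from the question can stay exactly as written, e.g.:
// users whose lastLogonTime is more than one day old
filter.AdvancedSearchFilter.LastLogonTime(DateTime.Now.AddDays(-1), MatchType.LessThan);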
Hope this helps someone.