I'm using the latest versions of Hangfire (1.6.17) and Hangfire.Mongo (0.5.5) and I'm having trouble sorting the Hangfire failed-jobs logs.
Currently they're sorted in ascending order (older logs appearing first) and I'd like to have them sorted in descending order.
I've seen screenshots where people had their logs properly sorted by timestamp, so I know it's possible, but I didn't find anything useful in the documentation.
Any help is much appreciated!
Thanks
As Alex suggested in the comment, this is indeed a sorting bug in the Hangfire.Mongo library, which will be fixed in one of the next releases. This was confirmed by the devs on the library's GitHub page, and you can access the issue via the link in the comment above.
A temporary workaround is to log failed jobs via a custom logger class, to avoid spending considerable time searching through the dashboard logs.
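As a rough sketch of that idea, assuming Hangfire's standard job-filter API (the attribute name and the logging target are placeholders, not anything from the question):

```csharp
using Hangfire.Common;
using Hangfire.States;
using Hangfire.Storage;

// Hypothetical filter that logs failed jobs somewhere greppable,
// so you don't have to dig through the dashboard.
public class LogFailureAttribute : JobFilterAttribute, IApplyStateFilter
{
    public void OnStateApplied(ApplyStateContext context, IWriteOnlyTransaction transaction)
    {
        if (context.NewState is FailedState failed)
        {
            // Replace with your logger of choice (log4net, Serilog, ...)
            System.Diagnostics.Trace.TraceError(
                "Job {0} failed at {1:u}: {2}",
                context.BackgroundJob.Id, failed.FailedAt, failed.Exception);
        }
    }

    public void OnStateUnapplied(ApplyStateContext context, IWriteOnlyTransaction transaction) { }
}

// Registered once at startup:
// GlobalJobFilters.Filters.Add(new LogFailureAttribute());
```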
I wonder how to set up Serilog.Sinks.File to produce this:
log.txt <-- current log
log20200704.txt <-- rolled over yesterday log
log20200703.txt
instead of:
log20200705.txt <-- current log
log20200704.txt <-- rolled over yesterday log
log20200703.txt
I have been used to this behavior since my log4net days.
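For context, a minimal configuration that produces the second (dated) layout above looks roughly like this, assuming Serilog.Sinks.File with rolling enabled; the point is that the sink appends the period to every file name, including the current one:

```csharp
using Serilog;

// With rolling enabled, today's file is named logYYYYMMDD.txt too -
// there is no fixed "log.txt" for the current file.
var log = new LoggerConfiguration()
    .WriteTo.File("log.txt",
        rollingInterval: RollingInterval.Day,
        retainedFileCountLimit: 31)
    .CreateLogger();

log.Information("Hello");
log.Dispose();
```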
This is currently not supported by Serilog.Sinks.File, and there are no plans to support it in the short term. You can see a long discussion around this in the issue linked below:
Fixed filename with rolling archive files #40
You can see an initial attempt to add this feature as a separate package (though it's still early days and has known limitations) on this repository: https://github.com/dfacto-lab/serilog-sinks-file
Of course, you can always roll your own version of Serilog.Sinks.File that adds the behavior you're looking for.
Other, related links:
Allow option to keep root log file as most recent #115
Properly rotate log files #667
Serilog does not support this feature yet.
You can find the discussion here. There is also a workaround in place :)
https://github.com/serilog/serilog-sinks-file/issues/40
So I have a C# (ASP.NET) based dashboard for a proprietary content management system. One of the things the dashboard allows is for the user to go in and add custom CSS/Sass to their site. When they do, my controller calls a program that compiles the Sass using NSass.Core.
Up until now, I have been using Foundation 5 as my responsive framework. Yesterday when attempting to update my controller to allow for Foundation 6 compilation, it started throwing errors. The errors were occurring every time the compiler attempted to parse a sass map (associative array).
I started doing some research into the problem and found out that Sass maps are a relatively new mechanic in Sass, and NSass was last updated three years ago, so I am assuming this is the problem.
Has anyone had a similar experience? If so, what was your solution? If not, does anyone use anything else that would work for me? I have tried installing a couple of other packages, but started receiving various other errors, such as libsassnet not being able to find the 32-bit DLL. Hopefully someone here can give me an answer that saves me some time.
The errors I have received when using NSass were all along the lines of "error reading values after primary", where primary is the first value in the first map the compiler comes across. When I take that map out, it just moves on to the next one and gives the same error.
To narrow my question down: I just want to know what other people out there are using to compile Sass in C#.
There is a NuGet package, Bundle Transformer: Sass and SCSS, which is a provider for Bundle Transformer. This, in turn, is an extension of System.Web.Optimization that could allow you to add code to your CMS to compile user-generated SCSS into CSS files.
An example of this can be found in the Optimus package for the Umbraco CMS. Looking through this code could give you a good basis for creating your own system. If you speak with the author of the package (a really nice guy), he might be able to help you create your own targeted package that isn't dependent on Umbraco.
Hope that helps.
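As a separate, hedged suggestion (not part of the answer above): libsass-based wrappers handle modern Sass features such as maps. For example, with the LibSassHost package (plus its native libsass runtime package), a compile of a map-using snippet might look like this; the names are from that package's API, and the SCSS is just an illustration:

```csharp
using LibSassHost;

// Compile a SCSS string that uses a sass map - the feature NSass chokes on.
const string scss = @"
$palette: (primary: #336699, accent: #cc3333);
.button { color: map-get($palette, primary); }";

CompilationResult result = SassCompiler.Compile(scss);
string css = result.CompiledContent;
```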
I want to write an algorithm or parser that gets a site's position in Google search results. The issue is that every time Google's page layout changes, I will have to correct/change the algorithm. Do you think it will change often? Are there any techniques/advice/tricks for determining a site's position in Google?
How can I make robust position detection algorithm?
I want to use C#, .NET 2.0, and HtmlAgilityPack for this purpose. Any advice or suggestions will be much appreciated. Thanks in advance, guys!
POST UPDATE
I know that Google will show a captcha to prevent automated queries. I have a special service for that, which will recognize any captcha. Could you guys tell me about your experience with scraping exact results?
Google offers a plethora of APIs to access its services. For searching, there's the Custom Search API.
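A request against the Custom Search JSON API is just an HTTP GET. The sketch below builds the request URL; the API key and search-engine id (`cx`) are placeholders you obtain from the Google developer console:

```csharp
using System;

static class CustomSearch
{
    // Builds a Custom Search JSON API request URL.
    // apiKey and engineId (cx) come from the Google developer console.
    public static string BuildUrl(string apiKey, string engineId, string query)
    {
        return "https://www.googleapis.com/customsearch/v1"
             + "?key=" + Uri.EscapeDataString(apiKey)
             + "&cx=" + Uri.EscapeDataString(engineId)
             + "&q=" + Uri.EscapeDataString(query);
    }
}
```

You would then fetch the URL (e.g. with `System.Net.Http.HttpClient.GetStringAsync`) and read the `items` array out of the JSON response.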
I asked about this a year ago and got some good answers. Definitely the Agility Pack is the way to go.
In the end we coded up a rough scraper which did the job and ran without any problems. We were hitting Google relatively lightly (about 25 queries per day). We took the precaution of randomising 1) the order of the queries, 2) the time of day, and 3) the time paused between queries. I don't know if any of that helped, but we were never hit by a captcha.
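That randomisation is simple to sketch; the snippet below is illustrative only (names and ranges are made up, not the actual code we ran):

```csharp
using System;
using System.Collections.Generic;

static class QueryScheduler
{
    static readonly Random Rng = new Random();

    // Fisher-Yates shuffle: randomize the order the queries run in.
    public static IList<string> Shuffle(IList<string> queries)
    {
        var copy = new List<string>(queries);
        for (int i = copy.Count - 1; i > 0; i--)
        {
            int j = Rng.Next(i + 1);
            (copy[i], copy[j]) = (copy[j], copy[i]);
        }
        return copy;
    }

    // Random pause between queries, e.g. 30-120 seconds,
    // to be used with Thread.Sleep or a timer.
    public static TimeSpan NextPause() => TimeSpan.FromSeconds(Rng.Next(30, 121));
}
```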
We don't bother with it much now.
Its main weaknesses were/are:
we only bothered to check the first page (we could perhaps have coded an enhanced version which looked at the first X pages, but maybe that would have been a higher risk in terms of being detected by Google).
its results were unreliable and jumped around. You could be 8th every day for weeks, except for a single random day when you were 3rd. Perhaps ... the whole idea of carefully taking a daily or weekly reading and logging our ranking is too flawed
To answer your question about Google breaking your code: Google didn't make a fundamentally breaking change in all the months we ran it, but they did change something which broke the "snapshot" we were saving of the result (maybe a CSS change?), which did nothing to improve the credibility of the results.
We went through this process a few months back. We tried the APIs mentioned above, and the results were not even close to the actual search results. (Google for this; there's lots of information.)
Scraping the page is a problem: Google seems to change the markup every few months, and also has checks in place to work out whether or not you are human.
We eventually gave up and went with one of the commercially available (And updated often) bits of kit.
I've coded a couple of projects on this, parsing organic results and AdWords results. HTML Agility Pack is definitely the way to go.
I was running a query every 3 minutes, I think, and that never triggered a CAPTCHA.
Regarding the formatting changing, I was picking up on the ID of the UL (talking from memory here), and that only changed once in around a year (organic and AdWords).
As mentioned above though, Google doesn't really like you doing this! :-)
I'm pretty sure that you will not easily get access to Google search results. They are constantly trying to stop people from doing it.
If you're thinking about screen scraping, be aware that they will start displaying a captcha and you won't be able to get anything.
I have a number of applications running on top of ASP.NET I want to monitor. The main things I care about are:
Exceptions: We currently have some custom code which emails us when an exception occurs. If the application is failing hard, it will crash our Outlook... I know of (and use) ELMAH, which partly solves the problem; however, it is still just a big table of exceptions with a pretty(ish) UI. I want something that makes sense of all of these exceptions (e.g. groups them, alerts when new ones occur, tells me which common ones I should fix, etc.).
Logging: We currently log to files which are then accessible via a shared folder which devs grep & tail. Does anyone know of better ways of presenting this information? In an ideal world, I want to associate it with exceptions.
Performance: request times, memory usage, CPU, etc. Whatever stats I can get.
I'm guessing this is probably going to be solved by a number of tools, has anyone got any suggestions?
You should take a look at Gibraltar. I haven't used it myself, but it looks very good! It also works with NLog and log4net, so if you use those you are in luck!
Well, we have exactly the same current solution. Emails upon emails litter my inbox and mostly get ignored. Over the holidays an error caused everyone in dev to hit their inbox limit ;)
So where are we headed with a solution?
We already generate classes for all our exceptions; they rarely get used in more than one place. This was essentially step one, but we started the code base this way.
We modified the generator to create unique HRESULT values for all exceptions.
Again we added a generator to create a .MC message resource file for every exception.
Now every exception can write itself to the Windows Event Log, and thus we removed all emails etc, and rely on the event log.
Now with an event log full of information, including unique message codes and parameters for each exception, we can use off-the-shelf tools to aggregate, monitor, and alert.
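Writing the entry itself is just the standard System.Diagnostics API. A sketch (the source name and event id below are placeholders; creating an event source requires admin rights and is normally done once at install time):

```csharp
using System.Diagnostics;

const string source = "MyApp";  // placeholder event source name

// One-time registration, usually performed by the installer:
if (!EventLog.SourceExists(source))
    EventLog.CreateEventSource(source, "Application");

// Each generated exception writes itself with its unique message id:
EventLog.WriteEntry(source,
    "MyCustomException: some exception text",
    EventLogEntryType.Error,
    eventID: 0x1001);
```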
The exception generator (before modifications above) is located here:
http://csharptest.net/browse/src/Tools/Generators
It integrates with visual studio by replacing the ResX generator with this:
http://csharptest.net/browse/src/Tools/CmdTool
I have not yet published the MC generator, or the HRESULT generation; however, it will be available in the above locations when I get the time.
-- UPDATE --
All the tools and source are now available online for this. So where do I go from here?
Download the source or binaries from: http://code.google.com/p/csharptest-net/
Take a look at the help for CmdTool.exe Visual Studio Integration
Then review the help on Generators for ResX and MC files, there are several ways to generate MC files or complete message DLLs from ResX files. Pick the approach that fits you best.
Run CmdTool.exe REGISTER to register the tool with Visual Studio
Create a ResX file as normal, then change the custom tool to CmdTool
You will need to add some entries to the ResX file. At a minimum, create the following:
".AutoLog" = true
".NextMessageId" = 1
".EventSource" = "HelloWorld"
"MyCustomException" = "Some exception text"
Other settings are demonstrated in the NUnit test: http://csharptest.net/browse/src/Tools/Generators/Test/TestResXAutoLog.cs#80
Now you should have an exception class being generated that writes log events. You will need to build the MC file as a pre-/post-build action with something like:
CSharpTest.Net.Generators.exe RESXTOMESSAGEDLL /output=MyCustomMessages.dll /input=TheProjectWithAResX.csproj
Lastly, you need to register it: run the framework's InstallUtil.exe on MyCustomMessages.dll.
That should get you started until I get time to document it all.
One suggestion from Ryans Roberts that I really like is Exceptioneer, which seems to solve my exception woes at least.
I would first go for log4net. The SmtpAppender can wait for N exceptions to accumulate before sending an email, which avoids crashing Outlook. And log4net also logs to log files that can be stored on network drives, read with cat and grep, etc.
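A sketch of such an SmtpAppender configuration (the addresses, host, and thresholds are placeholders): with lossy off, events are held until the buffer of 50 fills, and only a FATAL event forces an immediate flush:

```xml
<appender name="SmtpAppender" type="log4net.Appender.SmtpAppender">
  <to value="dev-team@example.com" />
  <from value="app@example.com" />
  <subject value="Application errors" />
  <smtpHost value="smtp.example.com" />
  <bufferSize value="50" />   <!-- hold up to 50 events before emailing -->
  <lossy value="false" />
  <evaluator type="log4net.Core.LevelEvaluator">
    <threshold value="FATAL" /> <!-- flush immediately only on FATAL -->
  </evaluator>
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date %-5level %logger - %message%newline" />
  </layout>
</appender>
```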
As for stats, you can perform health/performance logging with the same tools, i.e. spawn a thread that logs CPU usage etc. every minute.
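For instance, a minimal sketch using Windows performance counters (here `log` stands in for your configured log4net ILog instance):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Log basic health stats once a minute.
var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
cpu.NextValue();  // the first sample always reads 0; prime the counter

var timer = new Timer(_ =>
{
    log.InfoFormat("CPU: {0:F1}%  WorkingSet: {1} MB",
        cpu.NextValue(),
        Environment.WorkingSet / (1024 * 1024));
}, null, TimeSpan.FromMinutes(1), TimeSpan.FromMinutes(1));
```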
I don't have a concrete answer for the first part of the question, since it implies automated log analysis and coalescence. At university, we made a tool designed to do some of these things, but it doesn't apply to your scenario (though it's two-way integrated with log4net).
In terms of handled exceptions or just typical logging, L4NDash is worth a look. I always set up our log4net to not only write out text files but also append to the database. That way L4NDash can analyse it easily. It'll group your errors and let you see where bad things are occurring a lot. You get one free dev license.
With ELMAH we just pull down the logs periodically. It can export as CSV, and then we use Excel to filter/group the data. It's not ideal, but it works. It would be nice to write a script to automate this a bit more. I've not seen much else out there for ELMAH.
You can get some metrics on request times (and anything else that's saved) by running LogParser over the IIS logs.
We have built a simple monitoring app that sits on the desktop and flashes red when there is either an exception written to the event log from one of the apps on the server, or an error written to the error log table in the database. It also monitors database health, checking fragmentation and the like.
The only problem we have with this is that it can be a little intrusive on the desktop, as it keeps popping up with a red message box if there is a problem. However, it does encourage you to fix issues ASAP.
We currently have this running on several of the developers' machines. The improvement we are thinking of making is to have one monitoring app running on a server that publishes an RSS feed, so that the app is only checking once, in one place, but we can consume the information from anywhere using whichever method we choose at the time (such as through our phones when we aren't in the office).
You can have an RSS feed select from your Exceptions table (and other things).
Then you can subscribe to the RSS feed in MS Outlook or on any smart phone. I use an RSS feed reader called NewsRob because it alerts me when there is something new.
I blog about how to do this HERE.
As a related step, I found a way to notify myself when something DIDN'T happen. That blog is HERE.
OK,
I was eagerly awaiting the release of SubSonic 3.0 to use as my low-level data layer, and now it's out. I'm currently using the ActiveRecord templates (having tried both the repository and advanced templates), and I have one HUGE request and a few questions:
Request: Other than bug fixes, Rob, please spend the time to provide documentation. I don't mean 5 examples; I mean complete API documentation. Here's why:
I'm testing SubSonic by writing ASP.NET MembershipProvider and RoleProvider classes, and simple questions continually slow me down when using SubSonic:
Q. Assuming I have a class 'User' and I update/save/delete a record using
user.Save();
I need information on how to determine success/failure. Do I look for an exception on failure, or can I get a count of 'affected' records (old school)?
Q. If I get an exception, which exception(s) can I expect?
I'll have more issues, but I really believe a good functional API documentation would solve the problem.
If the answer is 'read the source code', then I'm sure you're going to chase quite a few developers away from SubSonic. I really want to use the library, but the point is to "use" the library, not reverse engineer it.
-Jeff
Q. I need information on how to determine success/failure. Do I look for an exception on failure, or can I get a count of 'affected' records (old school)?
If it doesn't throw an exception, then it worked.
Q. If I get an exception, which exception(s) can I expect?
You can expect a DbException.
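So, inside a provider method, the check reduces to a try/catch; the sketch below is illustrative, with `user` standing in for a generated ActiveRecord instance:

```csharp
using System.Data.Common;

try
{
    user.Save();  // no exception means the save succeeded
}
catch (DbException ex)
{
    // Provider-level failures surface as DbException; translate it
    // into your MembershipProvider's failure status here.
}
```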
We don't use custom exceptions. I spent 5 weeks writing docs, so yeah, I did spend some time on this. You can find your answer there as well: http://subsonicproject.com/docs
3.0 is a little too buggy for me so far. I think I am going back to 2.x for now. Thanks for all the hard work, though.