Combining multiple services in a single Azure web role - c#

I have a web application hosted on Azure, which spins up roles for a short period of time to perform a task. This task is necessary, so I can't do without it. I want to combine these services and assign them to a single role, so that when the service is run ten times in an hour, I don't get billed for ten hours of use.
I found a blog post and potential solution, created in 2012. However, this project is horribly out of date, uses MVC3, and uses packages that haven't been available for several years now. Trying to make it work is not feasible.
Is multiple services on a single role still a valid solution, or do I have to take another approach?

I agree with David Makogon; the web/worker role story hasn't changed in years. If you'd like to include background processing in your web role, you can still create the web role, override the Run() method in the WebRole class, and implement your worker logic there. The following simple sample, which adds a message to a storage queue, works fine on my side; please refer to it.
using System;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        return base.OnStart();
    }

    public override void Run()
    {
        // Replace the code below with your own background logic.
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse("my connection string");
        CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
        CloudQueue queue = queueClient.GetQueueReference("mymessage");
        queue.CreateIfNotExists();

        while (true)
        {
            CloudQueueMessage message = new CloudQueueMessage("worker run at " + DateTime.UtcNow.ToString());
            queue.AddMessage(message);
            System.Threading.Thread.Sleep(60000);
        }
    }
}

Related

Queuing for web api methods

I have one Web API method:
[POST]
public void SaveData()
{
}
I want to implement this in such a way that if the code inside the SaveData() method is already executing, the method will not accept any other call.
In other words, I want to keep calls to the SaveData method in a queue; when one call finishes, the next call will occur.
Note: calling the SaveData() method is not within my scope. I cannot decide whether this API method is called synchronously or asynchronously.
The only thing within my scope is writing the code for the SaveData method. It will be called through an external system.
I strongly suggest that you use a message queuing system in your application. mjwills provided two great message-broker software solutions in the comments section:
RabbitMQ: open-source, lightweight, and easy to deploy on premises and in the cloud
Amazon Simple Queue Service (SQS): a fully managed, cloud-based solution provided by AWS
There's an integration effort before you can use any of these solutions in your app, but afterwards it will be as simple as:
[POST]
public void SaveData()
{
    var msg = new SaveDataMessage();
    /* populate msg object... */
    this.queueClient.Publish(msg);
}
This pseudo-code publishes a message to the queue. On the other end, a queue subscriber will be receiving and processing these messages sequentially, for instance:
public void OnMessage(SaveDataMessage msg)
{
    /* process message... */
}
Additional benefits are: i) the subscriber can run in an independent process, which means your API request can return immediately while a background worker takes care of processing messages; and ii) this architecture remains compatible with load-balanced APIs as your app scales up.
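To illustrate point i), here is a minimal sketch of a subscriber running as its own console process, assuming RabbitMQ with its official .NET client (6.x); the queue name and the message handling are placeholders, not part of the original answer:

using System;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

class SaveDataWorker
{
    static void Main()
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        using (var connection = factory.CreateConnection())
        using (var channel = connection.CreateModel())
        {
            channel.QueueDeclare(queue: "save-data", durable: true,
                                 exclusive: false, autoDelete: false, arguments: null);

            // Process one message at a time so the SaveData work stays sequential.
            channel.BasicQos(prefetchSize: 0, prefetchCount: 1, global: false);

            var consumer = new EventingBasicConsumer(channel);
            consumer.Received += (sender, ea) =>
            {
                var json = Encoding.UTF8.GetString(ea.Body.ToArray());
                // Deserialize into a SaveDataMessage and process it here.
                channel.BasicAck(deliveryTag: ea.DeliveryTag, multiple: false);
            };

            channel.BasicConsume(queue: "save-data", autoAck: false, consumer: consumer);
            Console.ReadLine(); // keep the worker process alive
        }
    }
}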
It's definitely worth the effort of building this structure early on into your app. It will give you more flexibility and productivity in the long run.
I have written an article about this subject which contains more detailed, complementary information to this answer: Using queues to offload Web API

First Call to Web API is very slow C#

I have my ASP.NET Web API hosted on GoDaddy Windows shared hosting.
When I access my API from different devices/machines, it takes around 30 seconds for the first request; after that, it works fine.
What is the issue? Can I make my Web API run all the time? If so, how?
I have used the Entity Framework code-first approach. I face this issue every time I call the API from my website, which is:
Rs Travels - go to Holidays, click on Domestic, and see the slowness of the Web API.
Is there any way I can improve the performance of the Web API?
If the API is not used often, it will take time on the first request to get things ready; it's the same if you restart IIS. Generally, things need to warm up.
Internally, we have a custom healthcheck system that calls specific URLs to monitor them, but as a consequence, it also keeps the service alive.
You could also do this fairly simply by creating a windows scheduler task locally, or on any server that simply calls the API periodically. It might be best to implement a specific Monitor method that performs any other keepalives that might be relevant.
Try this as an example: Open Website from windows scheduler
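If you go the scheduler route, the task itself can be as small as the console pinger sketched below; the URL and the monitor endpoint are placeholders, assuming you add a cheap Monitor action to the API:

using System;
using System.Net.Http;

// Run this from Windows Task Scheduler every few minutes to keep the app pool warm.
class KeepAlivePinger
{
    static void Main()
    {
        using (var client = new HttpClient())
        {
            var response = client.GetAsync("https://www.example.com/api/monitor").Result;
            Console.WriteLine("{0}: {1}", DateTime.Now, (int)response.StatusCode);
        }
    }
}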
It would be kind of difficult to change this since you do not own the web server (and thus its application pool). You could try to call the API before you actually need it (imagine a splash screen), so it will be ready when you do need it. Of course, this will not work if you are calling the API from the initial page...
This worked for me!
https://stackoverflow.com/a/9474978/6426192
static Thread keepAliveThread = new Thread(KeepAlive);

protected void Application_Start()
{
    keepAliveThread.Start();
}

protected void Application_End()
{
    keepAliveThread.Abort();
}

static void KeepAlive()
{
    while (true)
    {
        // Request a cheap page so IIS keeps the application pool warm.
        WebRequest req = WebRequest.Create("http://www.mywebsite.com/DummyPage.aspx");
        req.GetResponse().Close();
        try
        {
            Thread.Sleep(60000);
        }
        catch (ThreadAbortException)
        {
            break;
        }
    }
}

Tracking outgoing requests in Azure Functions

As part of a microservice-based solution that we are building, we have a number of Azure Functions sitting in one Azure Function App. The functions orchestrate numerous requests to different APIs, some of which take a long time to complete. We added Application Insights to the functions to allow for some tracking of the requests made, but dependency tracking is not working yet in Azure Functions. It is possible to manually track dependencies, but that involves inserting tracking code around each dependency call, and we want to avoid manually tracking dependencies on each and every call.
One of the solutions I have thought of would be to create a request tracker that tracks all outgoing web requests from the functions. Within the request tracker I could then track the dependency requests, including their timing. I want to hook the request tracker into some sort of web traffic handler, but unfortunately I was unable to find much about doing this. A lot of posts mention using the System.Net trace writer for this, but as far as I can see this requires a Web.config to set up, and functions do not have one.
I have seen a few posts mentioning creating a request wrapper and placing that on my outgoing requests, but unfortunately that is not an option, as we use a number of packages that make requests internally. If you have any ideas that could get me going in the right direction, please let me know. Thanks
Update:
I added the following helper method, which allows me to manually track tasks as dependency requests:
public static async Task<T> TrackDependency<T>(this Task<T> task, string dependencyName, string callName, string operationId)
{
    var telemetryClient = new TelemetryClient();
    var startTime = DateTime.UtcNow;
    var timer = System.Diagnostics.Stopwatch.StartNew();
    var success = true;
    T result = default(T);
    try
    {
        result = await task;
    }
    catch (Exception)
    {
        success = false;
    }
    finally
    {
        // Record the awaited task as a dependency call, including its duration and outcome.
        timer.Stop();
        var dependencyTelemetry = new DependencyTelemetry(dependencyName, callName, startTime, timer.Elapsed, success);
        dependencyTelemetry.Context.Operation.Id = operationId;
        telemetryClient.Track(dependencyTelemetry);
    }
    return result;
}
It can then be used as follows:
client.Accounts.UpdateWithHttpMessagesAsync(accountId, account).TrackDependency("Accounts", "UpdateAccounts", requestContextProvider.CorrelationId);
I can now see individual request dependencies in Application Insights, but obviously the actual telemetry on them is very limited; it does not contain path info or much else.
So when you say dependency tracking is not working in Azure Functions, what exactly do you mean? Have you actually added and configured the Application Insights SDK in your function yet? The out-of-the-box monitoring experience with Azure Functions doesn't automatically add dependency tracing, but if you add and configure the Application Insights SDK in your function project, it should start tracking everything going on in there.
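For reference, wiring up the SDK's automatic dependency collection is roughly the sketch below. This is an assumption on my part: it relies on the Microsoft.ApplicationInsights.DependencyCollector package and an instrumentation key in app settings, and whether it captures everything inside the Functions host may vary, so the manual TrackDependency helper above remains a useful fallback.

using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DependencyCollector;
using Microsoft.ApplicationInsights.Extensibility;

// One-time setup shared by the functions in the app.
static class Telemetry
{
    public static readonly TelemetryClient Client;

    static Telemetry()
    {
        // Assumes the APPINSIGHTS_INSTRUMENTATIONKEY app setting is present.
        TelemetryConfiguration.Active.InstrumentationKey =
            Environment.GetEnvironmentVariable("APPINSIGHTS_INSTRUMENTATIONKEY");

        // Hook up automatic dependency collection (HTTP, SQL, etc.).
        var dependencyModule = new DependencyTrackingTelemetryModule();
        dependencyModule.Initialize(TelemetryConfiguration.Active);

        Client = new TelemetryClient();
    }
}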

Creating new AppDomain calling method in same class as the AppDomain is made

I want to start the potentially long-running thread below in its own AppDomain, to prevent the web server from aborting it during recycling. It compiles fine; however, at runtime I get this cryptic error:
Type is not resolved for member 'MyCore.MyWebService,MyCore,
Version=5.0.0.0, Culture=neutral, PublicKeyToken=null'.
How do I find out what member is not resolved?
Are there any better ways of running a long-standing thread in an MVC business service layer that does not get aborted by the server's recycling mechanism?
Here is the code:
namespace MyCore
{
    [Serializable]
    public class MyWebService : IMyWebService
    {
        // Method name is illustrative; the original snippet showed these
        // statements without their containing method.
        public void StartPopulate(long lDatabaseID)
        {
            AppDomain domain = AppDomain.CreateDomain("Domain");
            Thread.CurrentThread.Name = "MVCThread";
            domain.SetData("lDatabaseID", lDatabaseID);
            domain.DoCallBack(() =>
            {
                long lID = Convert.ToInt64(AppDomain.CurrentDomain.GetData("lDatabaseID"));
                Thread thread = new Thread(() =>
                {
                    PopulateTables(lID);
                });
                thread.Name = "DomThread";
                thread.Start();
            });
        }
    }
}
IIS is heavily optimised to respond very quickly to hundreds of small simultaneous requests and just isn't the right tool for what you're attempting. You can try to work around that but in the long term you'll be better off building a tool that is designed for long-running tasks. You've then got a pre-packaged solution the next time this problem arises.
The basic idea is to create an external application that does your background processing, with some way to pass tasks to it and get results back. I like using the database to communicate, as most web applications that need background processing already use a database. Add a 'tasks' table with {status, startedDateTime, finishedDateTime, parameters, etc.}, then write an external application that will periodically look for a new task, complete it and update the database. Your web site can poll the database for status, or your application could make an AJAX call to notify the web site when a job has completed (a small iframe in the web site header that shows waiting / completed tasks can be useful if someone will be waiting for the job to complete, and is easy to do).
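As a rough sketch of that external application (the connection string, table, and column names here are illustrative, not from the original answer), a console worker that polls the tasks table could look like this:

using System;
using System.Data.SqlClient;
using System.Threading;

class TaskWorker
{
    static void Main()
    {
        // Connection string is a placeholder.
        var connStr = "Server=.;Database=MyApp;Integrated Security=true";

        while (true)
        {
            using (var conn = new SqlConnection(connStr))
            {
                conn.Open();

                // Claim one pending task (0 = pending, 1 = running, 2 = done).
                var claim = new SqlCommand(
                    @"UPDATE TOP (1) Tasks
                      SET Status = 1, StartedDateTime = GETUTCDATE()
                      OUTPUT inserted.Id, inserted.Parameters
                      WHERE Status = 0", conn);

                long? id = null;
                string parameters = null;
                using (var reader = claim.ExecuteReader())
                {
                    if (reader.Read())
                    {
                        id = reader.GetInt64(0);
                        parameters = reader.GetString(1);
                    }
                }

                if (id.HasValue)
                {
                    // Do the actual work with 'parameters' here, then mark the task finished.
                    var done = new SqlCommand(
                        "UPDATE Tasks SET Status = 2, FinishedDateTime = GETUTCDATE() WHERE Id = @id", conn);
                    done.Parameters.AddWithValue("@id", id.Value);
                    done.ExecuteNonQuery();
                }
            }

            Thread.Sleep(TimeSpan.FromSeconds(30)); // poll interval
        }
    }
}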
EDIT: Before you do the above, review HangFire (which works inside IIS, as a Windows Service or as a console app). Same principles, but a pre-packaged solution. Note that I haven't implemented it yet, but it looks good.
Although it's a bit of work to set up, handing this task off to a Windows Service is a good approach if you might have multiple tasks and need them responded to quickly. There are a lot of tutorials on the web that will help you create a Windows Service, such as http://www.codeproject.com/Articles/106742/Creating-a-simple-Windows-Service but you'll have to build a simple task executor on top of that so if that's the way you want to go I'd look for a pre-built task engine (I couldn't find one quickly but I'm probably using the wrong search phrase).
But that's overkill if turn-around time isn't important and a better approach for you might be to create a small console application that will be started every five minutes by task scheduler. It would connect to the database, execute any waiting tasks then shut down again. This is easier to debug and install than a Windows service and achieves the same goal of moving the task execution out of IIS.
Remember that you still have to detect and handle Windows shutdown so that you don't get half-finished orphaned jobs - at the very least just tag that task as aborted and exit cleanly.
Alright, after having mucked around with Hangfire, I finally got it to work in .NET 4.0 and MVC 3. I had to install Common.Logging.Core 2.2.0, since NuGet installed the wrong version (3.3.0).
In my initial controller I added the following:
namespace Core.Controllers
{
    ...
    public void Configuration(IAppBuilder app)
    {
        app.UseHangfire(config =>
        {
            config.UseSqlServerStorage(ConnectionString.GetTVConnectionString());
            config.UseServer();
        });
    }
    ...
}
ConnectionString.GetTVConnectionString() gets the connection string from the config file.
Up top I added the following
[assembly: OwinStartup(typeof(Core.Controllers.BaseController))]
In the code that starts the background thread I added the following, passing in a long instead of the class and having the job load the POCO class from the db.
BackgroundJob.Enqueue(() => PopulateTables(lDatabaseID, JobCancellationToken.Null));
The Enqueue() function returns a job id that can later be used to cancel the job, if needed, through the BackgroundJob.Delete(jobId) function.
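For illustration, capturing and later cancelling the job looks roughly like this (the variable names are just placeholders):

// Enqueue returns the job id as a string.
string jobId = BackgroundJob.Enqueue(() => PopulateTables(lDatabaseID, JobCancellationToken.Null));

// ... later, if the work is no longer needed:
BackgroundJob.Delete(jobId);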
In the job method I then have this
while (idxMin < max)
{
    try
    {
        cancellationToken.ThrowIfCancellationRequested();
        ....
    }
    catch (JobAbortedException jobEx)
    {
        ....
    }
}
It's important to use dependency injection, so my class had a parameterless constructor added that re-reads the connection string rather than having it passed in.
public MyWebService()
    : this(ConnectionString.GetTVConnectionString())
{
}

public MyWebService(string sConnStr)
{
    msConnStr = sConnStr;
}
After that it seems to run pretty well. A number of tables are added to the database specified in the connection string. So far it seems like the jobs survive recycling on the web server.

Schedule a job in hosted web server

Can someone give me the best way to implement a daily job with .NET technology?
I have an ASP.NET application with a SQL Server database hosted on shared hosting, GoDaddy in my instance.
My application is used to add / change the data in the database, and it is performing quite fairly at this time.
I have a new requirement to send some email alerts daily, based on some data criteria stored in the database.
Initially I thought to write a Windows service, but GoDaddy does not allow the database to be accessed by anything other than its hosted applications.
Does anyone have an idea of how to send alerts daily at 1:00 AM?
Thanks in advance
See Easy Background Tasks in ASP.NET by Jeff Atwood.
Copy/paste from the link:
private static CacheItemRemovedCallback OnCacheRemove = null;

protected void Application_Start(object sender, EventArgs e)
{
    AddTask("DoStuff", 60);
}

private void AddTask(string name, int seconds)
{
    OnCacheRemove = new CacheItemRemovedCallback(CacheItemRemoved);
    HttpRuntime.Cache.Insert(name, seconds, null,
        DateTime.Now.AddSeconds(seconds), Cache.NoSlidingExpiration,
        CacheItemPriority.NotRemovable, OnCacheRemove);
}

public void CacheItemRemoved(string k, object v, CacheItemRemovedReason r)
{
    // do stuff here if it matches our taskname, like WebRequest
    // re-add our task so it recurs
    AddTask(k, Convert.ToInt32(v));
}
I haven't used GoDaddy for anything other than domain registration, so I have no experience with what you can or cannot do on their hosting platform. I also don't know what their support or knowledge base is like, but I'd say your best option is to ask GoDaddy what they recommend. Otherwise, you might end up implementing something that's technically feasible but blocked by the hosting company.
If it's not something that's a prime-time application, one quick and dirty thing to do is to have some kind of external bot calling a (secure) web page on the server that fires off the notification process. Not a real solution, but if this site is just a hobby of yours, it could get you by until you find something the host will allow.
Might also be a good time to find a new host, if this one is not meeting your requirements. There are lots of good ASP.NET hosts available these days.
You can use the Windows scheduler on the web server to schedule a stored procedure call that sends mail based on particular criteria.
osql.exe -S servername -d database -U username -P password -Q "EXEC spAlertOnCriteria"
References:
osql
Task Scheduler
Many hosting providers can request a URL for you every X minutes. I don't know if GoDaddy does, but if so, you could create an ASMX page that kicks off the job, and tell them to execute it automatically.
If they don't, one solution might be to fire off the job in a background thread at every page request. If you do that, make sure you put in code that limits it to running every X minutes or more (perhaps using a static variable or a database table) - read this story
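A rough sketch of the static-variable throttle mentioned above (method and variable names are illustrative, and SendPendingAlerts is a placeholder for the actual notification code) might be:

private static readonly object _alertLock = new object();
private static DateTime _lastRun = DateTime.MinValue;

// Call this from e.g. Application_BeginRequest; only one request per interval does any work.
private static void MaybeRunDailyJob()
{
    if (DateTime.UtcNow - _lastRun < TimeSpan.FromMinutes(60))
        return;

    lock (_alertLock)
    {
        if (DateTime.UtcNow - _lastRun < TimeSpan.FromMinutes(60))
            return;
        _lastRun = DateTime.UtcNow;
    }

    // Kick the actual work off on a background thread so the request returns quickly.
    System.Threading.ThreadPool.QueueUserWorkItem(_ => SendPendingAlerts());
}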
If you can expose a service on the website hosting the application and database -- authenticated service, of course -- then you can hit that service remotely from any box with credentials, pull down the data, and send the mail that way.
This could be an automated process written as a Windows service, an application that is run under the Scheduler, or some button you push at 1:00 AM. Your pick.
Just because the app is the only thing that can access the database doesn't mean you can't expose the data in other ways.
Use either System.Timers or System.Threading to create an instance that runs at a predetermined time. Have that thread execute whatever task you want... Make sure the code is thread-safe!
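If you can keep the process alive, a minimal sketch with System.Timers.Timer that fires once a day at 1:00 AM could look like this (SendAlerts is a placeholder for the actual mail logic, and in a hosted web app this still dies when the application pool recycles):

using System;
using System.Timers;

static class DailyJob
{
    private static Timer _timer;

    public static void Start()
    {
        ScheduleNextRun();
    }

    private static void ScheduleNextRun()
    {
        DateTime now = DateTime.Now;
        DateTime nextRun = now.Date.AddHours(1);   // today at 1:00 AM
        if (nextRun <= now)
            nextRun = nextRun.AddDays(1);          // already past 1 AM, so run tomorrow

        _timer = new Timer((nextRun - now).TotalMilliseconds) { AutoReset = false };
        _timer.Elapsed += (s, e) =>
        {
            try { SendAlerts(); }
            finally { ScheduleNextRun(); }         // re-arm for the following day
        };
        _timer.Start();
    }

    private static void SendAlerts()
    {
        // Query the database for matching criteria and send the emails here.
    }
}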

Categories

Resources