Tracking outgoing requests in Azure Functions - c#

As part of a microservice-based solution that we are building, we have a number of Azure Functions sitting in one Azure Function App. The functions orchestrate numerous requests to different APIs, some of which take a long time to complete. We added Application Insights to the functions to get some tracking of the requests made, but dependency tracking does not work yet in Azure Functions. It is possible to track dependencies manually, but that involves inserting tracking code around each dependency call, and we want to avoid manually tracking dependencies on each and every call.
One of the solutions I have thought of would be to create a request tracker that tracks all outgoing web requests from the functions. Within the request tracker I could then record each dependency request, including its duration. I want to hook the request tracker into some sort of web traffic handler, but unfortunately I was unable to find much about doing this. A lot of posts mention using the System.Net trace writer for this, but as far as I can see that requires a Web.config to set up, and Functions do not have one.
I have seen a few posts mentioning creating a request wrapper and placing that on my outgoing requests, but unfortunately that is not an option, as we use a number of packages that make requests internally. If you have any ideas that could get me going in the right direction, please let me know. Thanks.
Update:
I added the following helper method which allows me to manually track tasks as dependency requests
public static async Task<T> TrackDependency<T>(this Task<T> task, string dependencyName, string callName, string operationId)
{
    var telemetryClient = new TelemetryClient();
    var startTime = DateTime.UtcNow;
    var timer = System.Diagnostics.Stopwatch.StartNew();
    var success = true;
    T result = default(T);
    try
    {
        result = await task;
    }
    catch (Exception)
    {
        success = false;
        throw; // record the failure, but let the caller still see the exception
    }
    finally
    {
        timer.Stop();
        var dependencyTelemetry = new DependencyTelemetry(dependencyName, callName, startTime, timer.Elapsed, success);
        dependencyTelemetry.Context.Operation.Id = operationId;
        telemetryClient.Track(dependencyTelemetry);
    }
    return result;
}
It can then be used as follows:
await client.Accounts.UpdateWithHttpMessagesAsync(accountId, account).TrackDependency("Accounts", "UpdateAccounts", requestContextProvider.CorrelationId);
I can now see individual request dependencies in Application Insights, but the actual telemetry on them is obviously very limited; it does not contain path info or much else.
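One way to get richer entries would be to set more of the standard DependencyTelemetry properties before tracking. For example, the finally block in the helper could be extended along these lines (a sketch: requestUrl is a hypothetical extra parameter, not part of the helper above):
// Sketch: pass the request URL through so the dependency record carries
// path information. Type, Target, and Data are standard settable
// properties on DependencyTelemetry.
var dependencyTelemetry = new DependencyTelemetry(dependencyName, callName, startTime, timer.Elapsed, success)
{
    Type = "HTTP",                     // dependency type shown in Application Insights
    Target = new Uri(requestUrl).Host, // host the call was made against
    Data = requestUrl                  // full URL, including the path
};
dependencyTelemetry.Context.Operation.Id = operationId;
telemetryClient.Track(dependencyTelemetry);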

When you say dependency tracking is not working in Azure Functions, what exactly do you mean? Have you added and configured the Application Insights SDK in your function project yet? The out-of-the-box monitoring experience with Azure Functions doesn't automatically add dependency tracing, but if you add and configure the Application Insights SDK in your function project, it should start tracking everything going on in there.
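For illustration, manually wiring up the SDK's dependency collector might look roughly like this (a sketch assuming the Microsoft.ApplicationInsights.DependencyCollector NuGet package; APPINSIGHTS_INSTRUMENTATIONKEY is the app setting Functions uses for Application Insights):
using System;
using Microsoft.ApplicationInsights.DependencyCollector;
using Microsoft.ApplicationInsights.Extensibility;

// Sketch: run once at startup so outgoing HTTP (and SQL) calls are
// collected automatically as dependencies.
var configuration = TelemetryConfiguration.Active;
configuration.InstrumentationKey =
    Environment.GetEnvironmentVariable("APPINSIGHTS_INSTRUMENTATIONKEY");

var dependencyModule = new DependencyTrackingTelemetryModule();
dependencyModule.Initialize(configuration);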


Simple WebJob - process the response from a web link and save it to blob on a regular time interval - impossible to find an example or solution

I am looking for an example of a simple WebJob:
the task would be to process the response from a web link and save it to a blob on a regular time interval.
First of all, the MS documentation is confusing me as far as time triggers are concerned:
https://learn.microsoft.com/en-us/azure/app-service/webjobs-create#ncrontab-expressions
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer?tabs=csharp#example
And also, how exactly should I proceed in building the WebJob: should I use an Azure WebJob template (.NET 4.x.x), or a .NET Core console app?
https://learn.microsoft.com/en-us/azure/app-service/webjobs-sdk-how-to
https://github.com/Azure/azure-webjobs-sdk-samples/tree/master/BasicSamples
https://learn.microsoft.com/en-us/azure/app-service/webjobs-sdk-get-started
https://learn.microsoft.com/en-us/azure/app-service/webjobs-create
All these resources and no simple example for a time-scheduled task that would get a web response, plus the confusion about building the WebJob in VS, wth?? I want to build a C# app in VS and deploy it to Azure as a WebJob via Azure DevOps.
I've wasted 3 days on this since I'm not a .NET developer...
WebJobs have changed and grown over the years, including contributions from Azure Functions, which is itself built on top of the WebJobs SDK. I can see how this can get confusing, but the short answer is that all of the different methods are still valid; some are just newer than others. Of the two timer trigger styles, the second is the more current.
I generally recommend Functions over WebJobs for something like this, as it will save you some boilerplate code, but it is entirely up to you. As I mentioned, the foundations are very similar. You can deploy Functions apps to any App Service plan, including the Consumption plan; the Consumption plan is specific to Functions and is pay-per-use, instead of the monthly fee you would need for WebJobs.
As far as .NET Framework vs. .NET Core, what you can use will depend on what runtime you used to set up your App Service. If you have a choice, I would recommend Core, since that is the only version moving forward. If you elect to use Functions, you will definitely want to use Core.
As far as the console app question, all WebJobs are essentially console apps. From a code perspective, they are console apps that implement the WebJobs SDK; you could run them outside of Azure if you wanted to. Functions apps are different: the Functions host is what actually runs behind the scenes, and you are creating a class library that the host consumes.
Visual Studio vs. Visual Studio Code is very much a personal preference. I prefer VS for WebJobs and work with both VS and VS Code for Functions apps, depending on which language I am working in.
The most basic version of a WebJob in .NET Core that pulls data from a webpage on a schedule and outputs it to blob storage would look something like this. A Functions app would use exactly the same GetWebsiteData() method plus a [FunctionName("GetWebsiteData")] attribute at the beginning, but you wouldn't need the Main method, as that part is handled by the host process.
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;

public class Program
{
    static async Task Main(string[] args)
    {
        var builder = new HostBuilder();
        builder.ConfigureWebJobs(b =>
        {
            b.AddAzureStorageCoreServices();
            b.AddAzureStorage();
            b.AddTimers();
        });
        builder.ConfigureAppConfiguration((context, configurationBuilder) =>
        {
            configurationBuilder
                .AddJsonFile("appsettings.json", optional: true);
        });
        var host = builder.Build();
        using (host)
        {
            await host.RunAsync();
        }
    }

    // async Task rather than async void, so the WebJobs host can observe failures.
    public static async Task GetWebsiteData(
        [TimerTrigger("0 */1 * * * *")] TimerInfo timerInfo,
        [Blob("data/websiteData", FileAccess.Write)] Stream outputBlob,
        ILogger logger)
    {
        using (var client = new HttpClient())
        {
            var url = "https://microsoft.com";
            var result = await client.GetAsync(url);
            // You may need to do some additional work here to get the output format you want.
            // Copy the response into the bound blob stream; reassigning the
            // outputBlob parameter would not write anything to the blob.
            await result.Content.CopyToAsync(outputBlob);
        }
    }
}
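For comparison, the Functions version of the same method might look roughly like this (a sketch assuming the in-process Functions model, where the attributes come from Microsoft.Azure.WebJobs and the host replaces the Main method above):
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class GetWebsiteDataFunction
{
    // Same bindings as the WebJob; the Functions host discovers this method
    // via the FunctionName attribute, so no HostBuilder code is needed.
    [FunctionName("GetWebsiteData")]
    public static async Task Run(
        [TimerTrigger("0 */1 * * * *")] TimerInfo timerInfo,
        [Blob("data/websiteData", FileAccess.Write)] Stream outputBlob,
        ILogger logger)
    {
        using (var client = new HttpClient())
        {
            var result = await client.GetAsync("https://microsoft.com");
            await result.Content.CopyToAsync(outputBlob);
        }
    }
}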

Detect when a Windows service has been deleted

Is there a way to detect when a Windows service has been deleted? I've checked the event log, but it doesn't pick up delete actions, only additions.
I believe there may be a way using audit logs, but I'm unsure how to go about it.
Any help is much appreciated.
Thanks
While there is no trace of service deletion in the Event or Audit logs, what you can do is create a small console app that detects whether a service exists, and attach that app to the Windows Task Scheduler, scheduled to execute on a frequency or Trigger that you customize to your requirements, so that you receive an alert when a service has been added or removed.
The console app is designed such that on the first run it logs all the services on the system, and on subsequent runs it tracks changes made to the services via servicesRemoved and servicesAdded; with this we can decide what action to take when a service has been modified.
Console App: ServiceDetector.exe
using System;
using System.IO;
using System.Linq;
using System.ServiceProcess;

static void Main(string[] args)
{
    var path = @"C:\AdminLocation\ServicesLog.txt";
    // Queries the current services on the machine
    var currentServiceCollection = ServiceController.GetServices().Select(s => s.ServiceName).ToList();
    if (!File.Exists(path)) // No log file yet; assume this is the first run
    {
        using (var text = File.AppendText(path))
        {
            currentServiceCollection.ForEach((s) => text.WriteLine(s));
        }
        return;
    }
    // Fetches the recorded services from the log
    var existingServiceCollection = File.ReadAllLines(path).ToList();
    var servicesRemoved = existingServiceCollection.Except(currentServiceCollection).ToList();
    var servicesAdded = currentServiceCollection.Except(existingServiceCollection).ToList();
    if (!servicesAdded.Any() && !servicesRemoved.Any())
    {
        Console.WriteLine("No services have been added or removed");
        return;
    }
    if (servicesAdded.Any())
    {
        Console.WriteLine("One or more services has been added");
    }
    if (servicesRemoved.Any())
    {
        // Service(s) have been deleted; decide how to record this based on your requirements
        Console.WriteLine("One or more services has been removed");
    }
    // Rewrite the log so it reflects the current state for the next run
    File.WriteAllLines(path, currentServiceCollection);
}
Scheduling Task
Windows Start > Task Scheduler > Create Basic Task > Set Trigger > Attach your exe > Finish
You're right that deleting a Windows service doesn't cause an event to be added to the System event log (source: https://superuser.com/questions/1238311/how-can-we-detect-if-a-windows-service-is-deleted-is-there-an-event-log-id-for-i).
AFAIK there's no audit policy to audit the deletion of a service; if there were, I think it would be listed here: https://learn.microsoft.com/en-us/windows/security/threat-protection/auditing/basic-audit-process-tracking
I assume polling ServiceController.GetServices() is out of the question because your program might not be running when the service is uninstalled?
There are lots of ways to build instrumentation, until you learn what constitutes good instrumentation. My how-to is essentially taken directly from the Wikipedia entry https://en.wikipedia.org/wiki/Instrumentation.
Instrumentation How-to
http://www.powersemantics.com/e.html
Non-integrated
Primary data only
Pull not push
Organized by process
Never offline
The solution to the problem of measuring indicators exists, but you're stuck conceptualizing how to also have "push-based" instrumentation signal another system. As my E article explains, instruments should always pull data, never push it. Event-driven signalling is a potential point of failure you don't need.
To clear up any indecisiveness or doubts you may have about building a separate application: monitors are normally independent (non-integrated, as Wikipedia says) processes. So saying your monitor "might not be running" means you have not chosen to build a real non-integrated monitor, one which is always on. Your consumer system doesn't correctly model instrumentation, because it integrates the check into its own process.
Separate these responsibilities and proceed. Decide how often the instrument should reasonably poll for deleted services, and poll the data with a timer. If you use the API call simon-pearson suggested, you can also detect when services have been added. Of course, the monitor needs to locally cache a copy of the service list so that indicators can infer what's been added or removed.
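For illustration, a standalone monitor along those lines might look roughly like this (a sketch: the poll interval and console output are assumptions, not part of the answer above):
using System;
using System.Collections.Generic;
using System.Linq;
using System.ServiceProcess;
using System.Threading;

class ServiceMonitor
{
    static HashSet<string> cache;

    static void Main()
    {
        // Seed the local cache with the current service list.
        cache = new HashSet<string>(ServiceController.GetServices().Select(s => s.ServiceName));

        // Poll on a timer; the one-minute interval is an arbitrary choice for the sketch.
        using (var timer = new Timer(Poll, null, TimeSpan.Zero, TimeSpan.FromMinutes(1)))
        {
            Console.ReadLine(); // keep the monitor process alive
        }
    }

    static void Poll(object state)
    {
        var current = new HashSet<string>(ServiceController.GetServices().Select(s => s.ServiceName));
        foreach (var removed in cache.Except(current))
            Console.WriteLine($"Service removed: {removed}");
        foreach (var added in current.Except(cache))
            Console.WriteLine($"Service added: {added}");
        cache = current; // update the local cache for the next poll
    }
}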

Queuing for web api methods

I have one Web API method:
[HttpPost]
public void SaveData()
{
}
I want to implement this in such a way that, if the code inside the SaveData() method is already executing, the SaveData method will not accept any other call.
i.e. I want to keep calls to the SaveData method in a queue.
When one call finishes, the next call will occur.
Note: calling the SaveData() method is not within my scope. I cannot decide whether this API method is called synchronously or asynchronously.
The only thing within my scope is writing the code for the SaveData method. It will be called through an external system.
I strongly suggest that you use a message queuing system in your application. mjwills suggested two great message-broker solutions in the comments section:
RabbitMQ: open-source, lightweight, and easy to deploy on premises and in the cloud
Amazon Simple Queue Service (SQS): a fully managed, cloud-based solution provided by AWS
There's an integration effort before you can use any of these solutions in your app, but afterwards it will be as simple as:
[HttpPost]
public void SaveData()
{
    var msg = new SaveDataMessage();
    /* populate msg object... */
    this.queueClient.Publish(msg);
}
This pseudo-code publishes a message to the queue. On the other end, a queue subscriber will be receiving and processing these messages sequentially, for instance:
public void OnMessage(SaveDataMessage msg)
{
    /* process message... */
}
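For illustration, a minimal subscriber along those lines might look roughly like this (a sketch assuming the RabbitMQ.Client 6.x NuGet package; the queue name and host are placeholders, and deserializing into SaveDataMessage is left out):
using System;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

class Subscriber
{
    static void Main()
    {
        var factory = new ConnectionFactory { HostName = "localhost" }; // placeholder host
        using (var connection = factory.CreateConnection())
        using (var channel = connection.CreateModel())
        {
            channel.QueueDeclare(queue: "save-data", durable: true,
                exclusive: false, autoDelete: false, arguments: null);

            // Deliver one message at a time so SaveData work is serialized.
            channel.BasicQos(prefetchSize: 0, prefetchCount: 1, global: false);

            var consumer = new EventingBasicConsumer(channel);
            consumer.Received += (sender, ea) =>
            {
                var body = Encoding.UTF8.GetString(ea.Body.ToArray());
                Console.WriteLine($"Processing: {body}"); // deserialize into SaveDataMessage here
                channel.BasicAck(ea.DeliveryTag, multiple: false); // ack only after processing
            };
            channel.BasicConsume(queue: "save-data", autoAck: false, consumer: consumer);

            Console.ReadLine(); // keep the subscriber running
        }
    }
}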
Additional benefits are i) the subscriber can run in an independent process, which means your API request can return immediately while a background worker takes care of processing messages, and ii) this architecture remains compatible with load-balanced APIs as your app scales up.
It's definitely worth the effort of building this structure early on into your app. It will give you more flexibility and productivity in the long run.
I have written an article about this subject which contains more detailed, complementary information to this answer: Using queues to offload Web API

First Call to Web API is very slow C#

I have my ASP.NET Web API hosted on GoDaddy Windows shared hosting.
When I access my API from different devices/machines, it takes around 30 seconds for the first request; after that, it works fine.
What is the issue? Can I make my Web API run all the time? If so, how?
I have used the Entity Framework code-first approach. I face this issue every time I call the API from my website, which is:
Rs Travels - Go to holidays, click on domestic, see the slowness of the web api.
Is there any way I can improve the performance of the Web API?
If the API is not used often, it will take time on the first request to make things ready; it's the same if you restart IIS. Generally, things need to warm up.
Internally, we have a custom healthcheck system that calls specific URLs to monitor them, but as a consequence, it also keeps the service alive.
You could also do this fairly simply by creating a windows scheduler task locally, or on any server that simply calls the API periodically. It might be best to implement a specific Monitor method that performs any other keepalives that might be relevant.
Try this as an example: Open Website from windows scheduler
It would be kinda difficult to change this since you do not own the web server (and thus its pool). You could try to call the API before you actually need it (imagine a splash screen); then it will be ready when you actually need it. Of course, this will not work if you are calling the API from the initial page...
This worked for me !
https://stackoverflow.com/a/9474978/6426192
using System.Net;
using System.Threading;

static Thread keepAliveThread = new Thread(KeepAlive);

protected void Application_Start()
{
    keepAliveThread.Start();
}

protected void Application_End()
{
    keepAliveThread.Abort();
}

static void KeepAlive()
{
    while (true)
    {
        // Hit a lightweight page so IIS keeps the app pool warm.
        WebRequest req = WebRequest.Create("http://www.mywebsite.com/DummyPage.aspx");
        req.GetResponse().Dispose(); // dispose the response so connections aren't leaked
        try
        {
            Thread.Sleep(60000);
        }
        catch (ThreadAbortException)
        {
            break;
        }
    }
}
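One caveat: Thread.Abort is unsupported on .NET Core and later (it throws PlatformNotSupportedException), so on newer runtimes a timer is a safer way to get the same effect. A rough sketch (the URL is a placeholder, as above):
using System;
using System.Net.Http;
using System.Threading;

// Sketch: timer-based keep-alive with no Thread.Abort involved.
static readonly HttpClient client = new HttpClient();
static Timer keepAliveTimer;

protected void Application_Start()
{
    // Ping the dummy page immediately, then once a minute.
    keepAliveTimer = new Timer(
        _ => client.GetAsync("http://www.mywebsite.com/DummyPage.aspx"), // fire-and-forget ping
        null, TimeSpan.Zero, TimeSpan.FromMinutes(1));
}

protected void Application_End()
{
    keepAliveTimer?.Dispose(); // stop pinging when the app shuts down
}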

Combining multiple services in a single Azure web role

I have a web application hosted on Azure which spins up roles for a short period of time to perform a task. This task is necessary, so I can't do without it. I want to combine these services and assign them to a single role, so that when the service runs ten times in an hour, I don't get billed for ten hours of use.
I found a blog post and potential solution, created in 2012. However, this project is horribly out of date, uses MVC3, and uses packages that haven't been available for several years now. Trying to make it work is not feasible.
Is multiple services on a single role still a valid solution, or do I have to take another approach?
I agree with David Makogon; the web/worker role stuff hasn't changed in years. If you'd like to include background processing in your web role, you can still create the web role, override the Run() method in the WebRole class, and implement worker tasks with your logic in the Run() method. The following simple sample, which adds a message to a storage queue, works fine on my side; please refer to it.
using System;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        return base.OnStart();
    }

    public override void Run()
    {
        // Replace the code below with your logic
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse("my connection string");
        CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
        CloudQueue queue = queueClient.GetQueueReference("mymessage");
        while (true)
        {
            CloudQueueMessage message = new CloudQueueMessage("worker run at " + DateTime.UtcNow.ToString());
            queue.AddMessage(message);
            System.Threading.Thread.Sleep(60000);
        }
    }
}
