I have a C# console application that runs on an Azure VM. The application runs OK and does what it needs to: it connects to an Azure Storage account, downloads a file of MAC addresses, and creates an SNMP trap for each MAC address, which it sends to an Azure Load Balancer. This all works fine.
Is there a way I can call and monitor a C# console application from an ADF pipeline and run it as a serverless application on Azure, maybe using Azure Functions, Azure Batch, or another Azure service? Can you recommend the best Azure service/technology to use?
The Azure serverless solution will need to be able to connect to an Azure Storage account and Key Vault, so permissions are something I need advice on.
NB: this C# application is not a big-data application and does not require massive parallel processing.
Regards
I have found a solution to this problem.
The C# console application can be run via Azure Batch on a node. A node is a VM managed by Azure Batch.
Azure Data Factory has a component (the Custom activity) that can call Azure Batch, which runs the C# console app.
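On the permissions question, one common pattern (a sketch, not part of the accepted answer) is to give the Batch pool a managed identity with the "Storage Blob Data Reader" and "Key Vault Secrets User" roles, then authenticate with DefaultAzureCredential so the same code works both locally and on Azure. The vault, storage account, container, and secret names below are made up for illustration:

```csharp
// Sketch: Azure AD auth for Storage + Key Vault via a managed identity.
// Assumes the NuGet packages Azure.Identity, Azure.Security.KeyVault.Secrets,
// and Azure.Storage.Blobs. All resource names here are illustrative.
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;
using Azure.Storage.Blobs;

// Picks up the managed identity on Azure, or your az login locally.
var credential = new DefaultAzureCredential();

// Read a secret (e.g. an SNMP community string) from Key Vault.
var secrets = new SecretClient(new Uri("https://my-vault.vault.azure.net/"), credential);
string community = secrets.GetSecret("snmp-community").Value.Value;

// Download the MAC address file from blob storage.
var blob = new BlobClient(
    new Uri("https://mystorage.blob.core.windows.net/input/macs.txt"), credential);
string text = (await blob.DownloadContentAsync()).Value.Content.ToString();

// One trap per MAC address line.
string[] macs = text.Split('\n',
    StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries);
```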
Related
I'm using the Quartz Enterprise Scheduler .NET for my jobs. Can I port my console application, wrapped in Quartz, to an Azure Web App? I would just need to install it as a Windows service and configure a port to listen on in Azure. I don't want to use an Azure VM because I don't want to configure the server, install updates, etc.
Think of Web Apps as a managed IIS instance: you can't access the host OS at all, so there's no installing things or configuring ports.
You might take a look at Cloud Services, which are a kind of hybrid of Web Apps and VMs.
You can use Azure WebJobs, which run in the same context as an Azure Web App. WebJobs accept console applications, and also .sh, .js, and other formats. You probably won't need to change much, if anything, in your code.
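To illustrate: a continuous WebJob can be a plain console Main loop; you deploy the build output under App_Data/jobs/continuous/<jobname> in the Web App. This is a sketch, and the heartbeat message and interval are just placeholders:

```csharp
// Minimal console shape that runs unchanged as a continuous Azure WebJob.
using System;
using System.Threading;

class Program
{
    // Separated out so the log line format is easy to test.
    static string Heartbeat(DateTime utc) => $"{utc:O} running scheduled jobs...";

    static void Main()
    {
        while (true)
        {
            Console.WriteLine(Heartbeat(DateTime.UtcNow)); // shows up in the WebJob log
            Thread.Sleep(TimeSpan.FromMinutes(1));
        }
    }
}
```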
I've managed to get logstash working as a proof of concept, locally on my Windows setup. Here's an overview of the setup:
My .NET app writes messages to a logger.txt file, through NLogger.
I have logstash running as a windows service after following this guide (https://community.ulyaoth.net/threads/how-to-install-logstash-on-a-windows-server-with-kibana-in-iis.17/)
The logstash agent takes the .txt file as an input, filters the info into more meaningful fields, then writes these to my Elasticsearch instance.
I can then view these with the Kibana Web UI to get some meaningful info from my logs.
Happy days (works on my machine!)
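For reference, the file → filter → Elasticsearch flow above maps onto a pipeline config roughly like this (the path, grok pattern, and host are illustrative, and the exact syntax depends on your log layout and logstash version):

```
input {
  file {
    path => "C:/logs/logger.txt"
    start_position => "beginning"
  }
}
filter {
  grok {
    # Split the raw log line into searchable fields; adjust to your layout.
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}
```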
My problem arises when trying to move this onto my production environment on Windows Azure.
All the info on the web currently mentions running Linux VMs to host logstash on Azure, but being a Windows person, I'd rather not.
So any guides/advice on getting the logstash agent running on Azure in a fail-safe/automated manner would be much appreciated.
Microsoft has a PowerShell script available that is able to setup ELK in Azure. See https://github.com/mspnp/semantic-logging/tree/elk.
In advance, sorry for the stupid question, but all the search results on Google are off-topic for me.
I want to create a C# application that will run continuously on an Azure VM. It should NOT be event-driven, as it will use various factors (DB monitoring, a time schedule, overall usage) to decide its activity.
Now, should I just create a console app in VS Express 2013 for Desktop and run it over RDP on the VM? Or is there some Azure-specific project type I can use (maybe for better integration with the management portal)? All I can see and find is web-related (a website, a WebJob, a background worker, a Web API), and my app will not be accessed remotely in any way (it will periodically check a DB shared with an ASP.NET website).
It is possible. You should create the equivalent of a Windows service, but for Azure.
There is a useful question for that already on SO: Windows Service to Azure?
It has a reference to a full walkthrough: Migrating a Windows service to Windows Azure.
The corresponding Azure type is "Cloud Services / Worker Role". Worker Roles work just like Windows services.
So you can basically take all the classes from your Windows service (except Service1.cs) and put them in the new Azure project.
Then copy all the start/stop code from your Service1.cs to the corresponding class in your new Cloud Service project.
http://www.windowsazure.com/en-us/documentation/articles/cloud-services-dotnet-multi-tier-app-storage-4-worker-role-a/
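A sketch of what that Worker Role class looks like (this is the standard RoleEntryPoint shape from the Azure SDK; the loop body and sleep interval are placeholders for your own classes):

```csharp
// Sketch of a Worker Role entry point, the Azure counterpart of Service1.cs.
// Requires the Microsoft.WindowsAzure.ServiceRuntime assembly from the Azure SDK.
using System;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    // Plays the part of the Windows service's OnStart.
    public override bool OnStart()
    {
        // One-time initialization copied from Service1.cs goes here.
        return base.OnStart();
    }

    // Replaces the service's long-running work loop.
    public override void Run()
    {
        while (true)
        {
            // Call the worker classes you carried over from the Windows service.
            Thread.Sleep(TimeSpan.FromSeconds(10));
        }
    }
}
```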
I have developed my application with MongoDB and now I'm ready to go live in the cloud.
I have followed the tutorial from the official MongoDB website about how to deploy it in
a Windows Azure Worker Role.
I have tried a local deployment in the Windows Azure emulator, and everything works really fine.
But when I tried to publish it to the cloud service, the result wasn't what I expected.
MongoDB.WindowsAzure.MongoDBRole_IN_X (where X is the instance number) is always in busy status with the message:
Starting role...
UnhandledException:Microsoft.WindowsAzure.StorageClient.CloudDriveException
I have no clue about this.
Does anyone have any suggestions?
Thanks.
PS1. Sorry for my English.
PS2. I'm using the latest version of the Windows Azure SDK.
In that Worker Role setup, MongoDB sets itself up to store the database on a durable drive (basically a VHD mounted in a blob). Since you're getting a CloudDriveException, the first thing I'd look at is the storage connection string for the account used to mount drives. Could it be that your configuration file is still pointing at local dev storage?
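As an illustration, the setting to check in your ServiceConfiguration.Cloud.cscfg looks something like this (the setting name varies with the version of the MongoDB Worker Role sample, so treat the names and values here as placeholders):

```xml
<!-- A cloud deployment must use a real storage account for the CloudDrive;
     "UseDevelopmentStorage=true" only works in the local emulator. -->
<ConfigurationSettings>
  <Setting name="MongoDBDataDir"
           value="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
</ConfigurationSettings>
```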
Can an application recognize whether it runs in the cloud or on a normal server?
If that is possible, I could let the application automatically decide where to store e.g. user pictures: local IO or blob storage.
Check the RoleManager.IsRoleManagerRunning property. This will be true if your app is running under the Azure fabric. Note, however, that this can be the development fabric (the fabric running on your development machine during your dev phase) or the actual Azure cloud fabric.
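A sketch of the storage-backend switch this enables (RoleManager is from the early SDK the answer refers to; in later SDK versions the equivalent check is RoleEnvironment.IsAvailable, and it has the same dev-fabric caveat). The backend names are illustrative:

```csharp
// Decide where to store user pictures based on the runtime environment.
// The bool would come from RoleManager.IsRoleManagerRunning (early SDK)
// or RoleEnvironment.IsAvailable (later SDKs).
static string PickPictureStore(bool runningUnderFabric) =>
    runningUnderFabric
        ? "blob"         // Azure (or dev) fabric: use blob storage
        : "filesystem";  // plain server: use local IO
```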