I've managed to get logstash working as a proof of concept locally on my Windows setup. Here's an overview of the setup:
My .NET app writes messages to a logger.txt file through NLogger.
I have logstash running as a Windows service after following this guide: https://community.ulyaoth.net/threads/how-to-install-logstash-on-a-windows-server-with-kibana-in-iis.17/
The logstash agent takes the .txt file as an input, filters the info into more meaningful fields, then writes these to my Elasticsearch instance (rough pipeline sketch below).
I can then view these with the Kibana Web UI to get some meaningful info from my logs.
Happy days (works on my machine!)
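For anyone following along, the pipeline is just the standard file input, grok filter and elasticsearch output; a minimal sketch, assuming a simple level-plus-message line format (the path, grok pattern and host are placeholders, not my exact config, and option names vary a little between logstash versions):

    input {
      file {
        path => "C:/logs/logger.txt"   # the file NLogger writes to (placeholder path)
        start_position => "beginning"
      }
    }
    filter {
      grok {
        # split each raw line into a log level and the rest of the message;
        # adjust the pattern to whatever layout your logger is configured with
        match => { "message" => "%{LOGLEVEL:level} %{GREEDYDATA:msg}" }
      }
    }
    output {
      elasticsearch {
        host => "localhost"   # 'hosts => ["localhost:9200"]' on newer logstash versions
      }
    }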
My problem arises when trying to move this onto my production environment on Windows Azure.
All the info on the web currently mentions running Linux VMs to host logstash on Azure, but being a M$ monkey, I'd rather not.
So any guides/advice on getting the logstash agent running on Azure in a fail-safe/automated manner would be much appreciated.
Microsoft has a PowerShell script available that can set up ELK in Azure. See https://github.com/mspnp/semantic-logging/tree/elk.
I have a C# console application that runs on an Azure VM - the application runs fine and does what it needs to. The application connects to an Azure storage account, downloads a file of MAC addresses, and creates an SNMP trap for each MAC address, which it sends to an Azure Load Balancer - this all works fine.
Is there a way I can call and monitor a C# console application from an ADF pipeline and run it as a serverless application on Azure - maybe using Azure Functions, Azure Batch, or some other Azure service? Can you recommend the best Azure service/technology to use?
The Azure serverless solution will need to be able to connect to an Azure storage account and Key Vault, so permissions are also something I need advice on.
NB: this C# application is not a big data application and does not require massive parallel processing.
Regards
I have found a solution to this problem.
The C# console application can be run via Azure Batch on a node (a VM running in Azure).
Azure Data Factory has a Custom activity that can call Azure Batch, which runs the C# console app; a rough sketch of such an activity follows.
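A minimal sketch of the pipeline activity, assuming ADF v2 JSON (all names, the command and the blob folder holding the app binaries are placeholders):

    {
      "name": "RunConsoleApp",
      "type": "Custom",
      "linkedServiceName": {
        "referenceName": "AzureBatchLinkedService",
        "type": "LinkedServiceReference"
      },
      "typeProperties": {
        "command": "cmd /c MyConsoleApp.exe",
        "resourceLinkedService": {
          "referenceName": "StorageLinkedService",
          "type": "LinkedServiceReference"
        },
        "folderPath": "customactivity/MyConsoleApp"
      }
    }

Data Factory copies everything under folderPath onto the Batch node and runs the command there, so the app can keep its existing storage/Key Vault access (ideally via a managed identity on the Batch pool rather than keys baked into config).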
I am looking for a solution to send the application logs generated on IoT Edge devices to an Azure Log Analytics workspace.
I have tried using the Microsoft Monitoring Agent, with which I was able to send logs generated by running Docker containers. However, on an edge device we use the Moby engine instead of the Docker daemon, because of which the Monitoring Agent does not collect the log records (I followed this setup to run with Docker: https://learn.microsoft.com/en-us/azure/azure-monitor/insights/containers#install-and-configure-windows-container-hosts). Moreover, since I am running my edge environment on Windows, I didn't find any Monitoring Agent container image targeted at Windows (one exists for Linux: https://hub.docker.com/r/microsoft/oms/).
I am looking for a completely automated way of streaming application logs, generated on the edge device, to an Azure Log Analytics workspace.
There is no built-in way as of today (it might be worth checking with the team on GitHub, as they might have this on the roadmap).
However, you can build your own solution using the new log-pull feature:
Write a small timer-triggered Azure Function that pulls the logs every few minutes for the containers you are interested in (or all containers); the logs get written to a storage account (see the sketch after these two steps).
A second blob-triggered Function picks up the uploaded logs and sends them into Log Analytics.
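A minimal sketch of the first Function, assuming the experimental UploadModuleLogs direct method on $edgeAgent (the method name, payload schema and availability depend on your Edge version; the device ID, connection string setting and SAS URL are placeholders):

    using System;
    using System.Threading.Tasks;
    using Microsoft.Azure.Devices;
    using Microsoft.Azure.WebJobs;

    public static class PullEdgeLogs
    {
        // Every 5 minutes, ask the edgeAgent to upload recent module logs to blob storage.
        [FunctionName("PullEdgeLogs")]
        public static async Task Run([TimerTrigger("0 */5 * * * *")] TimerInfo timer)
        {
            var serviceClient = ServiceClient.CreateFromConnectionString(
                Environment.GetEnvironmentVariable("IotHubConnectionString"));

            var method = new CloudToDeviceMethod("UploadModuleLogs");
            // Payload schema as per the log-pull docs for your Edge release (assumed here):
            // upload the last 1000 lines of every module's log to the given container.
            method.SetPayloadJson(@"{
                ""schemaVersion"": ""1.0"",
                ""sasUrl"": ""https://mystorage.blob.core.windows.net/edgelogs?<sas-token>"",
                ""items"": [ { ""id"": "".*"", ""filter"": { ""tail"": 1000 } } ],
                ""encoding"": ""none"",
                ""contentType"": ""text""
            }");

            await serviceClient.InvokeDeviceMethodAsync("my-edge-device", "$edgeAgent", method);
        }
    }

The blob-triggered Function then only has to parse the uploaded files and post them to the Log Analytics HTTP Data Collector API.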
Edit: there is a very new feature (still in release candidate for Edge 1.0.9) that might be exactly what you are looking for: https://github.com/veyalla/ehm
There are two ways to publish a website to Azure - the simple Publish feature vs. deploying as a Cloud Service. I also have a worker role in the solution, so I selected Cloud Service instead of the simple website Publish feature.
But I'm very disappointed with Cloud Services. First of all, deploying as a cloud service takes about 10 times longer than a simple website Publish. The second problem: every time I want to deploy, I have to change the connection strings in web.config to SQL Azure (instead of my local SQL Server). Website Publish has the ability to set the necessary SQL connection strings for the deploy. Maybe I'm doing something wrong, and the deploy could take 10 seconds and offer a way to set different connection strings (like website Publish)?
I'm thinking about putting only the worker role in a cloud service and deploying the website as a website, without the cloud service...
First, I would highly recommend that you go through this question comparing Azure Websites and Cloud Services: What is the difference between an Azure Web Site and an Azure Web Role
Now coming on to your questions:
First of all, deploying as a cloud service takes about 10 times longer than a simple website Publish.
It is bound to happen, because when you deploy a cloud service (say, through Visual Studio), the following things happen that cause the delay:
As part of the build process for cloud services, Visual Studio creates a package file and uploads it to blob storage. This package is then used to create the cloud service.
The Azure Fabric Controller, which is responsible for managing the life cycle of a cloud service, creates a brand-new virtual machine for you, installs the necessary software (IIS, for example) and then deploys your code from the package file.
Neither of these things happens with websites.
The second problem: every time I want to deploy, I have to change the connection strings in web.config to SQL Azure (instead of my local SQL Server). Website Publish has the ability to set the necessary SQL connection strings for the deploy. Maybe I'm doing something wrong, and the deploy could take 10 seconds and offer a way to set different connection strings (like website Publish)?
You're not doing anything wrong per se. Your web.config file gets bundled into the package file, so after any change you make to your web.config file you would need to recreate the package and update the deployment (which includes uploading it to blob storage).
One possible solution for your problem would be to use config transformations and have your Web.Release.config file contain the connection string for your production database. When you build your project in Release mode, you will have the correct connection string in your web.config file.
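For example, a Web.Release.config along these lines (the connection string name, server and credentials are placeholders) swaps the value in at build time:

    <?xml version="1.0"?>
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <connectionStrings>
        <!-- Replaces the local SQL Server connection string on Release builds -->
        <add name="DefaultConnection"
             connectionString="Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=me@myserver;Password=...;Encrypt=True;"
             xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
      </connectionStrings>
    </configuration>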
I'm thinking about putting only the worker role in a cloud service and deploying the website as a website, without the cloud service...
This is certainly a viable option. Another alternative would be to look into WebJobs. Like worker roles, they are meant for handling background-processing workloads, but they have the same deployment convenience as a website (a minimal sketch is below). You may also find this blog post useful: http://www.hanselman.com/blog/IntroducingWindowsAzureWebJobs.aspx
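To give a flavor, a WebJob built with the WebJobs SDK is just a console app hosting a JobHost; a minimal sketch (the "workitems" queue is hypothetical, and the storage connection strings come from the standard AzureWebJobsStorage/AzureWebJobsDashboard settings):

    using System.IO;
    using Microsoft.Azure.WebJobs;

    class Program
    {
        static void Main()
        {
            // Discovers the functions below and listens for their triggers
            var host = new JobHost();
            host.RunAndBlock();
        }
    }

    public class Functions
    {
        // Invoked automatically whenever a message appears on the "workitems" queue
        public static void ProcessQueueMessage([QueueTrigger("workitems")] string message, TextWriter log)
        {
            log.WriteLine("Processed: " + message);
        }
    }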
I am trying, without complete success, to set up a WSO2 Identity Server on a Windows Server 2008 VM. I followed the online instructions and installed the prerequisites (JRE and JDK), downloaded the zip file, set up the environment variables, and ran the wso2server.bat file. There were a lot of errors.
I then realized I needed to add the Active Directory role on the server, which I did. It still wouldn't install. I did some more online research, which led me to believe I needed to install WSO2 ESB as well, which I did. I was able to get that install to work and was able to create a desktop app to consume the web services. All well and good. But then, as part of the requirements, they wanted to enable passive STS with an ASP.NET client (I'm not really sure what that is), so I went back to the Identity Server and am still getting errors while running the bat file, though I am able to reach the GUI from the browser, but unable to log in.
The exception I am getting is:
    TID: [0] [IS] [2013-10-07 10:34:58,746] ERROR {org.wso2.carbon.event.core.internal.builder.EventBrokerHandler} - Can not create the event broker {org.wso2.carbon.event.core.internal.builder.EventBrokerHandler} org.wso2.carbon.event.core.exception.EventBrokerConfigurationException: Can not access the user registry
In addition, I am also getting authentication errors on some of the ESB web calls, but not all. I would also like to know how to change from the default user store (LDAP or AD or whatever it is) to SQL Server.
I have seen a few examples of doing it with MySQL and Oracle, but not SQL Server; product-specific information is sometimes challenging to find.
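Based on the MySQL examples, I assume the SQL Server equivalent in repository/conf/user-mgt.xml would be something like the following (the URL, credentials and exact property names are my guesses for this IS version, and presumably the Microsoft JDBC driver jar has to be dropped into repository/components/lib):

    <UserStoreManager class="org.wso2.carbon.user.core.jdbc.JDBCUserStoreManager">
        <Property name="driverName">com.microsoft.sqlserver.jdbc.SQLServerDriver</Property>
        <Property name="url">jdbc:sqlserver://localhost:1433;databaseName=WSO2CARBON_DB</Property>
        <Property name="userName">wso2user</Property>
        <Property name="password">wso2password</Property>
        <Property name="ReadOnly">false</Property>
    </UserStoreManager>

Is that roughly right, and are the schema scripts shipped under dbscripts/ usable as-is for SQL Server?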
Any help would be greatly appreciated.
Thanks. Mike
I have developed my application with MongoDB and now I'm ready to go live in the cloud. I followed the tutorial from the official MongoDB website about how to deploy it in a Windows Azure worker role. I tried a local deployment in the Windows Azure emulator and everything worked really well. But when I published it to the cloud service, the result wasn't what I expected.
MongoDB.WindowsAzure.MongoDBRole_IN_X (where X is the instance number) is always in the Busy status with the message:
    Starting role...
    UnhandledException:Microsoft.WindowsAzure.StorageClient.CloudDriveException
I have no clue about this.
Does anyone have any suggestions?
Thanks.
PS1. Sorry for my English.
PS2. I have used the latest version of the Windows Azure SDK.
In that worker role setup, MongoDB sets itself up to store the database on a durable drive (basically a VHD mounted in a blob). Since you're getting a CloudDriveException, the first thing I'd look at is the storage connection string for the account being used to mount drives. Could it be that your configuration file is still pointing at local dev storage?
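That is, in ServiceConfiguration.Cloud.cscfg, the storage setting the MongoDB role uses for the cloud drive (the setting name below is illustrative; use whatever the tutorial's project calls it) should point at a real account, not the emulator:

    <!-- Works only in the local compute/storage emulator -->
    <Setting name="MongoDBDataDir" value="UseDevelopmentStorage=true" />

    <!-- What a cloud deployment needs instead -->
    <Setting name="MongoDBDataDir"
             value="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=..." />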