I have developed my application with MongoDB and now I'm ready to go live in the cloud.
I followed the tutorial from the official MongoDB website on how to deploy it in
a Windows Azure worker role.
I tried a local deployment in the Windows Azure emulator, and everything worked fine.
But when I tried to publish it to a cloud service, the result wasn't what I expected.
MongoDB.WindowsAzure.MongoDBRole_IN_X (where X is the instance number) is always stuck in the Busy status with this message:
Starting role...
UnhandledException:Microsoft.WindowsAzure.StorageClient.CloudDriveException
I have no clue what's going on.
Does anyone have any suggestions?
Thanks.
PS1. Sorry for my English.
PS2. I'm using the latest version of the Windows Azure SDK.
In that worker role setup, MongoDB sets itself up to store the database on a durable drive (basically a VHD mounted in a blob). Since you're getting a CloudDriveException, the first thing I'd look at is the storage connection string for the account used to mount drives. Could it be that your configuration file is still pointing at local development storage?
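For example, the role's ServiceConfiguration.Cloud.cscfg may still carry the emulator value. A sketch of what to look for (the setting name below is illustrative; check the actual names in the MongoDB worker role project):

```xml
<ConfigurationSettings>
  <!-- Emulator-only value that will fail when deployed to a cloud service: -->
  <!-- <Setting name="MongoDBDataDir" value="UseDevelopmentStorage=true" /> -->

  <!-- A cloud deployment needs a real storage account connection string: -->
  <Setting name="MongoDBDataDir"
           value="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey" />
</ConfigurationSettings>
```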
I have a C# console application that runs on an Azure VM. The application runs fine and does what it needs to: it connects to an Azure storage account, downloads a file of MAC addresses, and creates an SNMP trap for each MAC address, which it sends to an Azure Load Balancer. This all works fine.
Is there a way I can call and monitor a C# console application from an ADF pipeline and run it as a serverless application on Azure, perhaps using Azure Functions, Azure Batch, or another Azure service? Can you recommend the best Azure service/technology to use?
The Azure serverless solution will need to connect to an Azure storage account and Key Vault, so permissions are something I need advice on.
NB: this C# application is not a big-data application and does not require massively parallel processing.
Regards
I have found a solution to this problem.
The C# console application can be run via Azure Batch on a node (a node is a VM running in Azure).
Azure Data Factory has a Custom activity that can call Azure Batch, which runs the C# console app.
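As a sketch, the pipeline's Custom activity points at an Azure Batch linked service and names the executable to run. The linked service names, folder path, and executable here are placeholders:

```json
{
  "name": "RunConsoleApp",
  "type": "Custom",
  "linkedServiceName": {
    "referenceName": "AzureBatchLinkedService",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "command": "MyConsoleApp.exe",
    "resourceLinkedService": {
      "referenceName": "BlobStorageLinkedService",
      "type": "LinkedServiceReference"
    },
    "folderPath": "apps/consoleapp"
  }
}
```

The blob folder referenced by `folderPath` holds the application binaries, which Batch copies onto the node before running the command.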
I've managed to get Logstash working as a proof of concept, locally on my Windows setup. Here's an overview of the setup:
My .NET app writes messages to a logger.txt file through NLog.
I have Logstash running as a Windows service after following this guide (https://community.ulyaoth.net/threads/how-to-install-logstash-on-a-windows-server-with-kibana-in-iis.17/)
The Logstash agent takes the .txt file as an input, filters the information into more meaningful fields, then writes these to my Elasticsearch instance.
I can then view these with the Kibana web UI to get some meaningful information from my logs.
Happy days (works on my machine!)
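For reference, the pipeline described above boils down to a Logstash config along these lines. The file path and grok pattern are placeholders and depend on your NLog layout:

```
input {
  file {
    path => "C:/logs/logger.txt"
    start_position => "beginning"
  }
}
filter {
  grok {
    # Adjust to match your NLog layout; this assumes "timestamp level message".
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```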
My problem arrives when trying to move this onto my production environment on Windows Azure.
All the info on the web currently suggests running Linux VMs to host Logstash on Azure, but as a Microsoft-stack developer, I'd rather not.
So any guides/advice on getting the Logstash agent running on Azure in a fail-safe, automated manner would be much appreciated.
Microsoft has a PowerShell script that can set up ELK in Azure. See https://github.com/mspnp/semantic-logging/tree/elk.
I am working with MS Excel files in my web application, which is hosted on Azure.
I never run into the following error when accessing the Excel file on my development fabric, but once I deploy to Azure, I get this error message:
The 'Microsoft.ACE.OleDb.12.0' provider is not registered on the local machine.
I do not want to change my code and cannot use any third-party tools. My questions are:
Is there a way around this issue?
Can I create a VM on Azure, install the OLEDB Driver and upload my site there?
No such support is available; the ACE provider isn't installed in Azure web/worker roles, so the development fabric doesn't match the cloud environment here.
You can do anything with an Azure VM that you can with a local machine :)
Just create one through the portal, log in to it, and install the Access Database Engine from here:
http://www.microsoft.com/en-au/download/details.aspx?id=13255
Based on this post, you may also need to download SQLEXPR_x86_ENU.exe from here and tick "Allow In Process" on the provider.
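Once the Access Database Engine is installed on the VM, a typical ACE connection string for an .xlsx file looks like this (the path and HDR setting are examples; your existing code's connection string should work unchanged):

```
Provider=Microsoft.ACE.OLEDB.12.0;
Data Source=C:\data\workbook.xlsx;
Extended Properties="Excel 12.0 Xml;HDR=YES";
```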
I'm pretty new to the Azure platform. I have tried searching Google for assistance, but unfortunately my search skills aren't the best.
I have a Linux VM in Azure that will occasionally have .wav files on it that need to be copied off.
My plan is to use a Worker Role to access the Linux VM, copy the files off using scp, and then store them in an Azure storage account.
Could anyone give me a few pointers in the right direction on how this could be accomplished?
In your case, there would be no need for a worker role (which is nothing more than a Windows Server VM running in a cloud service). If you did need a worker role instance talking to a Linux instance, you'd have to connect them with a virtual network. I'm guessing you're just starting out, and that solution sounds over-engineered.
Instead: just write directly to blob storage from your Linux-based app. If you're using .NET, Java, PHP, Python, or Ruby, there are already SDKs that handle this for you - go here and scroll down to Developer Centers, download the SDK of your choice, and then look at some of the getting-started tutorials.
Just remember that blob storage is Storage-as-a-Service, accessible from anywhere. Underneath, it's just REST calls, with the language SDKs wrapping those calls.
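To illustrate what the SDKs wrap for you, here's a sketch in Python of building the SharedKey Authorization header for a Put Blob REST call. The account name, key, container, blob name, date, and API version below are all placeholders; in practice you'd let the SDK do this:

```python
# Sketch only: constructs the SharedKey Authorization header for a Put Blob
# request, per the Azure Storage REST authentication scheme. All names and
# values are placeholders.
import base64
import hashlib
import hmac


def sign_put_blob(account, key_b64, container, blob, content_length, x_ms_date):
    """Return the Authorization header value for a Put Blob request."""
    # String-to-sign: verb, twelve standard headers (mostly empty here),
    # canonicalized x-ms-* headers, then the canonicalized resource.
    string_to_sign = (
        "PUT\n"                # HTTP verb
        "\n"                   # Content-Encoding
        "\n"                   # Content-Language
        f"{content_length}\n"  # Content-Length
        "\n"                   # Content-MD5
        "\n"                   # Content-Type
        "\n"                   # Date (empty; x-ms-date is used instead)
        "\n\n\n\n\n"           # If-Modified-Since .. Range
        f"x-ms-blob-type:BlockBlob\nx-ms-date:{x_ms_date}\nx-ms-version:2011-08-18\n"
        f"/{account}/{container}/{blob}"
    )
    key = base64.b64decode(key_b64)  # account keys are base64-encoded
    signature = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode("utf-8")
    return f"SharedKey {account}:{signature}"


header = sign_put_blob(
    account="myaccount",
    key_b64=base64.b64encode(b"0" * 32).decode("utf-8"),  # placeholder key
    container="wavfiles",
    blob="recording.wav",
    content_length=1024,
    x_ms_date="Mon, 01 Oct 2012 00:00:00 GMT",
)
print(header)
```

The header then goes on an HTTPS PUT to `https://myaccount.blob.core.windows.net/wavfiles/recording.wav` along with the same `x-ms-*` headers that were signed.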
There are more examples in the Azure Training Kit.
In Azure Development Storage's UI, there's a Reset button which stops, wipes, and restarts the devstore. Is there a programmatic way of doing this, similar to how I can stop and start the storage using
DevStore.Shutdown();
While I haven't reset the devstore programmatically, I suppose you could shell out to DSInit.exe programmatically:
DSInit /ForceCreate
@david-makogon's response is correct for the version of the SDK current in 2011; however, in later versions of the Azure Storage Emulator, dsinit was replaced with WAStorageEmulator and then with AzureStorageEmulator. (Maybe there was something else in between, but it doesn't matter as long as you use the latest SDK at the time of writing of this answer.)
A good overview of what's used at the moment can be found in Use the Azure Storage Emulator for Development and Testing.
And regarding your question, that would be:
Start Azure Storage Emulator to bring up the storage emulator command-line tool.
Run AzureStorageEmulator init /forceCreate