Creating Azure VM via C# Throws Error While Creating Resource Group - c#

I'm attempting to create a VM programmatically, following an example in a book. Before running the program I created an Azure AD application and service principal via the portal (https://learn.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal). (On a side note, maybe someone can explain why this is needed here, when one can create a VM straight away in the portal without creating a service principal/AD app.)
While running the app, I'm able to successfully create the management client. The next step is to create the resource group, and that's where it fails with a "System.Net.Http.HttpRequestException: 'No such host is known.'" error. Please advise as to what the problem could be. Thank you.
//Create the management client. This will be used for all the operations we will perform in Azure.
var credentials = SdkContext.AzureCredentialsFactory.FromFile("../../../azureauth.properties");
var azure = Azure.Configure().WithLogLevel(HttpLoggingDelegatingHandler.Level.Basic).Authenticate(credentials).WithDefaultSubscription();
//Create a resource group
var groupName = "az204-ResourceGroup";
var vmName = "az204VMTesting";
var location = Region.USEast;
var vNetName = "az204VNET";
var vNetAddress = "172.16.0.0/16";
var subnetName = "az204Subnet";
var subnetAddress = "172.16.0.0/24";
var nicName = "az204NIC";
var adminUser = "azureadminuser";
var adminPassword = "Pa$$w0rd!2019";
Console.WriteLine($"Creating resource group {groupName} ... ");
//Below fails with 'No such host is known'
var resourceGroup = azure.ResourceGroups.Define(groupName).WithRegion(location).Create();
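For reference, the FromFile overload expects an auth file in the Fluent SDK properties format, roughly along these lines (all values below are placeholders):
subscription=<subscription-id>
client=<client-id>
key=<client-secret>
tenant=<tenant-id>
managementURI=https://management.core.windows.net/
baseURL=https://management.azure.com/
authURL=https://login.windows.net/
graphURL=https://graph.windows.net/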

I tried this code on my system. Try with this code:
using Microsoft.Azure.Management.Compute.Fluent.Models;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Core;

namespace AzureVirtualMachine
{
    class Program
    {
        static void Main(string[] args)
        {
            var credentials = SdkContext.AzureCredentialsFactory
                .FromServicePrincipal("clientId", "clientSecret", "tenantId", AzureEnvironment.AzureGlobalCloud);
            var azure = Azure
                .Configure()
                .WithLogLevel(HttpLoggingDelegatingHandler.Level.Basic)
                .Authenticate(credentials)
                .WithSubscription("SubscriptionID");

            var groupName = "sampleResourceGroup";
            var vmName = "VMWithCSharp";
            var location = Region.EuropeWest;

            var resourceGroup = azure.ResourceGroups.Define(groupName)
                .WithRegion(location)
                .Create();

            var network = azure.Networks.Define("sampleVirtualNetwork")
                .WithRegion(location)
                .WithExistingResourceGroup(groupName)
                .WithAddressSpace("10.0.0.0/16")
                .WithSubnet("sampleSubNet", "10.0.0.0/24")
                .Create();

            var publicIPAddress = azure.PublicIPAddresses.Define("samplePublicIP")
                .WithRegion(location)
                .WithExistingResourceGroup(groupName)
                .WithDynamicIP()
                .Create();

            var networkInterface = azure.NetworkInterfaces.Define("sampleNetWorkInterface")
                .WithRegion(location)
                .WithExistingResourceGroup(groupName)
                .WithExistingPrimaryNetwork(network)
                .WithSubnet("sampleSubNet")
                .WithPrimaryPrivateIPAddressDynamic()
                .WithExistingPrimaryPublicIPAddress(publicIPAddress)
                .Create();

            var availabilitySet = azure.AvailabilitySets.Define("sampleAvailabilitySet")
                .WithRegion(location)
                .WithExistingResourceGroup(groupName)
                .WithSku(AvailabilitySetSkuTypes.Aligned)
                .Create();

            azure.VirtualMachines.Define(vmName)
                .WithRegion(location)
                .WithExistingResourceGroup(groupName)
                .WithExistingPrimaryNetworkInterface(networkInterface)
                .WithLatestWindowsImage("MicrosoftWindowsServer", "WindowsServer", "2012-R2-Datacenter")
                .WithAdminUsername("sampleUser")
                .WithAdminPassword("Sample123467")
                .WithComputerName(vmName)
                .WithExistingAvailabilitySet(availabilitySet)
                .WithSize(VirtualMachineSizeTypes.StandardB1s)
                .Create();
        }
    }
}
Output: (screenshot of the successful run omitted)

I resolved the issue by replacing AzureCredentialsFactory.FromFile with AzureCredentialsFactory.FromServicePrincipal. Thanks ShrutiJoshi-MT for the input. I simply created a JSON file with the necessary credentials.
I still had some issues related to authorization. It turned out I hadn't given the app registration (service principal) the appropriate authorization level. This post helped resolve that issue: The client with object id does not have authorization to perform action 'Microsoft.DataFactory/datafactories/datapipelines/read' over scope.
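For reference, a minimal sketch of the JSON file and the AuthItem class it deserializes into (the property names here are simply the ones referenced in the code below; adjust them to whatever your file actually uses):
// azureauth.json (placeholder values):
// {
//     "ClientId": "<client-id>",
//     "SecretValue": "<client-secret>",
//     "TenantId": "<tenant-id>",
//     "Subscription": "<subscription-id>"
// }

// Plain POCO that System.Text.Json's JsonSerializer can bind the file to.
public class AuthItem
{
    public string ClientId { get; set; }
    public string SecretValue { get; set; }
    public string TenantId { get; set; }
    public string Subscription { get; set; }
}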
Final code:
string jsonString = File.ReadAllText("../../../azureauth.json");
AuthItem authItem = JsonSerializer.Deserialize<AuthItem>(jsonString);
var credentials = SdkContext.AzureCredentialsFactory
    .FromServicePrincipal(authItem.ClientId, authItem.SecretValue, authItem.TenantId, AzureEnvironment.AzureGlobalCloud);

//Create the management client. This will be used for all the operations we will perform in Azure.
var azure = Azure.Configure().WithLogLevel(HttpLoggingDelegatingHandler.Level.Basic).Authenticate(credentials).WithSubscription(authItem.Subscription);

//Create a resource group
var groupName = "az204-ResourceGroup";
var vmName = "az204VMTesting";
var location = Region.USEast;
var vNetName = "az204VNET";
var vNetAddress = "172.16.0.0/16";
var subnetName = "az204Subnet";
var subnetAddress = "172.16.0.0/24";
var nicName = "az204NIC";
var adminUser = "azureadminuser";
var adminPassword = "Pa$$w0rd!2019";

Console.WriteLine($"Creating resource group {groupName} ... ");
var resourceGroup = azure.ResourceGroups.Define(groupName).WithRegion(location).Create();

//Every VM needs to be connected to a virtual network
Console.WriteLine($"Creating virtual network {vNetName} ...");
var network = azure.Networks.Define(vNetName)
    .WithRegion(location)
    .WithExistingResourceGroup(groupName)
    .WithAddressSpace(vNetAddress)
    .WithSubnet(subnetName, subnetAddress)
    .Create();

//Any VM needs a network interface for connecting to the virtual network
Console.WriteLine($"Creating network interface {nicName} ... ");
var nic = azure.NetworkInterfaces.Define(nicName)
    .WithRegion(location)
    .WithExistingResourceGroup(groupName)
    .WithExistingPrimaryNetwork(network)
    .WithSubnet(subnetName)
    .WithPrimaryPrivateIPAddressDynamic()
    .Create();

//Create the VM
Console.WriteLine($"Creating VM {vmName} ... ");
azure.VirtualMachines.Define(vmName)
    .WithRegion(location)
    .WithExistingResourceGroup(groupName)
    .WithExistingPrimaryNetworkInterface(nic)
    .WithLatestWindowsImage("MicrosoftWindowsServer", "WindowsServer", "2012-R2-Datacenter")
    .WithAdminUsername(adminUser)
    .WithAdminPassword(adminPassword)
    .WithComputerName(vmName)
    .WithSize(VirtualMachineSizeTypes.StandardDS2V2)
    .Create();

Related

Salesforce Pub/Sub client in .NET

I would like to subscribe to a Salesforce platform event. I am trying to create a client in C#/.NET for the Salesforce Pub/Sub API. There are examples in other languages but not in .NET: https://github.com/developerforce/pub-sub-api
I am using the Grpc.Net.Client NuGet package.
var topicName = "/event/SomeEvent__e";
var pubSubEndpoint = "https://api.pubsub.salesforce.com:7443";
var accessToken = "xxxx";
var organisationId = "xxxx";
var instanceUrl = "https://xxxxx.sandbox.my.salesforce.com";
var credentials = CallCredentials.FromInterceptor((c, m) =>
{
    m.Add("accesstoken", accessToken);
    m.Add("instanceurl", instanceUrl);
    m.Add("tenantid", organisationId);
    return Task.CompletedTask;
});
var options = new GrpcChannelOptions
{
    Credentials = ChannelCredentials.Create(new SslCredentials(), credentials)
};
var channel = GrpcChannel.ForAddress(pubSubEndpoint, options);
var client = new PubSub.PubSubClient(channel);
var topicRequest = new TopicRequest()
{
    TopicName = topicName
};
var topic = client.GetTopic(topicRequest);
I know my credentials are correct because I can use Postman to hit the OAuth2 endpoint and get a valid access token. But when I try to call a client method like client.GetTopic, I get the following error.
Status(StatusCode="PermissionDenied", Detail="An error occurred while getting the metadata for org CORE/prod/00DN0000000c8Hk and topic /event/SomeEvent__e. Ensure the credentials and topic name are correct. rpcId: 21d854fb-17dc-4778-9524-6264bd1a920d")
Am I setting up the credentials object wrong? I cannot find any example of subscribing to a Salesforce Pub/Sub in .NET.

Creating snapshot and export in new Azure SDK .NET - Azure.ResourceManager

Because the old Azure SDK for .NET is deprecated, I'm trying to migrate to the new version. I've been stuck finding substitutes for the old methods and properties in the new SDK. We take a snapshot of an existing database and export it to a Storage Account.
Snippet of the old approach:
var sp = new ServicePrincipalLoginInformation()
{
    ClientId = clientId,
    ClientSecret = clientSecret
};
var credentials = new AzureCredentials(sp, tenantId, AzureEnvironment.AzureGlobalCloud);
var azureClient = Authenticate(credentials).WithSubscription(subscriptionId);

var sqlServer = await azureClient.SqlServers.GetByIdAsync(db.SourceServerId);
var serverDbs = await sqlServer.Databases.ListAsync();
var snapshotDb = serverDbs.FirstOrDefault(i => i.Name == snapshotDbName);
if (snapshotDb is not null)
    return;

snapshotDb = await azureClient.SqlServers.Databases
    .Define(snapshotDbName)
    .WithExistingSqlServer(sqlServer)
    .WithSourceDatabase(sourceDatabaseId)
    .WithMode(CreateMode.Copy)
    .CreateAsync(cancellationToken);

...

var storageAccount = azureClient.StorageAccounts.GetByIdAsync(storageId);
await snapshotDb.ExportTo(storageAccount, storageContainer, outputFileName)
    .WithSqlAdministratorLoginAndPassword(user, password)
    .ExecuteAsync(cancellationToken);
According to documentation, I was able to get this:
var sp = new ClientSecretCredential(tenantId, clientId, clientSecret);
var azureClient = new ArmClient(sp, subscriptionId);
var ri = new ResourceIdentifier(NOT SURE WHAT SHOULD BE HERE);
var resGroup = azure.GetResourceGroupResource(ri);
var sqlServerResponse = await resGroup.GetSqlServers().GetAsync(sourceServerId);
var sqlServer = sqlServers.Value;
var serverDBs = sqlServer.GetSqlDatabases();
var snapshotDB = serverDBs.FirstOrDefault(x => x.Data.Name == db.SnapshotDbName);
What are the substitute calls for creating the snapshot and exporting to the Storage Account, based on the parameters used in the deprecated sample? Or am I missing some package?
We have general guidance for using the latest version of the .NET SDK for resource management.
Regarding your issue, you can refer to the code below:
var resourceGroup = _client.GetDefaultSubscription().GetResourceGroup(resourceGroupName).Value;
var sqlServer = resourceGroup.GetSqlServer("mySqlServerName").Value;
var sqlDB = sqlServer.GetSqlDatabase("myDbName").Value;
var exportResult = sqlDB.Export(Azure.WaitUntil.Completed,
    new Azure.ResourceManager.Sql.Models.DatabaseExportDefinition("storageKeyType", "storageKey", new Uri("storageUri"), "adminLogin", "adminLoginPWD")).Value;
The _client here is an ArmClient object.
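A minimal construction sketch, assuming the same clientId/clientSecret/tenantId/subscriptionId values from your snippet:
using Azure.Identity;
using Azure.ResourceManager;

var credential = new ClientSecretCredential(tenantId, clientId, clientSecret);
var _client = new ArmClient(credential, subscriptionId); // subscriptionId becomes the default subscription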
Your line var ri = new ResourceIdentifier(NOT SURE WHAT SHOULD BE HERE); is not necessary; may I know why you want to create a resource identifier here?
Please make sure you are using version 1.1.0 of the Azure SDK SQL library for .NET (Azure.ResourceManager.Sql).
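For the snapshot (copy) half of your old code, a rough sketch under the assumption that the 1.x Azure.ResourceManager.Sql object model exposes SqlDatabaseData with CreateMode and SourceDatabaseId (please verify the exact type and property names against your package version):
using Azure;
using Azure.Core;
using Azure.ResourceManager;
using Azure.ResourceManager.Sql;
using Azure.ResourceManager.Sql.Models;

// sqlServer is the SqlServerResource retrieved above; run inside an async method.
SqlDatabaseCollection databases = sqlServer.GetSqlDatabases();
var snapshotData = new SqlDatabaseData(sqlServer.Data.Location)
{
    CreateMode = SqlDatabaseCreateMode.Copy,                    // assumed enum value for a database copy
    SourceDatabaseId = new ResourceIdentifier(sourceDatabaseId) // resource ID of the source database
};
ArmOperation<SqlDatabaseResource> operation =
    await databases.CreateOrUpdateAsync(WaitUntil.Completed, snapshotDbName, snapshotData);
SqlDatabaseResource snapshotDb = operation.Value;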
We are open to any feedback regarding our new SDK. Feel free to let us know your thoughts in this survey.

AADSTS500011 - PowerBi C# .NET (resource x not found in tenant y)

I am trying to integrate Power BI Embedded with C#. I always get the same error, which I've included below, along with the package versions and the (basic) code that is supposed to do the work.
Thank you for all your answers
Microsoft.PowerBI.Api (v2.0.12)
Microsoft.PowerBI.JavaScript (v2.5.1)
Microsoft.IdentityModel.Clients.ActiveDirectory (v3.13.9)
Note that the two variables at the top of the method are temporary.
The error always comes from this line: var authenticationResult = await authenticationContext.AcquireTokenAsync(this.resourceUrl, this.applicationId, credential);
Here is the error message: "exceptionMessage": "AADSTS500011: The resource principal named https://analysis.windows.net/powerbi/api/ was not found in the tenant named x. This can happen if the application has not been installed by the administrator of the tenant or consented to by any user in the tenant. You might have sent your authentication request to the wrong tenant.
public async Task<EmbedConfigResource> EmbedReport([FromUri]string username, [FromUri]string roles)
{
    roles = "None";
    username = this.pbiUsername;

    var result = new EmbedConfigResource { Username = username, Roles = roles };
    var credential = new UserPasswordCredential(this.pbiUsername, this.pbiPassword);
    var authenticationContext = new AuthenticationContext(this.authorityUrl);
    var authenticationResult = await authenticationContext.AcquireTokenAsync(this.resourceUrl, this.applicationId, credential);
    var tokenCredentials = new TokenCredentials(authenticationResult.AccessToken, "Bearer");

    using (var client = new PowerBIClient(new Uri(this.apiUrl), tokenCredentials))
    {
        var reports = await client.Reports.GetReportsInGroupAsync(this.workspaceId);
        Report report = reports.Value.FirstOrDefault(r => r.Id == this.reportId);
        var datasets = await client.Datasets.GetDatasetByIdInGroupAsync(this.workspaceId, report.DatasetId);
        result.IsEffectiveIdentityRequired = datasets.IsEffectiveIdentityRequired;
        result.IsEffectiveIdentityRolesRequired = datasets.IsEffectiveIdentityRolesRequired;

        GenerateTokenRequest generateTokenRequestParameters;
        var rls = new EffectiveIdentity(this.pbiUsername, new List<string> { report.DatasetId });
        if (!string.IsNullOrWhiteSpace(roles))
        {
            var rolesList = new List<string>();
            rolesList.AddRange(roles.Split(','));
            rls.Roles = rolesList;
        }
        generateTokenRequestParameters = new GenerateTokenRequest(accessLevel: "view", identities: new List<EffectiveIdentity> { rls });

        var tokenResponse = await client.Reports.GenerateTokenInGroupAsync(this.workspaceId, report.Id, generateTokenRequestParameters);
        result.EmbedToken = tokenResponse;
        result.EmbedUrl = report.EmbedUrl;
        result.Id = report.Id;
        return result;
    }
}
You must log into the Azure portal, go to Azure Active Directory -> App registrations, select your app, click View API permissions, and then grant admin consent by clicking the button at the bottom.
If you don't have access to the portal, or the button is disabled, you must ask your admin to do it for you.

Azure - Programmatically Create Storage Account

I have tried the following code to create a new storage account in Azure:
Getting the token (success - I received a token):
var cc = new ClientCredential("clientId", "clientSecret");
var context = new AuthenticationContext("https://login.windows.net/subscription");
var result = context.AcquireTokenAsync("https://management.azure.com/", cc);
Create cloud storage credentials:
var credential = new TokenCloudCredentials("subscription", token);
Create the cloud storage account (fails):
using (var storageClient = new StorageManagementClient(credentials))
{
    await storageClient.StorageAccounts.CreateAsync(new StorageAccountCreateParameters
    {
        Label = "samplestorageaccount",
        Location = LocationNames.NorthEurope,
        Name = "myteststorage",
        AccountType = "RA-GRS"
    });
}
Error:
ForbiddenError: The server failed to authenticate the request. Verify that the certificate is valid and is associated with this subscription.
I am not sure if this is one of those misleading messages or if I misconfigured something in Azure?
As far as I know, Azure provides two types of storage management libraries now:
Microsoft.Azure.Management.Storage
Microsoft.WindowsAzure.Management.Storage
Microsoft.Azure.Management.Storage is used to create new ARM storage accounts.
Microsoft.WindowsAzure.Management.Storage is used to create classic storage accounts.
I guess you want to create a new ARM storage account but used the Microsoft.WindowsAzure.Management.Storage library. Since Microsoft.WindowsAzure.Management.Storage authenticates requests with a certificate, you get that error. If you want to know how to use Microsoft.WindowsAzure.Management.Storage to create classic storage, I suggest you refer to this article.
Since I assume you want to create new ARM storage, I suggest you install the Microsoft.Azure.Management.Storage NuGet package.
For more details, you could refer to the following code.
static void Main(string[] args)
{
    var subscriptionId = "your subscriptionId";
    var clientId = "your client id";
    var tenantId = "your tenantid";
    var secretKey = "secretKey";

    StorageManagementClient StorageManagement = new StorageManagementClient(
        new Microsoft.Azure.TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
    var re = StorageManagement.StorageAccounts.CreateAsync("groupname", "storage name",
        new Microsoft.Azure.Management.Storage.Models.StorageAccountCreateParameters()
        {
            Location = LocationNames.NorthEurope,
            AccountType = Microsoft.Azure.Management.Storage.Models.AccountType.PremiumLRS
        }, new CancellationToken() { }).Result;
    Console.ReadKey();
}

static string GetAccessToken(string tenantId, string clientId, string secretKey)
{
    var authenticationContext = new AuthenticationContext($"https://login.windows.net/{tenantId}");
    var credential = new ClientCredential(clientId, secretKey);
    var result = authenticationContext.AcquireTokenAsync("https://management.core.windows.net/", credential);
    if (result == null)
    {
        throw new InvalidOperationException("Failed to obtain the JWT token");
    }
    var token = result.Result.AccessToken;
    return token;
}

How to export SQL Database directly to blob storage programmatically

I need to programmatically backup/export a SQL Database (either in Azure, or a compatible-one on-prem) to Azure Storage, and restore it to another SQL Database. I would like to use only NuGet packages for code dependencies, since I cannot guarantee that either the build or production servers will have the Azure SDK installed. I cannot find any code examples for something that I assume would be a common action. The closest I found was this:
https://blog.hompus.nl/2013/03/13/backup-your-azure-sql-database-to-blob-storage-using-code/
But, this code exports to a local bacpac file (requiring RoleEnvironment, an SDK-only object). I would think there should be a way to directly export to Blob Storage, without the intermediary file. One thought was to create a Stream, and then run:
services.ExportBacpac(stream, "dbnameToBackup")
And then write the stream to storage; however a Memory Stream wouldn't work--this could be a massive database (100-200 GB).
What would be a better way to do this?
Based on my test, the Microsoft Azure SQL Management Library 0.51.0-prerelease supports exporting the SQL database .bacpac file directly to Azure Storage.
We can use sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase, exportRequestParameters) to export the .bacpac file to Azure Storage.
However, ImportExport is not present in the latest version of the Microsoft Azure SQL Management Library SDK, so we can only use the 0.51.0-prerelease SDK.
For more details about how to use the Microsoft Azure SQL Management Library to export the SQL backup to Azure Blob Storage, refer to the steps and code below.
Prerequisites:
Register an app in Azure AD and create a service principal for it. For more detailed steps about how to register the app and get an access token, please refer to the documentation.
Detailed code:
Notice: Replace clientId, tenantId, secretKey, and subscriptionId with your registered Azure AD information. Replace azureSqlDatabase, resourceGroup, azureSqlServer, adminLogin, adminPassword, storageKey, and storageAccount with your own SQL database and storage values.
static void Main(string[] args)
{
    var subscriptionId = "xxxxxxxx";
    var clientId = "xxxxxxxxx";
    var tenantId = "xxxxxxxx";
    var secretKey = "xxxxx";
    var azureSqlDatabase = "data base name";
    var resourceGroup = "Resource Group name";
    var azureSqlServer = "xxxxxxx"; //testsqlserver
    var adminLogin = "user";
    var adminPassword = "password";
    var storageKey = "storage key";
    var storageAccount = "storage account";
    var baseStorageUri = $"https://{storageAccount}.blob.core.windows.net/brandotest/"; // includes the container name, ending with "/"
    var backName = azureSqlDatabase + "-" + $"{DateTime.UtcNow:yyyyMMddHHmm}" + ".bacpac"; // backup .bacpac file name
    var backupUrl = baseStorageUri + backName;

    ImportExportOperationStatusResponse exportStatus = new ImportExportOperationStatusResponse();
    try
    {
        ExportRequestParameters exportRequestParameters = new ExportRequestParameters
        {
            AdministratorLogin = adminLogin,
            AdministratorLoginPassword = adminPassword,
            StorageKey = storageKey,
            StorageKeyType = "StorageAccessKey",
            StorageUri = new Uri(backupUrl)
        };

        SqlManagementClient sqlManagementClient = new SqlManagementClient(
            new Microsoft.Azure.TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
        var export = sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase,
            exportRequestParameters); // start the export operation

        while (exportStatus.Status != Microsoft.Azure.OperationStatus.Succeeded) // poll until the operation succeeds
        {
            Thread.Sleep(1000 * 60);
            exportStatus = sqlManagementClient.ImportExport.GetImportExportOperationStatus(export.OperationStatusLink);
        }
        Console.WriteLine($"Export DataBase {azureSqlDatabase} to Storage {storageAccount} Successfully");
    }
    catch (Exception exception)
    {
        //todo
    }
}

private static string GetAccessToken(string tenantId, string clientId, string secretKey)
{
    var authenticationContext = new AuthenticationContext($"https://login.windows.net/{tenantId}");
    var credential = new ClientCredential(clientId, secretKey);
    var result = authenticationContext.AcquireTokenAsync("https://management.core.windows.net/", credential);
    if (result == null)
    {
        throw new InvalidOperationException("Failed to obtain the JWT token");
    }
    var token = result.Result.AccessToken;
    return token;
}
The result looks like this:
1. Send the request to tell the SQL server to start exporting to Azure Blob Storage.
2. Keep sending requests to monitor the export operation status.
3. The export operation finishes.
Here's an idea:
Pass the stream to the .ExportBacPac method but hold a reference to it on a different thread where you regularly empty and reset the stream so that there's no memory overflow. I'm assuming here that Dac does not have any means to access the stream while it is being filled.
The thing you have to take care of yourself, though, is thread safety: MemoryStreams are not thread safe by default. So you'd have to write your own locking mechanisms around .Position and .CopyTo. I've not tested this, but if you handle locking correctly, I'd assume the .ExportBacPac method won't throw any errors while the other thread accesses the stream.
Here's a very simple example as pseudo-code just outlining my idea:
ThreadSafeStream stream = new ThreadSafeStream();
Task task = new Task(async (exitToken) =>
{
    MemoryStream partialStream = new MemoryStream();
    // Check if backup completed
    if (...)
    {
        exitToken.Trigger();
    }
    stream.CopyToThreadSafe(partialStream);
    stream.PositionThreadSafe = 0;
    AzureService.UploadToStorage(partialStream);
    await Task.Delay(500); // Play around with this - it shouldn't take too long to copy the stream
});
services.ExportBacpac(stream, "dbnameToBackup");
await TimerService.RunTaskPeriodicallyAsync(task, 500);
It's similar to Brando's answer, but this one uses a stable package:
using Microsoft.WindowsAzure.Management.Sql;
NuGet
Using the same variables as in Brando's answer, the code will look like this:
var azureSqlServer = "xxxxxxx" + ".database.windows.net";
var azureSqlServerName = "xxxxxxx";

SqlManagementClient managementClient = new SqlManagementClient(
    new TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));

var exportParams = new DacExportParameters()
{
    BlobCredentials = new DacExportParameters.BlobCredentialsParameter()
    {
        StorageAccessKey = storageKey,
        Uri = new Uri(baseStorageUri)
    },
    ConnectionInfo = new DacExportParameters.ConnectionInfoParameter()
    {
        ServerName = azureSqlServer,
        DatabaseName = azureSqlDatabase,
        UserName = adminLogin,
        Password = adminPassword
    }
};
var exportResult = managementClient.Dac.Export(azureSqlServerName, exportParams);
You can use Microsoft.Azure.Management.Fluent to export your database to a .bacpac file and store it in a blob. To do this, there are a few things you need to do.
Create an Azure AD (Azure Active Directory) application and service principal that can access resources. Follow this link for a comprehensive guide.
From the first step, you are going to need the "Application (client) ID", "Client Secret", and "Tenant ID".
Install the Microsoft.Azure.Management.Fluent NuGet package, and import the Microsoft.Azure.Management.Fluent, Microsoft.Azure.Management.ResourceManager.Fluent, and Microsoft.Azure.Management.ResourceManager.Fluent.Authentication namespaces.
Replace the placeholders in the code snippets below with proper values for your usecase.
Enjoy!
var principalClientID = "<Application (Client) ID>";
var principalClientSecret = "<ClientSecret>";
var principalTenantID = "<TenantID>";
var sqlServerName = "<SQL Server Name (without '.database.windows.net')>";
var sqlServerResourceGroupName = "<SQL Server Resource Group>";
var databaseName = "<Database Name>";
var databaseLogin = "<Database Login>";
var databasePassword = "<Database Password>";
var storageResourceGroupName = "<Storage Resource Group>";
var storageName = "<Storage Account>";
var storageBlobName = "<Storage Blob Name>";
var bacpacFileName = "myBackup.bacpac";
var credentials = new AzureCredentialsFactory().FromServicePrincipal(principalClientID, principalClientSecret, principalTenantID, AzureEnvironment.AzureGlobalCloud);
var azure = await Azure.Authenticate(credentials).WithDefaultSubscriptionAsync();
var storageAccount = await azure.StorageAccounts.GetByResourceGroupAsync(storageResourceGroupName, storageName);
var sqlServer = await azure.SqlServers.GetByResourceGroupAsync(sqlServerResourceGroupName, sqlServerName);
var database = await sqlServer.Databases.GetAsync(databaseName);
await database.ExportTo(storageAccount, storageBlobName, bacpacFileName)
    .WithSqlAdministratorLoginAndPassword(databaseLogin, databasePassword)
    .ExecuteAsync();
