How to configure CORS settings for Blob storage in Windows Azure - C#

I have created several containers in an Azure storage account and uploaded some files into them. Now I need to give domain-level access to the containers/blobs, so I tried it from code like below.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

ServiceProperties blobServiceProperties = new ServiceProperties();
blobServiceProperties.Cors.CorsRules.Add(new CorsRule()
{
    AllowedHeaders = new List<string>() { "*" },
    ExposedHeaders = new List<string>() { "*" },
    AllowedMethods = CorsHttpMethods.Post | CorsHttpMethods.Put | CorsHttpMethods.Get | CorsHttpMethods.Delete,
    AllowedOrigins = new List<string>() { "http://localhost:8080/" },
    MaxAgeInSeconds = 3600,
});
blobClient.SetServiceProperties(blobServiceProperties);
But the above code seems to work only if I am creating everything from code (correct me if I am wrong). I also found a setting like the one below here:
<CorsRule>
<AllowedOrigins>http://www.contoso.com, http://www.fabrikam.com</AllowedOrigins>
<AllowedMethods>PUT,GET</AllowedMethods>
<AllowedHeaders>x-ms-meta-data*,x-ms-meta-target,x-ms-meta-source</AllowedHeaders>
<ExposedHeaders>x-ms-meta-*</ExposedHeaders>
<MaxAgeInSeconds>200</MaxAgeInSeconds>
</CorsRule>
But I didn't get where this XML has to be put, I mean in which file. Or is there any setting for CORS while creating a container or blob from the Azure portal? Please assist. Any help would be appreciated. Thanks!

The following answers the question that was actually asked in the title. It appears the questioner already largely knew how to do this from his code, but here is my answer. Unfortunately, the code samples MS has put out have been far from easy or clear, so I hope this helps someone else. In this solution all you need is a CloudStorageAccount instance, from which you can then call this function (as an extension method).
// USAGE:
// -- example usage (in this case adding a wildcard CORS rule to this account) --
CloudStorageAccount acc = getYourStorageAccount();
acc.SetCORSPropertiesOnBlobService(cors =>
{
    var wildcardRule = new CorsRule() { AllowedMethods = CorsHttpMethods.Get, AllowedOrigins = { "*" } };
    cors.CorsRules.Add(wildcardRule);
    return cors;
});
// CODE:
/// <summary>
/// Allows caller to replace or alter the current CorsProperties on a given CloudStorageAccount.
/// </summary>
/// <param name="storageAccount">Storage account.</param>
/// <param name="alterCorsRules">The returned value will replace the
/// current ServiceProperties.Cors (CorsProperties) value.</param>
public static void SetCORSPropertiesOnBlobService(this CloudStorageAccount storageAccount,
    Func<CorsProperties, CorsProperties> alterCorsRules)
{
    if (storageAccount == null || alterCorsRules == null) throw new ArgumentNullException();

    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    ServiceProperties serviceProperties = blobClient.GetServiceProperties();

    serviceProperties.Cors = alterCorsRules(serviceProperties.Cors) ?? new CorsProperties();

    blobClient.SetServiceProperties(serviceProperties);
}
It may be helpful to consider the properties of the CorsRule class:
CorsRule corsRule = new CorsRule()
{
    AllowedMethods = CorsHttpMethods.Get,  // Gets or sets the HTTP methods permitted to execute for this origin
    AllowedOrigins = { "*" },              // (IList<string>) Gets or sets domain names allowed via CORS
    //AllowedHeaders = { "*" },            // (IList<string>) Gets or sets headers allowed to be part of the CORS request
    //ExposedHeaders = null,               // (IList<string>) Gets or sets response headers that should be exposed to the client via CORS
    //MaxAgeInSeconds = 33333              // Gets or sets the length of time in seconds that a preflight response should be cached by the browser
};

Let me try to answer your question. As you know, Azure Storage offers a REST API for managing storage contents. One operation there is Set Blob Service Properties, and one of the things it does is manage CORS rules for the blob service. The XML you have included in the question is the request payload for this operation. The C# code you mentioned uses the storage client library, which is essentially a .NET wrapper over this REST API. So when you use the code above, it actually invokes the REST API and sends the XML.
Now coming to options for setting up CORS rules, there are a few ways you can achieve that. If you're interested in setting them up programmatically, you can either write some code that consumes the REST API directly or use the .NET storage client library as you have done above. You could simply create a console application, put the code in there, and execute it to set the CORS rule (a minimal sketch follows the list below). If you're looking for tools to do that, you can try one of the following:
Azure Management Studio from Cerebrata: http://www.cerebrata.com
Cloud Portam: http://www.cloudportam.com (Disclosure: This product is built by me).
Azure Storage Explorer (version 6.0): https://azurestorageexplorer.codeplex.com/
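If you go the console-application route, a minimal sketch could look like the following (assumptions on my part: the classic WindowsAzure.Storage SDK and a "StorageConnectionString" setting, as in the question). Note that it fetches the current service properties first, so existing CORS rules and the logging/metrics settings survive the update:
using System;
using System.Collections.Generic;
using Microsoft.Azure; // CloudConfigurationManager
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

class SetCorsProgram
{
    static void Main()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("StorageConnectionString"));
        CloudBlobClient client = account.CreateCloudBlobClient();

        // Read what is already there so existing CORS rules and the
        // logging/metrics settings are not wiped out by the update.
        ServiceProperties properties = client.GetServiceProperties();
        properties.Cors.CorsRules.Add(new CorsRule
        {
            AllowedOrigins = new List<string> { "http://localhost:8080" },
            AllowedMethods = CorsHttpMethods.Get | CorsHttpMethods.Put | CorsHttpMethods.Post,
            AllowedHeaders = new List<string> { "*" },
            ExposedHeaders = new List<string> { "*" },
            MaxAgeInSeconds = 3600
        });
        client.SetServiceProperties(properties);
        Console.WriteLine("CORS rule saved to the blob service.");
    }
}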

It's not a good idea to give domain-level access to your containers. You can make the container private, upload the files (create blobs) and then share them by using a shared access policy.
The code below may help you.
static void Main(string[] args)
{
    var account = CloudStorageAccount.Parse(ConfigurationManager.ConnectionStrings["AzureStorageAccount"].ConnectionString);
    var bClient = account.CreateCloudBlobClient();
    var container = bClient.GetContainerReference("test-share-container-1");
    container.CreateIfNotExists();

    // clear all existing policies
    ClearPolicy(container);

    string newPolicy = "blobsharepolicy";
    CreateSharedAccessPolicyForBlob(container, newPolicy);
    var bUri = BlobUriWithNewPolicy(container, newPolicy);

    Console.ReadLine();
}

static void ClearPolicy(CloudBlobContainer container)
{
    var perms = container.GetPermissions();
    perms.SharedAccessPolicies.Clear();
    container.SetPermissions(perms);
}

static string BlobUriWithNewPolicy(CloudBlobContainer container, string policyName)
{
    var blob = container.GetBlockBlobReference("testfile1.txt");

    string blobContent = "Hello there !!";
    MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(blobContent));
    ms.Position = 0;
    using (ms)
    {
        blob.UploadFromStream(ms);
    }

    return blob.Uri + blob.GetSharedAccessSignature(null, policyName);
}

static void CreateSharedAccessPolicyForBlob(CloudBlobContainer container, string policyName)
{
    SharedAccessBlobPolicy sharedPolicy = new SharedAccessBlobPolicy()
    {
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
        Permissions = SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Read
    };

    var permissions = container.GetPermissions();
    permissions.SharedAccessPolicies.Add(policyName, sharedPolicy);
    container.SetPermissions(permissions);
}
<connectionStrings>
  <add name="AzureStorageAccount" connectionString="DefaultEndpointsProtocol=https;AccountName=[name];AccountKey=[key]" />
</connectionStrings>
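On the customer side, the returned URI already carries the SAS in its query string, so the blob can be consumed by constructing it straight from that URI. A minimal sketch, still with the classic SDK (blobUriWithNewPolicy stands for the value returned by BlobUriWithNewPolicy above):
using System;
using Microsoft.WindowsAzure.Storage.Blob;

// The SAS credentials are picked up automatically from the query string of the URI.
var blob = new CloudBlockBlob(new Uri(blobUriWithNewPolicy));
string text = blob.DownloadText();   // allowed: the policy granted Read
blob.UploadText("updated content");  // allowed: the policy granted Write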

Related

How can I use a Google Speech-to-Text API key in a deployed WPF app?

I have created a C# WPF app that uses the Google Speech-to-Text API. Currently, on my development machine, I have added an environment variable in Windows that references the JSON file that Google gave me.
Will I need to create that environment variable on every machine I deploy my app to, or is there a way to store the JSON key on a server and reference it from there?
Possibly, Passing Credentials Using Code helps you.
This is the code copied from there:
// Some APIs, like Storage, accept a credential in their Create() method.
public object AuthExplicit(string projectId, string jsonPath)
{
    // Explicitly use service account credentials by specifying the private key file.
    var credential = GoogleCredential.FromFile(jsonPath);
    var storage = StorageClient.Create(credential);
    // Make an authenticated API request.
    var buckets = storage.ListBuckets(projectId);
    foreach (var bucket in buckets)
    {
        Console.WriteLine(bucket.Name);
    }
    return null;
}

// Other APIs, like Language, accept a channel in their Create() method.
public object AuthExplicit(string projectId, string jsonPath)
{
    LanguageServiceClientBuilder builder = new LanguageServiceClientBuilder
    {
        CredentialsPath = jsonPath
    };
    LanguageServiceClient client = builder.Build();
    AnalyzeSentiment(client);
    return 0;
}
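Since the question is about Speech-to-Text specifically, note that the same builder pattern applies there too. A minimal sketch of my own (assuming the Google.Cloud.Speech.V1 package; this is not part of the documentation snippet above):
using Google.Cloud.Speech.V1;

public SpeechClient CreateSpeechClient(string jsonPath)
{
    // jsonPath points at the service account JSON key file, which could be
    // shipped with the app or downloaded from your own server at startup,
    // instead of being referenced by a per-machine environment variable.
    SpeechClientBuilder builder = new SpeechClientBuilder
    {
        CredentialsPath = jsonPath
    };
    return builder.Build();
}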

How can I determine if a SAS token has already expired for Azure Blob Storage container access?

I use the Azure Blob Storage client library v11 for .NET.
I wrote a program that our customers can use to upload files. I generate a URL with a SAS token (valid for x days) for a customer, and the customer can upload files using the program. Here is an example URL:
https://storage.blob.core.windows.net/123456789?sv=2019-07-07&sr=c&si=mypolicy&sig=ASDH845378ddsaSDdase324234234rASDSFR
How can I find out whether the SAS token is still valid before the upload is started?
Update:
I have no se claim in my URL.
Here is my code to generate the url:
var policyName = "mypolicy";
string containerName = "123456789";

// Retrieve storage account information from connection string
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetSecret());

// Create a blob client for interacting with the blob service.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Create a container for organizing blobs within the storage account.
CloudBlobContainer container = blobClient.GetContainerReference(containerName);
try
{
    // The call below will fail if the sample is configured to use the storage emulator
    // in the connection string, but the emulator is not running.
    // Change the retry policy for this call so that if it fails, it fails quickly.
    BlobRequestOptions requestOptions = new BlobRequestOptions() { RetryPolicy = new NoRetry() };
    await container.CreateIfNotExistsAsync(requestOptions, null);
}
catch (StorageException ex)
{
    MessageBox.Show(ex.Message, Application.ProductName, MessageBoxButtons.OK, MessageBoxIcon.Error);
    return string.Empty;
}

// create the stored policy we will use, with the relevant permissions and expiry time
var storedPolicy = new SharedAccessBlobPolicy()
{
    SharedAccessExpiryTime = DateTime.UtcNow.AddDays(7),
    Permissions = SharedAccessBlobPermissions.Read |
                  SharedAccessBlobPermissions.Write |
                  SharedAccessBlobPermissions.List
};

// get the existing permissions (alternatively create new BlobContainerPermissions())
var permissions = container.GetPermissions();
// optionally clear out any existing policies on this container
permissions.SharedAccessPolicies.Clear();
// add in the new one
permissions.SharedAccessPolicies.Add(policyName, storedPolicy);
// save back to the container
container.SetPermissions(permissions);

// Now we are ready to create a shared access signature based on the stored access policy
var containerSignature = container.GetSharedAccessSignature(null, policyName);

// create the URI a client can use to get access to just this container
return container.Uri + containerSignature;
I have found a solution myself. This blog describes two different SharedAccessSignatures. I have adapted the code so that I now also have the se claim in my URL.
Solution:
protected void GetSharedAccessSignature(
    String containerName, String blobName)
{
    CloudStorageAccount cloudStorageAccount =
        CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer =
        new CloudBlobContainer(containerName, cloudBlobClient);
    CloudBlockBlob cloudBlockBlob =
        cloudBlobContainer.GetBlockBlobReference(blobName);

    SharedAccessPolicy sharedAccessPolicy = new SharedAccessPolicy();
    sharedAccessPolicy.Permissions = SharedAccessPermissions.Read;
    sharedAccessPolicy.SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-10);
    sharedAccessPolicy.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(40);

    String sharedAccessSignature1 =
        cloudBlockBlob.GetSharedAccessSignature(sharedAccessPolicy);
    String sharedAccessSignature2 =
        cloudBlockBlob.GetSharedAccessSignature(new SharedAccessPolicy(), "adele");
}
The sharedAccessSignature1 contains the se claim.
In the code in my initial question I had used the sharedAccessSignature2 variant, where the expiry lives in the stored access policy rather than in the URL.
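For completeness, here is a minimal sketch of my own (not from the blog) for checking the se claim on the client before starting an upload. With the stored-access-policy variant the expiry is only known on the server, so there is nothing to parse in that case:
using System;
using System.Globalization;
using System.Web; // HttpUtility; on .NET Core, Microsoft.AspNetCore.WebUtilities offers an equivalent

public static bool IsSasExpired(string sasUrl)
{
    var query = HttpUtility.ParseQueryString(new Uri(sasUrl).Query);
    string expiry = query["se"]; // signed expiry, e.g. 2020-01-31T12:00:00Z
    if (expiry == null)
        return false; // expiry sits in the stored access policy; not visible in the URL
    DateTime expiresOn = DateTime.Parse(expiry, CultureInfo.InvariantCulture,
        DateTimeStyles.AdjustToUniversal | DateTimeStyles.AssumeUniversal);
    return expiresOn <= DateTime.UtcNow;
}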

Read Parquet file from Azure blob without downloading it locally in C# .NET

We have a parquet format file (500 MB) located in an Azure blob. How can we read the file directly from the blob and save it in memory in C#, say e.g. a DataTable?
I am able to read a parquet file that is physically located in a folder using the code below.
public void ReadParquetFile()
{
    using (Stream fileStream = System.IO.File.OpenRead("D:/../userdata1.parquet"))
    {
        using (var parquetReader = new ParquetReader(fileStream))
        {
            DataField[] dataFields = parquetReader.Schema.GetDataFields();
            for (int i = 0; i < parquetReader.RowGroupCount; i++)
            {
                using (ParquetRowGroupReader groupReader = parquetReader.OpenRowGroupReader(i))
                {
                    DataColumn[] columns = dataFields.Select(groupReader.ReadColumn).ToArray();
                    DataColumn firstColumn = columns[0];
                    Array data = firstColumn.Data;
                    //int[] ids = (int[])data;
                }
            }
        }
    }
}
(I am able to read a CSV file directly from the blob using a source stream.) Please kindly suggest the fastest method to read the parquet file directly from the blob.
Per my experience, the solution to directly read the parquet file from a blob is first to generate the blob URL with a SAS token, then to get a stream from HttpClient for that URL, and finally to read the HTTP response stream via ParquetReader.
First, please refer to the sample code below from the section Create a service SAS for a blob of the official document Create a service SAS for a container or blob with .NET, using the Azure Blob Storage SDK for .NET Core.
private static string GetBlobSasUri(CloudBlobContainer container, string blobName, string policyName = null)
{
    string sasBlobToken;

    // Get a reference to a blob within the container.
    // Note that the blob may not exist yet, but a SAS can still be created for it.
    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

    if (policyName == null)
    {
        // Create a new access policy and define its constraints.
        // Note that the SharedAccessBlobPolicy class is used both to define the parameters of an ad hoc SAS, and
        // to construct a shared access policy that is saved to the container's shared access policies.
        SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
        {
            // When the start time for the SAS is omitted, the start time is assumed to be the time when the storage service receives the request.
            // Omitting the start time for a SAS that is effective immediately helps to avoid clock skew.
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
            Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create
        };

        // Generate the shared access signature on the blob, setting the constraints directly on the signature.
        sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);

        Console.WriteLine("SAS for blob (ad hoc): {0}", sasBlobToken);
        Console.WriteLine();
    }
    else
    {
        // Generate the shared access signature on the blob. In this case, all of the constraints for the
        // shared access signature are specified on the container's stored access policy.
        sasBlobToken = blob.GetSharedAccessSignature(null, policyName);

        Console.WriteLine("SAS for blob (stored access policy): {0}", sasBlobToken);
        Console.WriteLine();
    }

    // Return the URI string for the container, including the SAS token.
    return blob.Uri + sasBlobToken;
}
Then get the HTTP response stream from HttpClient for the URL with the SAS token:
var blobUrlWithSAS = GetBlobSasUri(container, blobName);
var client = new HttpClient();
var stream = await client.GetStreamAsync(blobUrlWithSAS);
Finally, read it via ParquetReader; the code comes from the Reading Data section of the GitHub repo aloneguid/parquet-dotnet:
var options = new ParquetOptions { TreatByteArrayAsString = true };
var reader = new ParquetReader(stream, options);
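Since the question mentions a DataTable, here is a sketch of my own for copying the columns into a System.Data.DataTable (assumptions: the parquet-dotnet v3 API shown above, and that the decompressed data fits in memory, which a 500 MB parquet file may not):
using System;
using System.Data;
using System.Linq;
using Parquet;

static DataTable ToDataTable(ParquetReader parquetReader)
{
    var table = new DataTable();
    Parquet.Data.DataField[] fields = parquetReader.Schema.GetDataFields();

    for (int g = 0; g < parquetReader.RowGroupCount; g++)
    {
        using (ParquetRowGroupReader groupReader = parquetReader.OpenRowGroupReader(g))
        {
            // Fully qualified to avoid the clash with System.Data.DataColumn.
            Parquet.Data.DataColumn[] columns = fields.Select(groupReader.ReadColumn).ToArray();

            // Create the DataTable columns once, typed from the arrays the reader returned.
            if (table.Columns.Count == 0)
            {
                foreach (var c in columns)
                {
                    Type elementType = c.Data.GetType().GetElementType();
                    table.Columns.Add(c.Field.Name, Nullable.GetUnderlyingType(elementType) ?? elementType);
                }
            }

            // Row groups are column-oriented; pivot them into DataTable rows.
            int rowCount = columns[0].Data.Length;
            for (int r = 0; r < rowCount; r++)
                table.Rows.Add(columns.Select(c => c.Data.GetValue(r) ?? (object)DBNull.Value).ToArray());
        }
    }
    return table;
}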

Need to create a folder (and a file inside it) using C# inside an Azure DevOps repository - be it Git or TFVC

From the Azure DevOps portal, I can manually add a file/folder into a repository irrespective of whether the source code is cloned or not (image for illustration).
However, I want to programmatically create a folder, and a file inside that folder, within a repository from C# code in my ASP.NET Core application.
Is there an Azure DevOps REST API or any other way to do that? I'll use Basic authentication through a PAT token only.
Note: I'm restricted from cloning the source code to a local repository.
An early reply is really appreciated.
I tried HttpClient, GitHttpClient and LibGit2Sharp but failed.
Follow the steps below in your C# code.
First, call the GetRef REST API: https://dev.azure.com/{0}/{1}/_apis/git/repositories/{2}/refs{3}
This should return the object of your repository branch, which you can use to push your changes (a sketch of this call follows the code below).
Next, call the Push REST API to create a folder or file in your repository:
https://dev.azure.com/{0}/{1}/_apis/git/repositories/{2}/pushes{3}
var changes = new List<ChangeToAdd>();

// Add files
// pnp_structure.yml
var jsonContent = File.ReadAllText(@"./static-files/somejsonfile.json");
ChangeToAdd changeJson = new ChangeToAdd()
{
    changeType = "add",
    item = new ItemBase() { path = string.Concat(path, "/[your-folder-name]/somejsonfile.json") },
    newContent = new Newcontent()
    {
        contentType = "rawtext",
        content = jsonContent
    }
};
changes.Add(changeJson);

CommitToAdd commit = new CommitToAdd();
commit.comment = "commit from code";
commit.changes = changes.ToArray();

var content = new List<CommitToAdd>() { commit };
var request = new
{
    refUpdates = refs, // the ref object obtained from the GetRef call in the first step
    commits = content
};

var personalaccesstoken = _configuration["azure-devOps-configuration-token"];
var authorization = Convert.ToBase64String(System.Text.ASCIIEncoding.ASCII.GetBytes(string.Format("{0}:{1}", "", personalaccesstoken)));

_logger.LogInformation($"[HTTP REQUEST] make a http call with uri: {uri} ");

// here I am making the http client call
// https://dev.azure.com/{orgnizationName}/{projectName}/_apis/git/repositories/{repositoryId}/pushes{?api-version}
var result = _httpClient.SendHttpWebRequest(uri, method, data, authorization);
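The GetRef call from the first step is not shown above; here is a sketch of my own (the helper name and the master branch default are mine) that fetches the ref JSON whose value[0].objectId feeds refUpdates.oldObjectId:
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

static async Task<string> GetBranchRefJsonAsync(
    string organization, string project, string repositoryId, string pat)
{
    using (var client = new HttpClient())
    {
        // Basic auth with an empty user name and the PAT as the password.
        var token = Convert.ToBase64String(Encoding.ASCII.GetBytes($":{pat}"));
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

        var uri = $"https://dev.azure.com/{organization}/{project}" +
                  $"/_apis/git/repositories/{repositoryId}/refs" +
                  "?filter=heads/master&api-version=6.0";

        // Parse value[0].objectId out of this JSON with your preferred JSON
        // library and pass it as oldObjectId in the push payload's refUpdates.
        return await client.GetStringAsync(uri);
    }
}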

Azure container permissions

I'm reading this article.
I have an Azure container called "test" that is set to private in Azure.
That container has a SCORM package in it, "121/HEEDENNL/story.html".
I'm using the code below to set the permissions of the folder to read.
However, that story.html file needs several other files to run properly.
The story page opens and doesn't return a 403 or 404,
but the files it is trying to reference to make the page run properly are not loading.
How can I get all the files needed for story.html to run properly set to read access also?
I thought changing the container's permissions would allow that file to access the files needed.
What am I missing here?
public ActionResult ViewContent(int id)
{
    const string pageBlobName = "121/HEEDENNL/story.html";

    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(Common.Constants.Azure.ConnectionStringUrl);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

    //// Retrieve a reference to a container.
    // CloudBlobContainer learningModulContainer = blobClient.GetContainerReference(Common.Constants.Azure.LearningModulesContainerName);
    CloudBlobContainer learningModulContainer = blobClient.GetContainerReference("test");

    PrintBlobs(learningModulContainer);

    CloudBlockBlob myindexfile = learningModulContainer.GetBlockBlobReference(pageBlobName);

    SharedAccessBlobPermissions permission = SharedAccessBlobPermissions.None;
    permission = SharedAccessBlobPermissions.Read;

    var token = GetSasForBlob(myindexfile, permission, 30);

    // this isn't finished..... must get learning module
    var module = DataAccessService.Get<LearningModule>(id);
    var url = $"{Common.Constants.Azure.StorageAccountUrl}{"test"}/{module.ScormPackage.Path.Replace("index_lms", "story")}{token}";

    return Redirect(token);
}

public static string GetSasForBlob(CloudBlockBlob blob, SharedAccessBlobPermissions permission, int sasMinutesValid)
{
    // var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    var sasToken = blob.Container.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    {
        Permissions = permission,
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-15),
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(sasMinutesValid),
    });
    return string.Format(CultureInfo.InvariantCulture, "{0}{1}", blob.Uri, sasToken);
}
How can I get all the files needed for story.html to run properly set to read access also?
Firstly, if possible, you could put the CSS/JS/image files that your HTML page references in a container that allows public access.
Secondly, you could provide a URL with a SAS for each blob resource and add references like this in your HTML page:
<link href="https://{storageaccount}.blob.core.windows.net/styles/Style1.css?st=2017-06-15T02%3A27%3A00Z&se=2017-06-30T02%3A27%3A00Z&sp=r&sv=2015-04-05&sr=b&sig=%2FWwN0F4qyoIH97d7znRKo9lcp84S4oahU9RBwHTnlXk%3D" rel="stylesheet" />
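A sketch of my own for generating such a URL with the classic SDK used in the question (the container and blob names here are hypothetical):
CloudBlobContainer stylesContainer = blobClient.GetContainerReference("styles");
CloudBlockBlob cssBlob = stylesContainer.GetBlockBlobReference("Style1.css");
string sas = cssBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
});
string cssUrlWithSas = cssBlob.Uri + sas; // goes into the link href above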
Besides, if you'd like to host your web app, you could try Azure App Service.
