How to download files from Folders inside Azure Blob storage? [duplicate] - c#

I'm new to Azure and playing around with blobs in my .NET application.
I want to retrieve the structure of folders, subfolders and the files inside them.
So far I've found a way to get the files from all folders and subfolders together with their parents.
Is there any way to get the folder structure other than parsing the Prefix of those files' parents?
The file structure is the following:
root container
- folder1
  - subfolder1
    - file
    - file
  - subfolder2
    - file
    - file
- file
I've tried this, but it only gives me the folders in the root directory, no subfolders:
// returns account, client and container
var blobData = GetBlobDetails(blobConnectionString, rootContainerName);
var rootContainer = blobData.Container;

var blobList = rootContainer.ListBlobsSegmentedAsync(
    string.Empty, false, BlobListingDetails.None, int.MaxValue, null, null, null);

return (from blob in blobList.Result.Results.OfType<CloudBlobDirectory>()
        select blob).ToList();

First of all, as noted in the comments: Blob storage does not have the concept of folders. It's all a flat structure, and what you see as folders are just prefixes that form part of each blob's (file's) path.
That said, you can replicate folder-like behavior by traversing the prefixes:
Using Azure.Storage.Blobs 12.2.0
using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using System;
using System.Threading.Tasks;
using System.Linq;

namespace BlobLister
{
    class Program
    {
        static async Task Main(string[] args)
        {
            // Get a connection string to our Azure Storage account.
            string connectionString = "*****";
            string containerName = "mycontainer";

            Console.WriteLine($"Recursively listing blobs and virtual directories for container '{containerName}'");

            BlobContainerClient container = new BlobContainerClient(connectionString, containerName);
            await ListBlobsForPrefixRecursive(container, "", 0);
        }

        public static async Task ListBlobsForPrefixRecursive(BlobContainerClient container, string prefix, int level)
        {
            string spaces = new string(' ', level);
            Console.WriteLine($"{spaces}- {prefix}");

            await foreach (Page<BlobHierarchyItem> page in container.GetBlobsByHierarchyAsync(prefix: prefix, delimiter: "/").AsPages())
            {
                // Print the blobs (files) at this level.
                foreach (var blob in page.Values.Where(item => item.IsBlob).Select(item => item.Blob))
                {
                    Console.WriteLine($"{spaces} {blob.Name}");
                }

                // Recurse into each virtual directory (prefix) one level down.
                var prefixes = page.Values.Where(item => item.IsPrefix).Select(item => item.Prefix);
                foreach (var p in prefixes)
                {
                    await ListBlobsForPrefixRecursive(container, p, level + 1);
                }
            }
        }
    }
}
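Since the question title also asks about downloading the files under those virtual folders, here is a minimal sketch of how that could look with the same Azure.Storage.Blobs client: list all blobs under a prefix with a flat listing and download each one. The prefix value, the local target folder, and the DownloadPrefixAsync helper name are illustration-only assumptions, and System.IO is additionally required.
// Sketch only: download every blob under a given prefix (e.g. "folder1/") to a local folder.
public static async Task DownloadPrefixAsync(BlobContainerClient container, string prefix, string localRoot)
{
    await foreach (BlobItem blob in container.GetBlobsAsync(prefix: prefix))
    {
        // Re-create the virtual folder structure on disk from the blob name.
        string localPath = Path.Combine(localRoot, blob.Name.Replace('/', Path.DirectorySeparatorChar));
        Directory.CreateDirectory(Path.GetDirectoryName(localPath));

        await container.GetBlobClient(blob.Name).DownloadToAsync(localPath);
    }
}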

Related

.NET ShareFileItem properties null for Azure File Share

I am trying to create an Azure Function that will clear down files older than a certain age, but when I access the properties of each file they are all null. What am I doing wrong?
using System;
using System.Collections.Generic;
using Azure.Storage.Files.Shares;
using Azure.Storage.Files.Shares.Models;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace somewhere
{
    public static class FileShareCleaner
    {
        [FunctionName("FileShareCleaner")]
        public static void Run([TimerTrigger("*/10 */1 * * * *")] TimerInfo myTimer, ILogger log)
        {
            string connectionString = Environment.GetEnvironmentVariable("FileShareConnectionString");
            string shareName = "files";

            ShareServiceClient shareserviceclient = new ShareServiceClient(connectionString);
            ShareClient shareclient = shareserviceclient.GetShareClient(shareName);

            Queue<ShareDirectoryClient> remaining = new Queue<ShareDirectoryClient>();
            remaining.Enqueue(shareclient.GetRootDirectoryClient());
            while (remaining.Count > 0)
            {
                ShareDirectoryClient dir = remaining.Dequeue();
                foreach (ShareFileItem item in dir.GetFilesAndDirectories())
                {
                    log.LogInformation(item.Name);
                    if (item.IsDirectory)
                    {
                        remaining.Enqueue(dir.GetSubdirectoryClient(item.Name));
                    }
                    else
                    {
                        log.LogInformation($"time: {item.Properties.LastModified.ToString()}");
                    }
                }
            }
        }
    }
}
The code finds the files but all the properties are null:
[2021-10-06T10:04:50.048Z] Executing 'FileShareCleaner' (Reason='Timer fired at 2021-10-06T11:04:50.0126493+01:00', Id=af5c7864-4326-4c97-b9d6-82bf98726f4e)
[2021-10-06T10:04:50.341Z] 0304ccf5-4e32-4206-b903-af5acc8652dc.dat
[2021-10-06T10:04:50.344Z] time:
[2021-10-06T10:04:50.347Z] 06716b40-cce4-4ef0-86ec-329dcaeddbf4.dat
[2021-10-06T10:04:50.350Z] time:
[2021-10-06T10:04:50.353Z] 20735b83-d8b2-4110-9ee6-6154b97c154c.dat
[2021-10-06T10:04:50.355Z] time:
[2021-10-06T10:04:50.358Z] 2696a0eb-2aed-4200-b495-0dd2a7152139.dat
[2021-10-06T10:04:50.361Z] time:
You are not doing anything wrong. This is expected behavior.
By default, when files and folders are listed in a file share, only the size of each file is returned.
To fetch other properties of a file, such as last modified or content properties, you need to get the properties of each file separately.
Update
To get the properties, create an instance of ShareFileClient using ShareDirectoryClient.GetFileClient and then call GetProperties on it. Your code would look something like below:
while (remaining.Count > 0)
{
    ShareDirectoryClient dir = remaining.Dequeue();
    foreach (ShareFileItem item in dir.GetFilesAndDirectories())
    {
        log.LogInformation(item.Name);
        if (item.IsDirectory)
        {
            remaining.Enqueue(dir.GetSubdirectoryClient(item.Name));
        }
        else
        {
            var fileClient = dir.GetFileClient(item.Name);
            var fileProperties = fileClient.GetProperties();
            log.LogInformation($"time: {fileProperties.Value.LastModified.ToString()}");
        }
    }
}
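Since the goal of the function is to clear down files older than a certain age, the else branch could then compare LastModified against a cutoff and delete the file. A minimal sketch along those lines; the maxAge value is an assumption for illustration, and DeleteIfExists is a standard ShareFileClient method.
// Sketch only: delete files older than a chosen age (maxAge is an assumed threshold).
TimeSpan maxAge = TimeSpan.FromDays(30);
var fileClient = dir.GetFileClient(item.Name);
var fileProperties = fileClient.GetProperties();

if (DateTimeOffset.UtcNow - fileProperties.Value.LastModified > maxAge)
{
    log.LogInformation($"Deleting {item.Name}, last modified {fileProperties.Value.LastModified}");
    fileClient.DeleteIfExists();
}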

Recursive Upload to Azure-Files / Create Subfolder

I want to upload a folder recursively to an Azure Files storage.
The file structure usually has several subfolders.
What is the best way to create the subfolders in Azure Files?
foreach (string fullLocalFilename in System.IO.Directory.GetFiles(locationOfFolderToUpload, "*.*", System.IO.SearchOption.AllDirectories))
{
    Console.WriteLine(fullLocalFilename);
    FileInfo localfile = new FileInfo(fullLocalFilename);

    var root = share.GetRootDirectoryReference();
    string strPfad = localfile.DirectoryName.Substring(3);
    var folder = root.GetDirectoryReference(strPfad);
    Console.WriteLine(strPfad);
    folder.CreateIfNotExists();

    CloudFile file = folder.GetFileReference(localfile.Name);
    if (file.Exists() == false)
    {
        file.Create(localfile.Length);
        file.UploadFromFile(fullLocalFilename);
        Console.WriteLine(fullLocalFilename);
    }
}
WebException: The remote server returned an error: (404) Not found.
I suggest you make use of Microsoft.Azure.Storage.DataMovement; it supports uploading a directory to Azure and creates the same structure as in the local path. Please install the latest version (1.0.0) of the Microsoft.Azure.Storage.DataMovement package from NuGet. Note that if you have other Azure Storage SDKs installed, please uninstall them first.
For example, suppose I have a local folder at F:\temp\1 containing files and subfolders. Use the code below:
using System;
using System.Diagnostics;
using System.Threading;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.DataMovement;
using Microsoft.Azure.Storage.File;

namespace AzureDataMovementTest
{
    class Program
    {
        static void Main(string[] args)
        {
            string storageConnectionString = "xxxx";
            CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
            CloudFileClient fileClient = account.CreateCloudFileClient();
            CloudFileShare fileShare = fileClient.GetShareReference("t22");
            fileShare.CreateIfNotExists();
            CloudFileDirectory fileDirectory = fileShare.GetRootDirectoryReference();

            // Here, I want to upload all the files and subfolders in the following path.
            string source_path = @"F:\temp\1";

            // To upload folder 1, use the following code to create the matching file directory in Azure.
            CloudFileDirectory fileDirectory_2 = fileDirectory.GetDirectoryReference("1");
            fileDirectory_2.CreateIfNotExists();

            UploadDirectoryOptions directoryOptions = new UploadDirectoryOptions
            {
                Recursive = true
            };

            var task = TransferManager.UploadDirectoryAsync(source_path, fileDirectory_2, directoryOptions, null);
            task.Wait();

            Console.WriteLine("the upload is completed");
            Console.ReadLine();
        }
    }
}
After the code finishes running, navigate to the Azure portal -> file storage to confirm that the folder structure was uploaded.
Please let me know if you have more issues.
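As an alternative, if you would rather stay with the Microsoft.Azure.Storage.File SDK used in the question: the 404 typically happens because CreateIfNotExists on a nested directory reference fails while its parent directories don't exist yet, so each level has to be created in turn. A rough sketch of that idea follows; EnsureDirectory is a hypothetical helper, not part of the SDK.
// Hypothetical helper: create each segment of a relative path one level at a time,
// because Azure Files requires parent directories to exist before their children.
static CloudFileDirectory EnsureDirectory(CloudFileDirectory root, string relativePath)
{
    CloudFileDirectory current = root;
    foreach (string segment in relativePath.Split(new[] { '\\', '/' }, StringSplitOptions.RemoveEmptyEntries))
    {
        current = current.GetDirectoryReference(segment);
        current.CreateIfNotExists();
    }
    return current;
}
In the question's loop, the folder variable would then come from EnsureDirectory(root, strPfad) instead of root.GetDirectoryReference(strPfad).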

Vs.Net C# Azure flle storage fails to add File for existing File Share

VS.NET C# fails to create a file on Azure File Storage for an existing file share.
I'm using the Microsoft.WindowsAzure.Storage library to access the Azure File Storage API. My method creates a file share and uploads a file. It works when the file share is created, but skips the file upload when the file share already exists.
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;

public void SaveText(string fileName)
{
    string accountName = "mylogs";
    string key = @"dvjdjhsvdjfhvsjhdvfjhsvdfjhC2g==";

    var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, key), true);
    var share = storageAccount.CreateCloudFileClient().GetShareReference("test");
    share.CreateIfNotExistsAsync().Wait();

    var root = share.GetRootDirectoryReference();
    root.GetFileReference(fileName).UploadTextAsync("mytext").Wait();
}
The first SaveText(file1) call works fine; the share and "file1" get created.
On the second SaveText(file2) call there are no errors, but no "file2" is created.
Same user, same app.
I'm using the NuGet package WindowsAzure.Storage, version 9.3.3, with a console project (not .NET Core), and it works fine.
Sample code below (essentially just yours):
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using System;

namespace AzureFileTest
{
    class Program
    {
        static void Main(string[] args)
        {
            Program p = new Program();

            p.SaveText("file1"); // in the first call, file1 is created and the text uploads.
            p.SaveText("file2"); // in the second call, file2 is created and the text uploads.

            Console.WriteLine("done now");
            Console.ReadLine();
        }

        public void SaveText(string fileName)
        {
            string accountName = "xxxxx";
            string key = "xxxxxx";

            var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, key), true);
            var share = storageAccount.CreateCloudFileClient().GetShareReference("test");
            share.CreateIfNotExistsAsync().Wait();

            var root = share.GetRootDirectoryReference();
            root.GetFileReference(fileName).UploadTextAsync("mytext").Wait();
        }
    }
}
Please let me know if there are any more issues, or any differences between the code samples.
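If the second call still does nothing on your side, one thing worth ruling out is an exception being swallowed by the caller. A small sketch (an assumption about your calling code, not part of the original answer) that surfaces any storage error:
// Sketch: surface any exception instead of letting the blocking call hide it.
try
{
    p.SaveText("file2");
}
catch (AggregateException ex)
{
    // .Wait() wraps storage failures (e.g. invalid file name, auth errors) in an AggregateException.
    Console.WriteLine(ex.InnerException?.Message);
}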

Get TFS information for locally mapped file or directory via WebApi

I want to use Microsoft.TeamFoundation.SourceControl.WebApi to check for updates or local changes against our TFS source control.
I can gather information about changesets for an item that is committed to TFS, but I am not able to gather this information based on a local file path inside my mapped workspace.
Is it somehow possible without using the ExtendedClient?
I want something like this:
TfvcChangesetSearchCriteria tcsc = new TfvcChangesetSearchCriteria();
tcsc.ItemPath = @"c:\source\mappedtfs\MYPROJECT\src\MainWindow.cs"; /* <--- a local path would be nice here */
List<TfvcChangesetRef> changerefs = tfvcHttpClient.GetChangesetsAsync("MYPROJECT", null, null, null, null, tcsc).Result;
Microsoft.TeamFoundation.SourceControl.WebApi is a web API and does not interact with local workspaces and files. If you want to get changesets using a local item's path, use Microsoft.TeamFoundation.VersionControl.Client from the client library.
using Microsoft.TeamFoundation.Client;
using System;
using Microsoft.TeamFoundation.SourceControl.WebApi;
using Microsoft.TeamFoundation.VersionControl.Client;
using System.Collections.Generic;

namespace ConsoleX
{
    class Program
    {
        static void Main(string[] args)
        {
            Uri url = new Uri("https://tfsuri");
            TfsTeamProjectCollection ttpc = new TfsTeamProjectCollection(url);
            VersionControlServer vcs = ttpc.GetService<VersionControlServer>();

            // The path here can be either a local (mapped) path or a server path.
            IEnumerable<Changeset> cses = vcs.QueryHistory("Path here could be local path or server path", RecursionType.Full);
            foreach (Changeset cs in cses)
            {
                Console.WriteLine(cs.ChangesetId);
                Console.WriteLine(cs.Comment);
            }
            Console.ReadLine();
        }
    }
}
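For the "local changes" half of the question, the same client library can also resolve a local path to its workspace and list pending changes. A brief sketch along those lines (again using the client object model rather than the WebApi, since the WebApi has no notion of local workspaces; the local path is just the one from the question):
// Sketch: find the workspace that maps a local path and list its pending changes.
string localPath = @"c:\source\mappedtfs\MYPROJECT\src\MainWindow.cs";
WorkspaceInfo info = Workstation.Current.GetLocalWorkspaceInfo(localPath);
if (info != null)
{
    Workspace workspace = vcs.GetWorkspace(info);
    foreach (PendingChange change in workspace.GetPendingChanges(localPath, RecursionType.None))
    {
        Console.WriteLine($"{change.ChangeType}: {change.LocalItem}");
    }
}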

How to go about searching a server for a specific file?

I am attempting to find a specific file on a website/server, and this file could have a varying extension depending upon the server. How would I determine the extension for a particular server?
Example Possibilities:
website.com/list.bz2
--or--
website.com/list.gz
using System;
using System.IO;

class App
{
    public static void Main()
    {
        string searchPath = @"c:\";
        string searchPattern = "list.*";

        DirectoryInfo di = new DirectoryInfo(searchPath);
        FileInfo[] files = di.GetFiles(searchPattern, SearchOption.AllDirectories);
        foreach (FileInfo file in files)
            Console.WriteLine(file.FullName);

        Console.WriteLine("Press any key to exit...");
        Console.ReadKey();
    }
}
If you don't have access to the server and want to search it as an anonymous client, you should look into what is known as "Google hacking" (crafting search-engine queries to discover files exposed on a site).
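Another option, when the candidate file names are known (as in the list.bz2 / list.gz example), is simply to probe each candidate URL and see which one exists. A minimal sketch, assuming the base URL and extension list from the question; the server must respond sensibly to HEAD requests for this to work.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class Probe
{
    static async Task Main()
    {
        // Candidate extensions are assumptions taken from the question's example.
        string baseUrl = "https://website.com/list";
        string[] extensions = { ".bz2", ".gz" };

        using var client = new HttpClient();
        foreach (string ext in extensions)
        {
            // HEAD avoids downloading the body; fall back to GET if the server rejects HEAD.
            var request = new HttpRequestMessage(HttpMethod.Head, baseUrl + ext);
            var response = await client.SendAsync(request);
            Console.WriteLine($"{baseUrl + ext}: {(int)response.StatusCode} {response.StatusCode}");
        }
    }
}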
