Recursive Upload to Azure-Files / Create Subfolder - c#

I want to upload a folder recursively to an Azure Files storage share.
The file structure usually has several subfolders.
What is the best way to create the subfolders in Azure Files?
foreach (string fullLocalFilename in System.IO.Directory.GetFiles(locationOfFolderToUpload, "*.*", System.IO.SearchOption.AllDirectories))
{
    Console.WriteLine(fullLocalFilename);
    FileInfo localfile = new FileInfo(fullLocalFilename);
    var root = share.GetRootDirectoryReference();
    string strPfad = localfile.DirectoryName.Substring(3);
    var folder = root.GetDirectoryReference(strPfad);
    Console.WriteLine(strPfad);
    folder.CreateIfNotExists();
    CloudFile file = folder.GetFileReference(localfile.Name);
    if (file.Exists() == false)
    {
        file.Create(localfile.Length);
        file.UploadFromFile(fullLocalFilename);
        Console.WriteLine(fullLocalFilename);
    }
}
Running this fails with: WebException: The remote server returned an error: (404) Not found.

I suggest you make use of Microsoft.Azure.Storage.DataMovement; it supports uploading a directory to Azure and creates the same folder structure as in the local path. Please install the latest version (1.0.0) of Microsoft.Azure.Storage.DataMovement from NuGet. Note that if you have installed other Azure Storage SDKs, please uninstall them first.
For example, suppose I have a local folder F:\temp\1 that contains several subfolders and files. Use the code below:
using System;
using System.Diagnostics;
using System.Threading;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.DataMovement;
using Microsoft.Azure.Storage.File;

namespace AzureDataMovementTest
{
    class Program
    {
        static void Main(string[] args)
        {
            string storageConnectionString = "xxxx";
            CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
            CloudFileClient fileClient = account.CreateCloudFileClient();
            CloudFileShare fileShare = fileClient.GetShareReference("t22");
            fileShare.CreateIfNotExists();
            CloudFileDirectory fileDirectory = fileShare.GetRootDirectoryReference();

            // here, I want to upload all the files and subfolders in the following path
            string source_path = @"F:\temp\1";

            // if I want to upload the folder 1, then use the following code to create a file directory in azure
            CloudFileDirectory fileDirectory_2 = fileDirectory.GetDirectoryReference("1");
            fileDirectory_2.CreateIfNotExists();

            UploadDirectoryOptions directoryOptions = new UploadDirectoryOptions
            {
                Recursive = true
            };

            var task = TransferManager.UploadDirectoryAsync(source_path, fileDirectory_2, directoryOptions, null);
            task.Wait();

            Console.WriteLine("the upload is completed");
            Console.ReadLine();
        }
    }
}
After the code completes running, navigate to the Azure portal -> file storage and you will see the same folder structure as in the local path.
Please let me know if you have more issues.
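As a side note on the 404 in the question: with the plain Azure Files SDK (Microsoft.Azure.Storage.File), calling CreateIfNotExists() on a nested directory reference fails when the parent directories do not exist yet, because the File service only creates one directory level per call. If you prefer to stay with that SDK instead of DataMovement, a minimal sketch would be to split the relative path and create each level in turn (EnsureDirectory is just an illustrative helper name, not an SDK method):

// Hedged sketch: create nested Azure Files directories one level at a time,
// since the service does not create missing parent directories for you.
static CloudFileDirectory EnsureDirectory(CloudFileShare share, string relativePath)
{
    CloudFileDirectory dir = share.GetRootDirectoryReference();
    foreach (string segment in relativePath.Split(new[] { '\\', '/' }, StringSplitOptions.RemoveEmptyEntries))
    {
        dir = dir.GetDirectoryReference(segment);
        dir.CreateIfNotExists(); // creates only this level; its parents already exist
    }
    return dir;
}

In the upload loop from the question you would then call EnsureDirectory(share, strPfad) instead of root.GetDirectoryReference(strPfad) followed by a single CreateIfNotExists().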

Related

How to download files from Folders inside Azure Blob storage? [duplicate]

I'm new to Azure and playing around with blobs in my .Net application.
I want to be able to get structure with folders, subfolders and files inside.
For now I've figured out a way to get the files from all folders and subfolders, together with their parents.
Is there any way to get the folder structure other than parsing the Prefix of those files' parents?
File structure is the following:
root container
-folder1
-subfolder1
-file
-file
-subfolder2
-file
-file
-file
I've tried this, but it only gives me folder in the root directory, no subfolders:
// returns account, client and container
var blobData = GetBlobDetails(blobConnectionString, rootContainerName);
var rootContainer = blobData.Container;

var blobList = rootContainer.ListBlobsSegmentedAsync(string.Empty, false, BlobListingDetails.None, int.MaxValue, null, null, null);

return (from blob in blobList.Result.Results.OfType<CloudBlobDirectory>()
        select blob).ToList();
First of all, as noted in the comments: Blob storage does not know the concept of folders. It is all a flat structure, and what you see below as prefixes is simply part of the path of a blob (= file).
That said, you can replicate the behavior by traversing the prefixes:
Using Azure.Storage.Blobs 12.2.0
using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using System;
using System.Linq;
using System.Threading.Tasks;

namespace BlobLister
{
    class Program
    {
        static async Task Main(string[] args)
        {
            // Get a connection string to our Azure Storage account.
            string connectionString = "*****";
            string containerName = "mycontainer";

            Console.WriteLine($"Recursively listing blobs and virtual directories for container '{containerName}'");

            BlobContainerClient container = new BlobContainerClient(connectionString, containerName);
            await ListBlobsForPrefixRecursive(container, "", 0);
        }

        public static async Task ListBlobsForPrefixRecursive(BlobContainerClient container, string prefix, int level)
        {
            string spaces = new string(' ', level);
            Console.WriteLine($"{spaces}- {prefix}");

            await foreach (Page<BlobHierarchyItem> page in container.GetBlobsByHierarchyAsync(prefix: prefix, delimiter: "/").AsPages())
            {
                // print the blobs directly under this prefix
                foreach (var blob in page.Values.Where(item => item.IsBlob).Select(item => item.Blob))
                {
                    Console.WriteLine($"{spaces}  {blob.Name}");
                }

                // recurse into each virtual directory (prefix) below this level
                var prefixes = page.Values.Where(item => item.IsPrefix).Select(item => item.Prefix);
                foreach (var p in prefixes)
                {
                    await ListBlobsForPrefixRecursive(container, p, level + 1);
                }
            }
        }
    }
}
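Note that the hierarchical listing above issues one service call per virtual directory. If you only need to reconstruct the structure client-side, a rough alternative sketch (assuming the same Azure.Storage.Blobs 12.x BlobContainerClient, plus System.Collections.Generic for the HashSet) is a single flat listing where you split each blob name on '/' yourself:

// Hedged sketch: one flat listing, virtual directories derived client-side
// from the blob names instead of from delimiter-based prefixes.
public static async Task ListFlatAndDeriveFolders(BlobContainerClient container)
{
    var folders = new HashSet<string>();
    await foreach (BlobItem blob in container.GetBlobsAsync())
    {
        int lastSlash = blob.Name.LastIndexOf('/');
        if (lastSlash >= 0)
        {
            folders.Add(blob.Name.Substring(0, lastSlash)); // e.g. "folder1/subfolder1"
        }
        Console.WriteLine(blob.Name);
    }
    foreach (string folder in folders)
    {
        Console.WriteLine($"virtual directory: {folder}");
    }
}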

Vs.Net C# Azure file storage fails to add File for existing File Share

VS.NET C# fails to create a file on Azure File Storage for an existing File Share.
I'm using the Microsoft.WindowsAzure.Storage library to access the Azure File Storage API. My method creates a File Share and uploads a file. It works when the File Share gets created, but it skips the file upload when the File Share already exists.
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;

public void SaveText(string fileName)
{
    string accountName = "mylogs";
    string key = @"dvjdjhsvdjfhvsjhdvfjhsvdfjhC2g==";
    var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, key), true);
    var share = storageAccount.CreateCloudFileClient().GetShareReference("test");
    share.CreateIfNotExistsAsync().Wait();
    var root = share.GetRootDirectoryReference();
    root.GetFileReference(fileName).UploadTextAsync("mytext").Wait();
}
First SaveText(file1) call works fine, Share & "file1" got created.
Second SaveText(file2) call, no errors, no "file2" created.
Same user, same app.
I'm using the NuGet package WindowsAzure.Storage, version 9.3.3, with a console project (not .NET Core), and it works fine.
Sample code as below (just using yours):
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using System;

namespace AzureFileTest
{
    class Program
    {
        static void Main(string[] args)
        {
            Program p = new Program();
            p.SaveText("file1"); // in the first call, file1 is created and the text uploads
            p.SaveText("file2"); // in the second call, file2 is created and the text uploads
            Console.WriteLine("done now");
            Console.ReadLine();
        }

        public void SaveText(string fileName)
        {
            string accountName = "xxxxx";
            string key = "xxxxxx";
            var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, key), true);
            var share = storageAccount.CreateCloudFileClient().GetShareReference("test");
            share.CreateIfNotExistsAsync().Wait();
            var root = share.GetRootDirectoryReference();
            root.GetFileReference(fileName).UploadTextAsync("mytext").Wait();
        }
    }
}
Please let me know if there are any more issues, or any differences between the codes.

Get TFS information for locally mapped file or directory via WebApi

I want to use the Teamfoundation.SourceControl.WebApi to check for updates or local changes against our TFS Source Control.
I can gather information about changesets for an item which is committed to TFS, but I am not able to gather this information based on a local file path inside my mapped workspace.
Is it somehow possible without using the ExtendedClient?
I want something like this:
TfvcChangesetSearchCriteria tcsc = new TfvcChangesetSearchCriteria();
tcsc.ItemPath = @"c:\source\mappedtfs\MYPROJECT\src\MainWindow.cs"; /* <-- a local path would be nice here */
List<TfvcChangesetRef> changerefs = tfvcHttpClient.GetChangesetsAsync("MYPROJECT", null, null, null, null, tcsc).Result;
Microsoft.Teamfoundation.SourceControl.WebApi is a web API which does not interact with local workspaces and files. If you want to get changesets from a local item's path, use Microsoft.TeamFoundation.VersionControl.Client from the Client Library.
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.SourceControl.WebApi;
using Microsoft.TeamFoundation.VersionControl.Client;
using System;
using System.Collections.Generic;

namespace ConsoleX
{
    class Program
    {
        static void Main(string[] args)
        {
            Uri url = new Uri("https://tfsuri");
            TfsTeamProjectCollection ttpc = new TfsTeamProjectCollection(url);
            VersionControlServer vcs = ttpc.GetService<VersionControlServer>();

            // the path here can be either a local path or a server path
            IEnumerable<Changeset> cses = vcs.QueryHistory("Path here could be local path or server path", RecursionType.Full);
            foreach (Changeset cs in cses)
            {
                Console.WriteLine(cs.ChangesetId);
                Console.WriteLine(cs.Comment);
            }
            Console.ReadLine();
        }
    }
}
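If you also need to resolve which server path a local file maps to, or to see uncommitted local changes, the same client OM exposes that through the cached workspace information. A hedged sketch, assuming the local path lies inside a workspace cached on the machine and reusing the ttpc collection from above:

// Hedged sketch: map a local path to its workspace/server path and list pending changes.
string localPath = @"c:\source\mappedtfs\MYPROJECT\src\MainWindow.cs";
WorkspaceInfo info = Workstation.Current.GetLocalWorkspaceInfo(localPath);
Workspace workspace = info.GetWorkspace(ttpc);
string serverPath = workspace.GetServerItemForLocalItem(localPath);
Console.WriteLine(serverPath);

// uncommitted (pending) changes for that item
foreach (PendingChange pc in workspace.GetPendingChanges(localPath))
{
    Console.WriteLine(pc.ChangeTypeName + " " + pc.LocalItem);
}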

Creating folder in VSO via VSO API

I have been trying to figure out how it is possible to create a query folder via the VSO API, but I always get the "Method not allowed" message.
I'm using the Microsoft.TeamFoundationServer.Client package to connect to VSO. This page says that this library is the one I need. I can query data, but it seems something is missing to create data. This library fits my case because I have a WebApi which manages the communication to the VSO API.
Here is my code:
public QueryHierarchyItem CreateFolderAsync(string folderName)
{
    QueryHierarchyItem newFolder = new QueryHierarchyItem()
    {
        Name = folderName,
        IsFolder = true,
        //Path = "Queries/Shared Queries/" + folderName,
        IsPublic = true
    };
    QueryHierarchyItem item = witClient.CreateQueryAsync(newFolder, _projectName, null).Result;
    return item;
}
I have tried to play with the Path property but it did not help.
I have checked the user rights. My user is a member of "Project Administrators", and the rights to manage query folders are also set up (click the chevron next to the "Shared Queries" folder -> select "Security"), both as a group and as a single user. It did not help.
I use a free account. The strange thing is that I have logged in with the same user from Visual Studio and I can manage the folders there. Is this functionality available for free accounts?
You can refer to this blog from MSDN for details: http://blogs.msdn.com/b/team_foundation/archive/2010/06/16/work-item-tracking-queries-object-model-in-2010.aspx
Quote the code here:
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

namespace QueryAPI
{
    class Program
    {
        private static Project myproject = null;

        public static QueryFolder GetMyQueriesFolder()
        {
            foreach (QueryFolder folder in myproject.QueryHierarchy)
            {
                if (folder.IsPersonal == true)
                    return folder;
            }
            throw new Exception("Cannot find the My Queries folder");
        }

        public static QueryFolder AddNewFolder(string folderName)
        {
            QueryFolder folder = new QueryFolder(folderName, GetMyQueriesFolder());
            myproject.QueryHierarchy.Save();
            return folder;
        }

        static void Main(string[] args)
        {
            TfsTeamProjectCollection coll = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("Your TFS Server URI"));
            WorkItemStore store = new WorkItemStore(coll);
            myproject = store.Projects["Your project name"];
            QueryFolder myNewfolder = AddNewFolder("Your folder name");
        }
    }
}
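As an aside, if you would rather stay on the REST client (Microsoft.TeamFoundationServer.Client) that the question already uses: the third argument of CreateQueryAsync is the parent path under which the item is created, so passing the parent folder ("Shared Queries" or "My Queries") instead of null may be what is missing. A hedged sketch, reusing witClient and _projectName from the question:

// Hedged sketch: create the folder under "Shared Queries" via the REST client.
QueryHierarchyItem newFolder = new QueryHierarchyItem()
{
    Name = folderName,
    IsFolder = true,
    IsPublic = true
};
QueryHierarchyItem item = witClient.CreateQueryAsync(newFolder, _projectName, "Shared Queries").Result;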

C#/.NET Server Path to default/index page

In my attempt to further future-proof a project I am trying to find the best way to retrieve the full path and filename of the index/default page in a web directory using C# and without knowing the web server's list of filename possibilities.
'Server.MapPath("/test/")' gives me 'C:\www\test\'
...so does: 'Server.MapPath(Page.ResolveUrl("/test/"))'
...but I need 'C:\www\test\index.html'.
Does anyone know of an existing method of retrieving the filename that the webserver will serve up when someone browses to that directory - be it default.aspx, or index.html, or whatever?
Thanks for any help,
fodder
ASP.NET has no knowledge of this. You would need to query IIS for the default document list.
The reason for this is that IIS will look in your web folder for the first matching file in the IIS default document list and then hand off to the matching ISAPI extension for that file type (by extension) in the script mappings.
To obtain the default document list you can do the following (using the Default Web Site as an example, where the IIS site number = 1):
using System;
using System.DirectoryServices;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            using (DirectoryEntry w3svc =
                new DirectoryEntry("IIS://Localhost/W3SVC/1/root"))
            {
                string[] defaultDocs =
                    w3svc.Properties["DefaultDoc"].Value.ToString().Split(',');
            }
        }
    }
}
It would then be a case of iterating the defaultDocs array to see which file exists in the folder, the first match is the default document. For example:
// Call me using: string doc = GetDefaultDocument("/");
// (requires System.IO and System.DirectoryServices, and runs inside a page/handler so Server.MapPath is available)
public string GetDefaultDocument(string serverPath)
{
    using (DirectoryEntry w3svc =
        new DirectoryEntry("IIS://Localhost/W3SVC/1/root"))
    {
        string[] defaultDocs =
            w3svc.Properties["DefaultDoc"].Value.ToString().Split(',');

        string path = Server.MapPath(serverPath);
        foreach (string docName in defaultDocs)
        {
            if (File.Exists(Path.Combine(path, docName)))
            {
                Console.WriteLine("Default Doc is: " + docName);
                return docName;
            }
        }

        // No matching default document found
        return null;
    }
}
Sadly this won't work if you're in a partial trust ASP.NET environment (for example shared hosting).
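As an alternative on IIS 7 or later, the same list can also be read through Microsoft.Web.Administration rather than the IIS:// metabase path used above. A rough sketch, assuming the code runs with enough rights to read the server configuration (so again not suitable for partial trust), for a site named "Default Web Site":

// Hedged sketch: read the defaultDocument list via Microsoft.Web.Administration (IIS 7+).
using (ServerManager serverManager = new ServerManager())
{
    var config = serverManager.GetWebConfiguration("Default Web Site");
    var defaultDocSection = config.GetSection("system.webServer/defaultDocument");
    foreach (var file in defaultDocSection.GetCollection("files"))
    {
        Console.WriteLine(file["value"]); // e.g. Default.htm, Default.aspx, index.html
    }
}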
