I have a container called systemdesign which has several subfolders, for example dfd, usecase and other UML design tool names. I want to show the images of a particular "folder", for example dfd, rather than all the images found inside the container.
Below are some screenshots and partial code that revolves around this. Please do not mind the nature of the images; they are just test data.
http://i.imgur.com/fVs1SZk.png [Shows everything in the container rather than a single folder]
EDIT: For example, the dfd folder should show just one picture, and the second folder another three.
http://i.imgur.com/kMyBLca.png [How my "directories" are sectioned]
The Code that affects the above:
SystemDesignController
// GET: SystemDesign
public ActionResult Index()
{
    StorageCredentials credentials = new StorageCredentials(storagename, accountkey);
    CloudStorageAccount storageAccount = new CloudStorageAccount(credentials, true);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer storageContainer = blobClient.GetContainerReference("systemdesign");
    Models.SystemDesignModel blobsList = new Models.SystemDesignModel(storageContainer.ListBlobs(useFlatBlobListing: true));
    return View(blobsList);
}
SystemDesignModel.cs // This class holds the model and ultimately is used in the View to show the images
public class SystemDesignModel
{
    public SystemDesignModel() : this(null)
    {
    }

    public SystemDesignModel(IEnumerable<IListBlobItem> list)
    {
        Files = new List<SystemDesign>();
        if (list != null && list.Count() > 0)
        {
            foreach (var item in list)
            {
                SystemDesign info = SystemDesign.CreateImageFromIListBlob(item);
                if (info != null)
                {
                    Files.Add(info);
                }
            }
        }
    }

    public List<SystemDesign> Files { get; set; }
}
index.cshtml (partial code for the view that affects this)
@foreach (var item in Model.Files)
{
    <img src="@item.URL" height="128" width="128" />
}
What I have tried so far:
1) I tried changing CloudBlobContainer storageContainer = blobClient.GetContainerReference("systemdesign"); from systemdesign to dfd, but it gave me a 404 and a StorageException at the if condition in SystemDesignModel.cs.
2) I tried useFlatBlobListing: false, but it output nothing.
Any idea how I can output just one folder according to the section I want?
Thanks
When listing blobs in a container, you can pass a blob prefix which is essentially the name of the folder (dfd, usecase etc. in your example). Then you will only see the blobs in that folder. Here's the link to the documentation: https://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storage.blob.cloudblobcontainer.listblobs.aspx.
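For example, a minimal sketch based on the controller code above (the folder name dfd/ is passed as the prefix; swap in whichever folder you want to show):

// List only the blobs under the "dfd" virtual folder of the "systemdesign" container.
CloudBlobContainer storageContainer = blobClient.GetContainerReference("systemdesign");
var dfdBlobs = storageContainer.ListBlobs(prefix: "dfd/", useFlatBlobListing: true);
Models.SystemDesignModel blobsList = new Models.SystemDesignModel(dfdBlobs);
return View(blobsList);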
I am trying to create a microservice in C# which will accept a CSV file containing order numbers, digest the CSV, connect to SharePoint, create a new folder on SharePoint, and then copy contracts with names corresponding to the order numbers from wherever they may be (and they probably won't all be in the same place) to the new folder.
At this point, with help from Stack Overflow, I can successfully get an authentication token from our SharePoint using a CSOM AuthenticationManager. Now I am trying to figure out how to create the folder. Googling for information on creating SharePoint folders keeps bringing up the topic of lists, which I don't know anything about, and I don't even know whether I really need to, or whether there might be a different way that works with the site, as that's what I'm actually interested in.
So, let's say I have a sharepoint site at https://example.sharepoint.com/sites/MySite.
How can I simply create a folder called "Foo" within a folder called "Bar" which exists in "Shared Documents"?
If I need to know something about lists in order to do this, can I use C# to find the correct list? Or do I need to chase my administrator for additional information?
Assuming the AuthenticationManager returns a valid context and the root folder already exists, the following works:
using Microsoft.SharePoint.Client;
using System;
using System.Collections.Generic;
using System.Linq;
using AuthenticationManager = SharePointOrderContractExtractor.Clients.AuthenticationManager;

namespace SharePointOrderContractExtractor.Clients
{
    public class FolderManager
    {
        private readonly AuthenticationManager _authenticationManager;

        public FolderManager(AuthenticationManager sharepointAuthenticationManager)
        {
            _authenticationManager = sharepointAuthenticationManager;
        }

        internal Folder EnsureAndGetTargetFolder(string folderPath)
        {
            using ClientContext context = _authenticationManager.GetContext();
            List<string> folderNames = folderPath.Split("/").ToList();
            List documents = context.Web.Lists.GetByTitle(folderNames[0]);
            folderNames.RemoveAt(0);
            return EnsureAndGetTargetFolder(context, documents, folderNames);
        }

        private Folder EnsureAndGetTargetFolder(ClientContext context, List list, List<string> folderPath)
        {
            Folder returnFolder = list.RootFolder;
            return (folderPath != null && folderPath.Count > 0)
                ? EnsureAndGetTargetSubfolder(context, list, folderPath)
                : returnFolder;
        }

        private Folder EnsureAndGetTargetSubfolder(ClientContext context, List list, List<string> folderPath)
        {
            Web web = context.Web;
            Folder currentFolder = list.RootFolder;
            context.Load(web, t => t.Url);
            context.Load(currentFolder);
            context.ExecuteQuery();
            foreach (string folderPointer in folderPath)
            {
                currentFolder = FindOrCreateFolder(context, list, currentFolder, folderPointer);
            }
            return currentFolder;
        }

        private Folder FindOrCreateFolder(ClientContext context, List list, Folder currentFolder, string folderPointer)
        {
            FolderCollection folders = currentFolder.Folders;
            context.Load(folders);
            context.ExecuteQuery();
            foreach (Folder existingFolder in folders)
            {
                if (existingFolder.Name.Equals(folderPointer, StringComparison.InvariantCultureIgnoreCase))
                {
                    return existingFolder;
                }
            }
            return CreateFolder(context, list, currentFolder, folderPointer);
        }

        private Folder CreateFolder(ClientContext context, List list, Folder currentFolder, string folderPointer)
        {
            ListItemCreationInformation itemCreationInfo = new ListItemCreationInformation
            {
                UnderlyingObjectType = FileSystemObjectType.Folder,
                LeafName = folderPointer,
                FolderUrl = currentFolder.ServerRelativeUrl
            };
            ListItem folderItemCreated = list.AddItem(itemCreationInfo);
            folderItemCreated.Update();
            context.Load(folderItemCreated, f => f.Folder);
            context.ExecuteQuery();
            return folderItemCreated.Folder;
        }
    }
}
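For example, a hypothetical usage from the same assembly (EnsureAndGetTargetFolder is internal), assuming authenticationManager is your existing AuthenticationManager and that the document library's title really is "Shared Documents" (on some sites the title is just "Documents"):

// Ensures "Shared Documents/Bar/Foo" exists, creating Bar and Foo as needed.
var folderManager = new FolderManager(authenticationManager);
Folder target = folderManager.EnsureAndGetTargetFolder("Shared Documents/Bar/Foo");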
I am trying to retrieve all image files from a virtual directory in my Azure storage account. The path of the folder (the container URI) is correct, but the call returns
StorageException: The requested URI does not represent any resource on the server.
Pasting the URI in the browser produces
BlobNotFound
The specified blob does not exist. RequestId:622f4500-a01e-0022-7dd0-7d9428000000 Time:2019-10-08T12:02:29.6389180Z
The URI, which is public, works fine; you can see this video by pasting the URI in your browser or clicking the Engine Video link below.
Engine Video
My code grabs the container, whose URI is https://batlgroupimages.blob.core.windows.net/enerteck/publicfiles/images/robson
public async Task<List<string>> GetBlobFileListAsync(CloudBlobContainer blobContainer, string customer)
{
    var files = new List<string>();
    BlobContinuationToken blobContinuationToken = null;
    do
    {
        // code fails on the line below
        var segments = await blobContainer.ListBlobsSegmentedAsync(null, blobContinuationToken);
        blobContinuationToken = segments.ContinuationToken;
        files.AddRange(segments.Results.Select(x => GetFileNameFromBlobUri(x.Uri, customer)));
    } while (blobContinuationToken != null);
    return files;
}
The code is failing on the var segments = await blobContainer… line,
and it is not the container that is causing the error (IMO), as you can see the container comes back with a valid URI
and the virtual folder contains files
I would love to know what I am doing wrong here.
https://batlgroupimages.blob.core.windows.net/enerteck/publicfiles/images/robson is not a container URI.
https://batlgroupimages.blob.core.windows.net/enerteck is a container URI.
publicfiles/images/robson/image.png could be a blob's name in that container.
I'm thinking you may have included some of the virtual folder path in the container URI and maybe that is messing up something?
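A minimal sketch of that split, using just the names from the URI above (blobClient here is an assumed CloudBlobClient for the same storage account):

// Reference the container by its name only, then pass the virtual folder
// path as the listing prefix rather than baking it into the container URI.
CloudBlobContainer container = blobClient.GetContainerReference("enerteck");
BlobContinuationToken token = null;
var segment = await container.ListBlobsSegmentedAsync("publicfiles/images/robson/", token);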
So the answer lies in the fact that I was trying to list blobs WITHIN a virtual folder path.
In the OP, I was trying to list the blobs using the FULL path to the folder containing the blobs. You cannot do it that way. You must use one of the overloads of ListBlobsSegmentedAsync on the MAIN CONTAINER. Thanks to juunas for getting me to realize I wasn't starting at the main container. I realized there must be other overloads to accomplish what I was seeking to do.
The code below works well
public async Task<List<EvaluationImage>> GetImagesFromVirtualFolder(CloudBlobContainer blobContainer, string customer)
{
    var images = new List<EvaluationImage>();
    BlobContinuationToken blobContinuationToken = null;
    do
    {
        // This is the overload to use: pass the full virtual path from the main
        // container down to the files as the prefix, set useFlatBlobListing to
        // true (2nd parameter), pick a BlobListingDetails value, a maximum number
        // of blobs (I chose 100), then the continuation token; the last two
        // parameters can be null.
        var results = await blobContainer.ListBlobsSegmentedAsync(
            "publicfiles/images/" + customer, true, BlobListingDetails.All, 100,
            blobContinuationToken, null, null);

        // Get the value of the continuation token returned by the listing call.
        blobContinuationToken = results.ContinuationToken;

        foreach (var item in results.Results)
        {
            var filename = GetFileNameFromBlobUri(item.Uri, customer);
            var img = new EvaluationImage
            {
                ImageUrl = item.Uri.ToString(),
                ImageCaption = GetCaptionFromFilename(filename),
                IsPosterImage = filename.Contains("poster")
            };
            images.Add(img);
        }
    } while (blobContinuationToken != null);
    return images;
}
I recently asked a question here, and thanks to Gaurav Mantri I can now add metadata to an Azure blob.
My code after editing, in the AzureBlobStorage class:
public void SaveMetaData(string fileName, string container, string key, string value)
{
    var blob = GetBlobReference(fileName, container);
    blob.FetchAttributes();
    blob.Metadata.Add(key, value);
    blob.SetMetadata();
}
And I call it from my controller like this:
public JsonResult SaveMetaData(string name, string key, int id)
{
    var uploadedFils = _FileStorage.GetUploadedFiles("images", id + "/");
    if (!uploadedFils.Any())
        _FileStorage.SaveMetaData(name, "images", key, "true");

    foreach (var file in uploadedFils)
    {
        if (name == file.Name)
        {
            _FileStorage.SaveMetaData(FormatFileName(id, name), "images", key, "true");
        }
        else
        {
            _FileStorage.SaveMetaData(FormatFileName(id, file.Name), "images", key, "false");
        }
    }
    return Json("");
}
The code to get the uploaded files:
public IEnumerable<Attachment> GetUploadedFiles(string container, string blobprefix)
{
    if (string.IsNullOrWhiteSpace(container))
        container = DefaultBlobContainer;

    var storageAccount = CreateStorageAccountFromConnectionString(GetStorageConnectionString());
    var blobContainer = GetBlobContainer(storageAccount, container);
    var resultList = new List<Attachment>();
    try
    {
        foreach (IListBlobItem item in blobContainer.ListBlobs(blobprefix, false))
        {
            var blob = (CloudBlockBlob)item;
            var file = new Attachment
            {
                Name = blob.Name.Substring(blob.Name.LastIndexOf('/') + 1),
                Size = blob.Properties.Length,
                Extension = Path.GetExtension(blob.Name)
            };
            resultList.Add(file);
        }
    }
    catch (Exception e)
    {
        // Note: exceptions are currently swallowed here; consider logging them.
    }
    return resultList;
}
I call this action when I click on the image that I want to set as active.
It works the first time, but I don't know how to change it so it also works on a second click; this is my first time dealing with Azure.
The logic behind the line below is: when the gallery is empty and the user uploads the first image, that image is automatically set to active:
if (!uploadedFils.Any())
    _FileStorage.SaveMetaData(name, "images", key, "true");
Based on your description, I checked your code; you need to modify it as follows:
SaveMetaData method under your AzureBlobStorage class:
public void SaveMetaData(string fileName, string container, string key, string value)
{
    var blob = GetBlobReference(fileName, container);
    blob.FetchAttributes();
    if (blob.Metadata.ContainsKey(key))
    {
        blob.Metadata[key] = value;
    }
    else
    {
        blob.Metadata.Add(key, value);
    }
    blob.SetMetadata();
}
Based on your scenario, your image files would be uploaded to images/{id}/{filename}. Before you invoke SaveMetaData in your controller, you need to make sure the file identified by the parameters name and id exists in your blob storage. I assume that you need to remove the following code snippet:
if (!uploadedFils.Any())
    _FileStorage.SaveMetaData(name, "images", key, "true");
Note: if there are no files, you cannot add/update metadata for them. Also, you pass only name for the fileName parameter without combining it with the id. As I understand it, the SaveMetaData method is meant to set metadata on existing files. I recommend you move the above logic to the action that uploads the file and set the default metadata there when no files exist yet.
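A rough sketch of that idea (UploadFile is an assumed helper on the _FileStorage wrapper, "active" is a hypothetical metadata key, and the other names follow the question's code):

// Inside the upload action: mark the very first image of a gallery as active
// right after it has been uploaded, so metadata is only ever set on an existing blob.
public JsonResult Upload(HttpPostedFileBase file, int id)
{
    bool isFirstImage = !_FileStorage.GetUploadedFiles("images", id + "/").Any();

    _FileStorage.UploadFile(file, "images", FormatFileName(id, file.FileName)); // assumed helper

    _FileStorage.SaveMetaData(FormatFileName(id, file.FileName), "images", "active",
                              isFirstImage ? "true" : "false");
    return Json("");
}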
Background: I'm using the HTML 5 Offline App Cache and dynamically building the manifest file. Basically, the manifest file needs to list each of the static files that your page will request. Works great when the files are actually static, but I'm using Bundling and Minification in System.Web.Optimization, so my files are not static.
When the DEBUG symbol is defined (i.e. debugging in VS), the actual physical files are referenced from the MVC view. However, in Release mode it calls a virtual file that could look something like this: /bundles/scripts/jquery?v=FVs3ACwOLIVInrAl5sdzR2jrCDmVOWFbZMY6g6Q0ulE1
So my question: How can I get that URL in the code to add it to the offline app manifest?
I've tried:
var paths = new List<string>()
{
    "~/bundles/styles/common",
    "~/bundles/styles/common1024",
    "~/bundles/styles/common768",
    "~/bundles/styles/common480",
    "~/bundles/styles/frontend",
    "~/bundles/scripts/jquery",
    "~/bundles/scripts/common",
    "~/bundles/scripts/frontend"
};

var bundleTable = BundleTable.Bundles;
foreach (var bundle in bundleTable.Where(b => paths.Contains(b.Path)))
{
    var bundleContext = new BundleContext(this.HttpContext, bundleTable, bundle.Path);
    IEnumerable<BundleFile> files = bundle.GenerateBundleResponse(bundleContext).Files;
    foreach (var file in files)
    {
        var filePath = file.IncludedVirtualPath.TrimStart(new[] { '~' });
        sb.AppendFormat(formatFullDomain, filePath);
    }
}
I also tried replacing GenerateBundleResponse() with EnumerateFiles(), but it just always returns the original file paths.
I'm open to alternative implementation suggestions as well. Thanks.
UPDATE: (7/7/14 13:45)
In addition to the answer below, I also added this bundles Registry class to keep a list of the required static files so that it works in debug mode in all browsers. (See comments below.)
public class Registry
{
    public bool Debug = false;

    public Registry()
    {
        SetDebug();
    }

    [Conditional("DEBUG")]
    private void SetDebug()
    {
        Debug = true;
    }

    public IEnumerable<string> CommonScripts
    {
        get
        {
            if (Debug)
            {
                return new string[]
                {
                    "/scripts/common/jquery.validate.js",
                    "/scripts/common/jquery.validate.unobtrusive.js",
                    "/scripts/common/knockout-3.1.0.debug.js",
                    "/scripts/common/jquery.timepicker.js",
                    "/scripts/common/datepicker.js",
                    "/scripts/common/utils.js",
                    "/scripts/common/jquery.minicolors.js",
                    "/scripts/common/chosen.jquery.custom.js"
                };
            }
            else
            {
                return new string[]
                {
                    "/scripts/common/commonbundle.js"
                };
            }
        }
    }
}
I'm by no means happy with this solution. Please make suggestions if you can improve on this.
I can suggest an alternative from this blog post: create your own token.
In summary, the author suggests using Web Essentials to create the bundled file and then creating a Razor helper to generate the token, in this case based on the last-changed date and time.
public static class StaticFile
{
    public static string Version(string rootRelativePath)
    {
        if (HttpRuntime.Cache[rootRelativePath] == null)
        {
            var absolutePath = HostingEnvironment.MapPath(rootRelativePath);
            var lastChangedDateTime = File.GetLastWriteTime(absolutePath);

            if (rootRelativePath.StartsWith("~"))
            {
                rootRelativePath = rootRelativePath.Substring(1);
            }

            var versionedUrl = rootRelativePath + "?v=" + lastChangedDateTime.Ticks;
            HttpRuntime.Cache.Insert(rootRelativePath, versionedUrl, new CacheDependency(absolutePath));
        }
        return HttpRuntime.Cache[rootRelativePath] as string;
    }
}
Then you can reference the bundled file like so...
@section scripts {
    <script src="@StaticFile.Version("~/Scripts/app/myAppBundle.min.js")"></script>
}
Then you have control of the token and can do what you want with it.
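If you would rather keep the System.Web.Optimization bundles from the question, a rough sketch of another option (assuming BundleCollection.ResolveBundleUrl is available in your version of the library; paths and sb are the variables from the question's snippet):

// For each registered bundle path, append its fingerprinted URL
// (e.g. /bundles/scripts/jquery?v=FVs3...) to the manifest output.
foreach (var path in paths)
{
    string versionedUrl = BundleTable.Bundles.ResolveBundleUrl(path); // null if no bundle matches
    if (versionedUrl != null)
    {
        sb.AppendLine(versionedUrl);
    }
}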
On my FTP Server I have the following folder structure
- Parent Directory
    - a.txt
    - b.txt.old
    - SubDirectory1
        - c.txt
        - NestedSubDirectory1
            - d.txt
    - SubDirectory2
        - e.txt
        - f.txt.old
The number of subdirectories is not fixed. I need a way to get all the files (they can be in any format) without the .old extension from the Parent Directory.
I'm currently using the third-party DLL edtFTPnet.
ftpConnection.GetFileInfos().Where(f => !f.Name.EndsWith(".old")).ToList();
This helps me get the details of the files and folders at the current working directory level.
Can someone tell me a way to get all the files within the parent directory, subdirectories and nested subdirectories?
The solution may or may not use edtFTPnet.
FTPConnection.GetFileInfos() returns an array of FTPFile. The FTPFile class has a Boolean property Dir which indicates whether the entry is a directory (true) or a file (false).
Something like this should work:
void ReadSubDirectories(FTPConnection connection, FTPFile[] files)
{
    foreach (var file in files)
    {
        if (file.Dir)
        {
            // Save parent directory
            var curDir = connection.ServerDirectory;

            // Move into directory
            connection.ChangeWorkingDirectory(file.Name);

            // Read all files
            ReadSubDirectories(connection, connection.GetFileInfos());

            // Move back into parent directory
            connection.ChangeWorkingDirectory(curDir);
        }
        else
        {
            // Do magic with your files
        }
    }
}
However, you might be better off using just .NET's built-in FtpWebRequest class, since its methods and naming conventions are clearer, it's better documented, and it's easier to find references online.
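A rough sketch with FtpWebRequest (this is not part of the edtFTPnet code above; the URL, the credentials and the Unix-style listing parsing are all assumptions and may need adjusting for your server):

using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

class FtpLister
{
    // Recursively collect file URLs that do not end in ".old".
    // Assumes a Unix-style LIST output where lines starting with 'd' are directories.
    public static List<string> ListFiles(string baseUrl, NetworkCredential credentials)
    {
        var result = new List<string>();
        var request = (FtpWebRequest)WebRequest.Create(baseUrl);
        request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
        request.Credentials = credentials;

        var lines = new List<string>();
        using (var response = (FtpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                lines.Add(line);
        }

        foreach (var line in lines)
        {
            // The last whitespace-separated token is the entry name (Unix listings).
            var name = line.Substring(line.LastIndexOf(' ') + 1);
            if (name == "." || name == "..")
                continue;

            if (line.StartsWith("d"))
                result.AddRange(ListFiles(baseUrl.TrimEnd('/') + "/" + name, credentials));
            else if (!name.EndsWith(".old", StringComparison.OrdinalIgnoreCase))
                result.Add(baseUrl.TrimEnd('/') + "/" + name);
        }
        return result;
    }
}

Usage would be something like ListFiles("ftp://myserver/ParentDirectory", new NetworkCredential("user", "password")).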
Try using an extension method like this:
class Program
{
    static void Main(string[] args)
    {
        using (var connection = new FTPConnection
        {
            ServerAddress = "127.0.0.1",
            UserName = "Admin",
            Password = "1",
        })
        {
            connection.Connect();
            connection.ServerDirectory = "/recursive_folder";

            var resultRecursive =
                connection.GetFileInfosRecursive().Where(f => !f.Name.EndsWith(".old")).ToList();
            var resultDefault =
                connection.GetFileInfos().Where(f => !f.Name.EndsWith(".old")).ToList();
        }
    }
}
public static class FtpClientExtensions
{
    public static FTPFile[] GetFileInfosRecursive(this FTPConnection connection)
    {
        var resultList = new List<FTPFile>();
        var fileInfos = connection.GetFileInfos();
        resultList.AddRange(fileInfos);

        foreach (var fileInfo in fileInfos)
        {
            if (fileInfo.Dir)
            {
                // Descend into the subdirectory and collect its files as well.
                connection.ServerDirectory = fileInfo.Path;
                resultList.AddRange(connection.GetFileInfosRecursive());
            }
        }
        return resultList.ToArray();
    }
}