Delete Empty S3 Folder .Net SDK - c#

I am using .NET Core along with Amazon's .NET SDK to push and pull things from S3. My folder structure in S3 involves creating an empty directory with several subdirectories.
At a later time I insert files into those directories and move them around. Now I need to be able to remove a directory entirely.
I am able to delete all of the contents of the directory by using
await client.DeleteObjectAsync(bucketName, keyName, null).ConfigureAwait(false);
where I loop through all the files I want to delete in the given bucket. However, it always leaves me with the empty folder structure; in S3 I can see that it has a content size of 0 bytes, but I don't want to have to sort through thousands of empty folders to find the ones that actually have data.
Is there any way to remove an empty folder from S3 using AWS .NET SDK?
Update:
I am able to delete everything in the folder I want except for the folder itself.
using (IAmazonS3 client = new AmazonS3Client(awsCreds, Amazon.RegionEndpoint.USEast1))
{
    try
    {
        DeleteObjectsRequest deleteRequest = new DeleteObjectsRequest();
        ListObjectsRequest listRequest = new ListObjectsRequest
        {
            BucketName = bucketName,
            Prefix = prefix,
            //Marker = prefix,
        };
        ListObjectsResponse response = await client.ListObjectsAsync(listRequest).ConfigureAwait(false);
        // Process response
        foreach (S3Object entry in response.S3Objects)
        {
            deleteRequest.AddKey(entry.Key);
        }
        deleteRequest.BucketName = bucketName;
        var response2 = await client.DeleteObjectsAsync(deleteRequest).ConfigureAwait(false);
        return true;
    }
    catch (AmazonS3Exception amazonS3Exception)
    {
        if (amazonS3Exception.ErrorCode != null
            && (amazonS3Exception.ErrorCode.Equals("InvalidAccessKeyId", StringComparison.Ordinal)
                || amazonS3Exception.ErrorCode.Equals("InvalidSecurity", StringComparison.Ordinal)))
        {
            logger.LogError("AwsS3Service.DeleteFileFromBucket Error - Check the provided AWS Credentials.");
        }
        else
        {
            logger.LogError($"AwsS3Service.DeleteFileFromBucket Error - Message: {amazonS3Exception.Message}");
        }
    }
}
This deletes the entire contents of the directory I choose, along with all subdirectories. But the main directory remains. Is there any way to remove that main directory?

Your code is 99% of the way there. The only thing you need to do is add the prefix itself to the keys to be deleted: the folder is technically a 0-byte object whose key is the prefix, and it needs to be deleted as well.
For example, after your loop through all the objects in the response, add the prefix that you used to find all those objects:
foreach (S3Object entry in response.S3Objects)
{
    deleteRequest.AddKey(entry.Key);
}
// Add the folder itself to be deleted as well
deleteRequest.AddKey(prefix);
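For completeness, here is a minimal sketch of the whole operation that also handles folders with more than 1,000 objects (ListObjects returns at most 1,000 keys per call, so larger folders need paging). It assumes an SDK version that includes ListObjectsV2; the method name DeleteFolderAsync is illustrative:
public static async Task DeleteFolderAsync(IAmazonS3 client, string bucketName, string prefix)
{
    var listRequest = new ListObjectsV2Request
    {
        BucketName = bucketName,
        Prefix = prefix
    };
    ListObjectsV2Response listResponse;
    do
    {
        // Page through all keys under the prefix, 1,000 at a time.
        listResponse = await client.ListObjectsV2Async(listRequest).ConfigureAwait(false);
        if (listResponse.S3Objects.Count > 0)
        {
            var deleteRequest = new DeleteObjectsRequest { BucketName = bucketName };
            foreach (S3Object entry in listResponse.S3Objects)
            {
                deleteRequest.AddKey(entry.Key);
            }
            await client.DeleteObjectsAsync(deleteRequest).ConfigureAwait(false);
        }
        listRequest.ContinuationToken = listResponse.NextContinuationToken;
    } while (listResponse.IsTruncated);
    // Per the answer above: delete the 0-byte folder object itself in case the
    // listing did not return it (deleting a non-existent key is not an error in S3).
    await client.DeleteObjectAsync(bucketName, prefix).ConfigureAwait(false);
}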

Related

TFS - Get latest code in a folder

I am using the TFS API to get the latest code files, directories, .csproj files, etc. under a TFS-bound folder.
For that, I use something like the following:
var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(ConfigurationManager.AppSettings["TFSUrl"]));
tfs.EnsureAuthenticated();
var vsStore = tfs.GetService<VersionControlServer>();
string workingFolder = @"C:\TFS\SolutionFolder";
Workspace wsp = vsStore.TryGetWorkspace(workingFolder);
if (wsp != null)
{
    ItemSet items = vsStore.GetItems(workingFolder, VersionSpec.Latest, RecursionType.Full);
    string relativePath = workingFolder + @"/";
    foreach (Item item in items.Items)
    {
        string relativePath1 = item.ServerItem.Replace("$/TFS/SolutionFolder", relativePath);
        if (item.ItemType == ItemType.Folder)
        {
            Directory.CreateDirectory(relativePath1);
        }
        else
        {
            item.DownloadFile(relativePath1);
        }
    }
}
Now, I get the items and then download them. However, I want it to behave the way VS does: download a file/folder only if it has changed. With this code, I always get every file/folder under that folder and then overwrite them all. Wrong approach, I know. I could modify this code to check each folder's or file's last change time and then choose to either overwrite or ignore it, but that's a bad option.
What I would ideally like is to get ONLY the list of files/folders that actually need to be changed, i.e. the incremental change. After that, I can choose to overwrite/ignore each item in that list. So, in the present case, if a new file/folder is created (or one of the existing ones changed inside $/TFS/SolutionFolder, i.e. on the server), only then do I want that item pulled into the list of files/folders to change (and decide what I want to do with it inside C:\TFS\SolutionFolder).
Also, is using one of the overloads of VersionControlServer.QueryHistory() an option? I had something like this in mind:
(latestVersionIdOf $/TFS/SolutionFolder) - (existingVersionIdOf C:\TFS\SolutionFolder) = (versions that I'd go out and get back from the server, for that folder)
Any pointers will be very helpful. Thanks!
Just use Workspace.Get() or one of its overloads (wsp.Get()); it only downloads files that have been updated.
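A minimal sketch, assuming wsp is the workspace obtained from TryGetWorkspace above:
// Downloads only the items whose workspace version differs from the latest server version.
GetStatus status = wsp.Get(VersionSpec.Latest, GetOptions.None);
Console.WriteLine("Conflicts: {0}, failures: {1}", status.NumConflicts, status.NumFailures);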
I don't think we can achieve that. If the files are downloaded to a folder that is not under source control, there are no versions to compare within the folder; even if the folder is under source control, the behavior is still just a download with no version comparison. So it will download all the files every time and overwrite the same ones.
In VS, the files are all in the TFS source control system, so when we Get Latest Version, only the changed/added files are retrieved from TFS. If you want the same behavior as VS, you can use the tf get command. See Get Command.
You can reference this article on using the tf get command:
get-latest-version-of-specific-files-with-tfs-power-tools
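If you want to drive tf get from the same program, a rough sketch (assuming tf.exe is on the PATH and C:\TFS\SolutionFolder is a mapped workspace folder) could be:
var psi = new System.Diagnostics.ProcessStartInfo
{
    FileName = "tf.exe",
    // Downloads only changed/added items, like Get Latest Version in VS.
    Arguments = "get $/TFS/SolutionFolder /recursive",
    WorkingDirectory = @"C:\TFS\SolutionFolder",
    UseShellExecute = false
};
using (var process = System.Diagnostics.Process.Start(psi))
{
    process.WaitForExit();
}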
Update:
var tfs = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(ConfigurationManager.AppSettings["TFSUrl"]));
tfs.EnsureAuthenticated();
var vsStore = tfs.GetService<VersionControlServer>();
string workingFolder = ConfigurationManager.AppSettings["LocalPathToFolder"]; // C:\TFS\SolutionFolder
string tfsPathToFolder = ConfigurationManager.AppSettings["TFSPathToFolder"]; // $/TFS/SolutionFolder
Workspace wsp = vsStore.GetWorkspace(workingFolder);
if (wsp != null)
{
    ItemSpec[] specs = { new ItemSpec(tfsPathToFolder, RecursionType.Full) };
    ExtendedItem[][] extendedItems = wsp.GetExtendedItems(specs, DeletedState.NonDeleted, ItemType.Any);
    ExtendedItem[] extendedItem = extendedItems[0];
    var itemsToDownload = extendedItem.Where(itemToDownload => itemToDownload.IsLatest == false);
    foreach (var itemToDownload in itemsToDownload)
    {
        try
        {
            switch (itemToDownload.ItemType)
            {
                case ItemType.File:
                    if (itemToDownload.LocalItem != null)
                    {
                        vsStore.DownloadFile(itemToDownload.SourceServerItem, itemToDownload.LocalItem);
                    }
                    else
                    {
                        string localItemPath = itemToDownload.SourceServerItem.Replace(tfsPathToFolder, workingFolder);
                        vsStore.DownloadFile(itemToDownload.SourceServerItem, localItemPath);
                    }
                    break;
                case ItemType.Folder:
                    string folderName = itemToDownload.SourceServerItem.Replace(tfsPathToFolder, workingFolder);
                    if ((!string.IsNullOrEmpty(folderName)) && (!Directory.Exists(folderName)))
                    {
                        Directory.CreateDirectory(folderName);
                    }
                    break;
            }
        }
        catch (Exception e)
        {
            File.AppendAllText(@"C:\TempLocation\GetLatestExceptions.txt", e.Message);
        }
    }
}
This code works well, except:
a. Whenever it downloads the latest copy of, let's say, a file, it 'checks it out' in TFS :(
b. For some items, it throws errors like 'Item $/TFS/SolutionFolder/FolderX/abc.cs was not found in source control at version T.' - I have to find out what the exact cause of this issue is, though.
Any ideas on how to get around these two issues, or any other problems you see with this code? Thanks!

Directory.GetDirectories return empty string inside an async Task operation

I have a UWP application which captures and processes images from a camera. This project leverages the Microsoft Cognitive Services Face Recognition API, and I have been exploring the application's existing functionality for a while now. My goal is that when the image of a person is identified by the camera (through the Face Recognition API service), I show the associated image of that person.
The images are captured and stored in a local directory on my machine. I want to retrieve the image file and render it on the screen once the person is identified.
The code below shows the async Task method ProcessCameraCapture:
private async Task ProcessCameraCapture(ImageAnalyzer e)
{
    if (e == null)
    {
        this.UpdateUIForNoFacesDetected();
        this.isProcessingPhoto = false;
        return;
    }
    DateTime start = DateTime.Now;
    await e.DetectFacesAsync();
    if (e.DetectedFaces.Any())
    {
        string names;
        await e.IdentifyFacesAsync();
        this.greetingTextBlock.Text = this.GetGreettingFromFaces(e, out names);
        if (e.IdentifiedPersons.Any())
        {
            this.greetingTextBlock.Foreground = new SolidColorBrush(Windows.UI.Colors.GreenYellow);
            this.greetingSymbol.Foreground = new SolidColorBrush(Windows.UI.Colors.GreenYellow);
            this.greetingSymbol.Symbol = Symbol.Comment;
            GetSavedFilePhoto(names);
        }
        else
        {
            this.greetingTextBlock.Foreground = new SolidColorBrush(Windows.UI.Colors.Yellow);
            this.greetingSymbol.Foreground = new SolidColorBrush(Windows.UI.Colors.Yellow);
            this.greetingSymbol.Symbol = Symbol.View;
        }
    }
    else
    {
        this.UpdateUIForNoFacesDetected();
    }
    TimeSpan latency = DateTime.Now - start;
    this.faceLantencyDebugText.Text = string.Format("Face API latency: {0}ms", (int)latency.TotalMilliseconds);
    this.isProcessingPhoto = false;
}
In GetSavedFilePhoto, I pass the names string once the person is identified.
Code below for the GetSavedFilePhoto method:
private void GetSavedFilePhoto(string personName)
{
    if (string.IsNullOrWhiteSpace(personName)) return;
    var directoryPath = @"D:\PersonImages";
    var directories = Directory.GetDirectories(directoryPath);
    var filePaths = Directory.GetFiles(directoryPath, "*.jpg", SearchOption.AllDirectories);
}
However, in the GetSavedFilePhoto method, the directories variable comes back as an empty array when using the directoryPath string. "D:\PersonImages" is a valid, existing folder on my machine, and it contains subfolders with images inside. I also tried Directory.GetFiles to retrieve the jpg images, but it also returned an empty array.
I think it should work because I have used the Directory class several times, just not inside an async Task method. Does using async cause files not to be returned by I/O operations?
Sorry for this stupid question, but I really don't understand.
Any help is greatly appreciated.
Directory.GetFiles or Directory.GetDirectories can get the folders/files in the local folder of the application with the following code, but they cannot open D:\.
var directories = Directory.GetDirectories(ApplicationData.Current.LocalFolder.Path);
In a UWP app you can only access two locations by default (the local folder and the install folder); anything else needs a capability declaration or a file/folder picker. For details, see file access permissions.
If you need access to all files in D:\, the user must manually pick the D:\ drive using the FolderPicker; then you have permission to access the files on that drive.
var picker = new Windows.Storage.Pickers.FileOpenPicker();
picker.ViewMode = Windows.Storage.Pickers.PickerViewMode.Thumbnail;
picker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.ComputerFolder;
picker.FileTypeFilter.Add(".jpg");
picker.FileTypeFilter.Add(".jpeg");
picker.FileTypeFilter.Add(".png");
Windows.Storage.StorageFile file = await picker.PickSingleFileAsync();
if (file != null)
{
    // Application now has read/write access to the picked file
}
else
{
    // do some stuff
}
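To pick a whole folder instead of a single file, a minimal sketch using FolderPicker might look like this (the token name "PersonImagesToken" is illustrative); persisting the folder in the FutureAccessList lets the app reopen it in later sessions without asking again:
var folderPicker = new Windows.Storage.Pickers.FolderPicker();
folderPicker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.ComputerFolder;
folderPicker.FileTypeFilter.Add("*"); // required before calling PickSingleFolderAsync
Windows.Storage.StorageFolder folder = await folderPicker.PickSingleFolderAsync();
if (folder != null)
{
    // Remember the folder so it stays accessible across app sessions.
    Windows.Storage.AccessCache.StorageApplicationPermissions.FutureAccessList
        .AddOrReplace("PersonImagesToken", folder);
    var files = await folder.GetFilesAsync();
}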

How to know if a bucket exists in AmazonS3 SDK 3.0

I am currently working on CRUD operations using Amazon S3 for .NET 3.5; I am using SDK version 3.1.5.
I found this code to check whether a bucket exists:
AmazonS3Client s3Client = new AmazonS3Client();
// set up the client configuration
S3DirectoryInfo directoryInfo = new S3DirectoryInfo(s3Client, bucketName);
bucketExists = directoryInfo.Exists;
Is there another elegant way (c# code) to check if the bucket exists?
I originally followed the answer here but switched to a slightly different method, so I thought I'd share it. This method creates the bucket if it doesn't already exist.
internal async Task CreateBucketAsync(string bucket, CancellationToken token)
{
    if (string.IsNullOrEmpty(bucket)) return;
    using (var amazonClient = GetAmazonClient)
    {
        if (AmazonS3Util.DoesS3BucketExist(amazonClient, bucket)) return;
        await amazonClient.PutBucketAsync(new PutBucketRequest { BucketName = bucket, UseClientRegion = true }, token);
        await SetMultiPartLifetime(amazonClient, bucket, token);
    }
}
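If you would rather avoid the S3DirectoryInfo wrapper entirely, one alternative sketch is to list the account's buckets and check by name (fine for accounts with a modest number of buckets; requires using System.Linq):
var response = await s3Client.ListBucketsAsync();
bool bucketExists = response.Buckets.Any(b => b.BucketName == bucketName);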
Your code is written in C#, and you are looking for another way to check whether the directory exists? I think your way is fine.
You can also create a list of all the subfolders in the root and store it somewhere else (a text file, a list, or whatever you want), so that you don't need to open a connection to Amazon every time:
S3DirectoryInfo s3Root = new S3DirectoryInfo(s3Client, "bucketofcode");
foreach (S3DirectoryInfo subDirectory in s3Root.GetDirectories())
{
    Console.WriteLine(subDirectory.Name);
}
From here https://blogs.aws.amazon.com/net/post/Tx2N8LWZYHZHGQI/The-Three-Different-APIs-for-Amazon-S3

Replacing a file with a new file of the same name but different content in TFS via C#

I'm currently working on a program which updates templates on our company's Team Foundation Server. I have the new templates locally on my disk and want to replace the existing ones on the server. I have been trying different approaches, and this is my newest version. The problem is that either:
the new file is "in use" when accessing it through C# code (while not in use when I replace it at runtime using the normal Explorer), or
the replacement does not appear in the pending changes; the pendingChanges array is empty.
using (var tfs = TeamFoundationServerFactory.GetServer("myserver"))
{
    var versionControlServer = tfs.GetService(typeof(VersionControlServer)) as VersionControlServer;
    // Create a new workspace for the currently authenticated user.
    var workspace = versionControlServer.CreateWorkspace("Temporary Workspace", versionControlServer.AuthorizedUser);
    try
    {
        // Check if a mapping already exists.
        var workingFolder = new WorkingFolder("$serverpath", @"C:\tempFolder");
        // Create the mapping (if it exists already, it just overrides it, that is fine).
        workspace.CreateMapping(workingFolder);
        workspace.Get(VersionSpec.Latest, GetOptions.GetAll);
        string[] paths = new string[1];
        paths[0] = "test.pdf";
        workspace.PendEdit(paths, RecursionType.Full, null, LockLevel.None);
        // Go through the folder structure defined and create it locally, then check in the changes.
        CreateFolderStructure(workspace, workingFolder.LocalItem);
        // Check in the changes made.
        int a = workspace.CheckIn(workspace.GetPendingChanges(), "This is my comment");
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
    finally
    {
        // Clean up the workspace.
        workspace.Delete();
        // Remove the temp folder used.
        Directory.Delete(@"C:\tempFolder", true);
    }
}

static void CreateFolderStructure(Workspace workspace, string initialPath)
{
    workspace.PendDelete("$serverpath/test.pdf", RecursionType.None);
    File.Copy(@"C:\test\testnew.pdf", @"C:\tempFolder\test.pdf", true);
    workspace.PendAdd(@"C:\tempFolder\test.pdf");
}
I found a solution to the problem. The workspace created for AuthorizedUser was obviously not enough; I found out that I need a TeamFoundationIdentity to do it. Here is a guide on how to fix the issue:
http://blogs.msdn.com/b/taylaf/archive/2010/03/29/using-tfs-impersonation-with-the-version-control-client-apis.aspx
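For reference, a rough sketch of the impersonation pattern from that article (the server URL and account name are placeholders, and the calling identity needs the "Make requests on behalf of others" permission on the server):
var uri = new Uri("http://myserver:8080/tfs/DefaultCollection");
var tfs = new TfsTeamProjectCollection(uri);
tfs.EnsureAuthenticated();
// Look up the identity to impersonate.
var ims = tfs.GetService<IIdentityManagementService>();
TeamFoundationIdentity identity = ims.ReadIdentity(
    IdentitySearchFactor.AccountName, @"DOMAIN\templateUser",
    MembershipQuery.None, ReadIdentityOptions.None);
// Connect again, impersonating that identity, and use this connection
// to create the workspace and pend the changes.
var impersonatedTfs = new TfsTeamProjectCollection(uri, identity.Descriptor);
var vcs = impersonatedTfs.GetService<VersionControlServer>();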

Upload file to subdirectory with specific name when subdirectories don't exist?

Maybe this is a four-part question:
Upload to subdirectory/ies
Subdirectories don't exist
Use different remote filename than local file
Subdirectories should have explicit permissions (similar to root problem in WinSCP .NET assembly - How to set folder permissions after creating directory?)
Attempted:
var localPath = Path.GetTempFileName();
var remoteFolder = "/some/directory/beneath/root";
var slash = "/"; // maybe given as '/' or '\'...
var remotePath = remoteFolder + slash + "destination.ext1.ext2.txt";
var session = new Session();
session.Open(sessionOptions);
var result = session.PutFiles(localPath, remotePath, false, new TransferOptions { FilePermissions = new FilePermissions { Octal = "700" }, KeepTimestamp..., etc });
result.Check();
throws exception Cannot create remote file '/some/directory/beneath/root/destination.ext1.ext2.txt'. ---> WinSCP.SessionRemoteException: No such file or directory.
I was finally able to make the subdirectories with the correct permissions via the crazy workaround indicated here: creating the subdirectory structure in my temp path and using PutFiles on the first folder:
var tempRoot = Path.GetTempPath();
var tempPath = Path.Combine(tempRoot, remoteFolder);
Directory.CreateDirectory(tempPath);
// only need to upload the first segment, PutFiles will magically grab the subfolders too...
var segment = remoteFolder.Substring(0, remoteFolder.IndexOf(slash, StringComparison.Ordinal));
if (!this.DoesFolderExist(segment))
{
    // here's the workaround...
    try
    {
        this._session.PutFiles(Path.Combine(tempRoot, segment), segment, false, new TransferOptions { FilePermissions = this._transferOptions.FilePermissions }).Check();
    }
    catch (InvalidOperationException)
    {
        // workaround for bug in .NET assembly prior to 5.5.5/5.6.1 beta
        // although I never hit this catch, maybe I've got a new enough version?
    }
}
Directory.Delete(tempPath); // finish workaround
but this was way too unintuitive.
ad 1) WinSCP does not (generally) create the target directory of an upload; it must exist before the upload. You can test its existence using Session.FileExists and create it using Session.CreateDirectory if it does not. WinSCP does, of course, create the directories you are uploading, if needed.
ad 3) You specify a different target name in the remotePath argument of Session.PutFiles:
session.PutFiles(@"C:\path\original.txt", "/home/user/newname.txt");
ad 4) You specify the permissions of an uploaded file/directory using TransferOptions.FilePermissions. Note that WinSCP implicitly adds the x permission to directories for every group where the r permission is granted. So when you specify 600 permissions for a batch upload, 600 is used for files, while 700 is used for directories. If you need different permissions for different files/directories, you need to upload them one by one.
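Putting ad 1) and ad 3) together, a minimal sketch (assuming an open Session named session and the variables from the question) is to walk the remote path and create each missing segment before uploading. Note that Session.CreateDirectory has no permissions parameter, so directories created this way get the server's default permissions:
var current = "";
foreach (var part in remoteFolder.Split(new[] { '/' }, StringSplitOptions.RemoveEmptyEntries))
{
    // Build the path one segment at a time: /some, /some/directory, ...
    current += "/" + part;
    if (!session.FileExists(current))
    {
        session.CreateDirectory(current);
    }
}
session.PutFiles(
    localPath,
    remoteFolder + "/destination.ext1.ext2.txt",
    false,
    new TransferOptions { FilePermissions = new FilePermissions { Octal = "700" } }
).Check();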
