Trying to delete an Azure blob - getting "blob does not exist" exception - C#

I have an Azure storage container with blobs (/images/filename). The filename (URI) is stored in the database at creation time and comes from the file upload save function:
blob.UploadFromStream(filestream);
string uri = blob.Uri.AbsoluteUri;
return uri;
The file upload works fine, and when the URI is passed to the client with a SAS key, the download works fine too.
Coming to delete the images, I have a helper function that was taken from an MS example here: MS GitHub example
Here is the function:
internal bool DeleteFile(string fileURI)
{
    try
    {
        Uri uri = new Uri(fileURI);
        string filename = Path.GetFileName(uri.LocalPath);
        CloudBlockBlob fileblob = container.GetBlockBlobReference(filename);
        fileblob.Delete();
        bool res = fileblob.DeleteIfExists();
        return res; //Ok
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex);
        return false;
    }
}
This is all in a helper class which starts as follows:
public class AzureHelpers
{
    private string connection;
    private CloudStorageAccount storageAccount;
    private CloudBlobClient blobClient;
    private CloudBlobContainer container;

    public AzureHelpers()
    {
        connection = CloudConfigurationManager.GetSetting("myproject_AzureStorageConnectionString");
        storageAccount = CloudStorageAccount.Parse(connection);
        blobClient = storageAccount.CreateCloudBlobClient();
        container = blobClient.GetContainerReference(Resources.DataStoreRoot);
        container.CreateIfNotExists();
    }
....
I deliberately added the Delete before the DeleteIfExists to cause the exception and prove what I suspected: that it wasn't finding the file/blob.
As I step through the code, however, the CloudBlockBlob is definitely there and has the correct URI etc.
I am wondering if this could be a permissions thing? Or am I missing something else?
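A quick diagnostic sketch with the same SDK (just a sketch, reusing the filename variable from DeleteFile above) that shows the name the reference resolves to and whether the service can see a blob with exactly that name:
CloudBlockBlob fileblob = container.GetBlockBlobReference(filename);
Console.WriteLine(fileblob.Name);     // the exact blob name the service will look up
Console.WriteLine(fileblob.Exists()); // false if no blob with that exact name is in the container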

I think there is a directory in your container. Assume you have a container named container_1 and your files are stored in a directory, like /images/a.jpg. Remember that in this case your blob name is images/a.jpg, not a.jpg.
In your code, Path.GetFileName only gets the file name, a.jpg, so it does not match the real blob name images/a.jpg, which causes the "does not exist" error.
So in your DeleteFile(string fileURI) method, try the code below; it works fine on my side:
Uri uri = new Uri(fileURI);
var temp = uri.LocalPath;                                   // e.g. "/container_1/images/a.jpg"
string filename = temp.Remove(0, temp.IndexOf('/', 1) + 1); // strips the leading "/container_1/", leaving "images/a.jpg"
CloudBlockBlob fileblob = container.GetBlockBlobReference(filename);
//fileblob.Delete();
bool res = fileblob.DeleteIfExists();
or use this code snippet:
Uri uri = new Uri(fileURI);
// Use this line of code just to get the blob name correctly
CloudBlockBlob blob_temp = new CloudBlockBlob(uri);
var myblob = container.GetBlockBlobReference(blob_temp.Name);
bool res = myblob.DeleteIfExists();
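Putting that together, a sketch of the whole DeleteFile helper built on the second snippet (assuming the same container field from the AzureHelpers class in the question):
internal bool DeleteFile(string fileURI)
{
    try
    {
        // Let the SDK parse the URI; Name keeps the virtual directory, e.g. "images/filename"
        CloudBlockBlob parsed = new CloudBlockBlob(new Uri(fileURI));
        CloudBlockBlob fileblob = container.GetBlockBlobReference(parsed.Name);
        return fileblob.DeleteIfExists();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex);
        return false;
    }
}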

This seems to be a permission issue. Can you go to the portal and edit the container metadata in Azure Blob Storage, changing the access level from private to public?
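If you do want to change the access level from code rather than the portal, a sketch with the same SDK as in the question (note this makes the container's blobs publicly readable, so only do it if that is acceptable):
BlobContainerPermissions permissions = container.GetPermissions();
permissions.PublicAccess = BlobContainerPublicAccessType.Blob; // or BlobContainerPublicAccessType.Container
container.SetPermissions(permissions);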

Related

Copy a file from one Data Lake Gen2 to another Data Lake Gen2 via C# in Azure Functions

I would like to do something simple: copy a blob from a container (SourceContainer) in one Data Lake Gen2 account (SourceDataLake) to a second Data Lake Gen2 account (TargetDataLake) via C# code in an Azure Function.
The connection between my Azure Function and SourceDataLake is secured via Private Link and Private Endpoint.
DataLakeDirectoryClient sourcedirectoryClient2 = sourceDataLakeFileSystemClient.GetDirectoryClient(myPath);
DataLakeFileClient sourcefileClient = sourcedirectoryClient2.GetFileClient(myBlobName);
Response<FileDownloadInfo> downloadResponse = await sourcefileClient.ReadAsync(); //I get the error in this line
Stream reader = downloadResponse.Value.Content;
DataLakeDirectoryClient targetdirectoryClient = taregetDataLakeFileSystemClient.GetDirectoryClient(TargetDirectory);
DataLakeFileClient targetfileClient = await targetdirectoryClient.CreateFileAsync(myBlobName);
await targetfileClient.UploadAsync(reader, true);
For authentication to the Data Lake I use this function:
public static DataLakeFileSystemClient GetDataLakeFileSystemClient(string containerName, string dataLakeName, string dataLakeAccessKey)
{
    StorageSharedKeyCredential storageSharedKeyCredential = new StorageSharedKeyCredential(dataLakeName, dataLakeAccessKey);
    DataLakeClientOptions options = new DataLakeClientOptions(DataLakeClientOptions.ServiceVersion.V2019_07_07);
    DataLakeServiceClient dataLakeServiceClient = new DataLakeServiceClient(
        new Uri(string.Concat("https://", dataLakeName, ".dfs.core.windows.net")),
        storageSharedKeyCredential,
        options);
    DataLakeFileSystemClient dataLakeFileSystemClient = dataLakeServiceClient.GetFileSystemClient(containerName);
    return dataLakeFileSystemClient;
}
This code doesn't work for me. If I delete the Private Link/Private Endpoint from the SourceDataLake, then it works. Somehow Private Link and Private Endpoint don't work with this line of code:
Response<FileDownloadInfo> downloadResponse = await sourcefileClient.ReadAsync();
Do you have any idea how I can solve this problem, or a better way to copy a blob from Data Lake Gen2?
Sorry, I cannot help with the current issue, but as you asked about other options, I think you can explore Azure Data Factory: https://learn.microsoft.com/en-us/azure/data-factory/connector-azure-data-lake-storage
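As a code-level alternative (not part of the answer above, just a sketch): a Data Lake Gen2 account also exposes the blob endpoint, so a server-side copy with Azure.Storage.Blobs can avoid streaming the content through the Function, assuming both accounts are reachable from it and you hold their account keys (sourceAccountKey/targetAccountKey and the URLs below are placeholders; run this inside an async Function method):
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// Source blob, addressed through the blob endpoint of the Gen2 account (placeholder names)
var sourceBlob = new BlobClient(
    new Uri("https://sourcedatalake.blob.core.windows.net/sourcecontainer/myPath/myBlobName"),
    new StorageSharedKeyCredential("sourcedatalake", sourceAccountKey));

// Read-only SAS so the target account's copy operation can fetch the source
Uri sourceWithSas = sourceBlob.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddHours(1));

var targetBlob = new BlobClient(
    new Uri("https://targetdatalake.blob.core.windows.net/targetcontainer/myBlobName"),
    new StorageSharedKeyCredential("targetdatalake", targetAccountKey));

// Server-side, asynchronous copy; no bytes flow through the Function itself
await targetBlob.StartCopyFromUriAsync(sourceWithSas);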

How can I copy a file from the isolated storage to the Downloads folder?

I'm trying to copy my database file from the isolated storage to the Download folder (or any folder that the user can access).
Currently my database is stored in:
/data/user/0/com.companyname.appname/files/Databases/MyDatabase.db
I tried to use this code:
public string GetCustomFilePath(string folder, string filename)
{
    var docFolder = System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal);
    var libFolder = Path.Combine(docFolder, folder);
    if (!Directory.Exists(libFolder))
        Directory.CreateDirectory(libFolder);
    return Path.Combine(libFolder, filename);
}
var bas = GetDatabaseFilePath("MyDatabase.db");
var des = Path.Combine(Android.OS.Environment.DirectoryDownloads, "MyDatabase.db");
File.Copy(bas, des);
The Android.OS.Environment.DirectoryDownloads property returns the path Download, which is the name of the downloads folder.
But File.Copy() throws an exception saying:
System.IO.DirectoryNotFoundException: Destination directory not found:
Download.
I tried using a slash before it, like this: /Download/MyDatabase.db, with no luck.
Is there any way to copy a file like that? Do I need any permission?
1st) Yes, you do need permissions to write to external storage.
You can request the required runtime permission yourself:
https://devblogs.microsoft.com/xamarin/requesting-runtime-permissions-in-android-marshmallow/
Or use a third-party plugin, such as James Montemagno's PermissionsPlugin:
https://github.com/jamesmontemagno/PermissionsPlugin
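For example, with Xamarin.Essentials (an assumption, the answer itself doesn't name a library) the runtime request boils down to something like:
// Ask for external-storage write permission at runtime (needed on Android 6.0+)
var status = await Xamarin.Essentials.Permissions.RequestAsync<Xamarin.Essentials.Permissions.StorageWrite>();
if (status != Xamarin.Essentials.PermissionStatus.Granted)
{
    // The user declined, so writing to the public Download folder will fail
    return;
}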
2nd) Once your user accepts that it is OK to write to external storage, you can combine Android.OS.Environment.ExternalStorageDirectory.AbsolutePath and Android.OS.Environment.DirectoryDownloads to obtain the path of the device's public Download folder, e.g. using a Forms dependency service:
public interface IDownloadPath
{
    string Get();
}

public class DownloadPath_Android : IDownloadPath
{
    public string Get()
    {
        return Path.Combine(Android.OS.Environment.ExternalStorageDirectory.AbsolutePath, Android.OS.Environment.DirectoryDownloads);
    }
}
https://learn.microsoft.com/en-us/xamarin/xamarin-forms/app-fundamentals/dependency-service/introduction
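One detail the snippet assumes: the Android implementation also has to be registered with the DependencyService, typically via an assembly attribute in the Android project:
// Placed above the namespace declaration in the Android project
[assembly: Xamarin.Forms.Dependency(typeof(DownloadPath_Android))]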
You end up with something like:
public void Handle_Button(object sender, System.EventArgs e)
{
    var fileName = "someFile.txt";
    using (var stream = File.Create(Path.Combine(FileSystem.CacheDirectory, fileName)))
    {
        // just creating a dummy file to copy (in the cache dir, using Xamarin.Essentials)
    }
    var downloadPath = DependencyService.Get<IDownloadPath>().Get();
    // File.Copy needs a full destination file path, not just the destination folder
    File.Copy(Path.Combine(FileSystem.CacheDirectory, fileName), Path.Combine(downloadPath, fileName));
}

Copy file from one Azure storage account to another

I am trying to copy a file from one storage account to another using the StartCopy method. Check the code below.
CloudStorageAccount sourceStorageAccount = CloudStorageAccount.Parse(@"source storage account connection string");
CloudStorageAccount destStorageAccount = CloudStorageAccount.Parse(@"destination storage account connection string");
CloudBlobClient sourceBlobClient = sourceStorageAccount.CreateCloudBlobClient();
CloudBlobClient destBlobClient = destStorageAccount.CreateCloudBlobClient();
var sourceContainer = sourceBlobClient.GetContainerReference("sourceContainer");
var destContainer = destBlobClient.GetContainerReference("destContainer");
CloudBlockBlob sourceBlob = sourceContainer.GetBlockBlobReference("copy.txt");
CloudBlockBlob targetBlob = destContainer.GetBlockBlobReference("copy.txt");
targetBlob.StartCopy(sourceBlob);
But it always returns the following error.
Microsoft.WindowsAzure.Storage.StorageException: 'The remote server
returned an error: (404) Not Found.'
What am I missing here?
Note: the same code works perfectly if I try to copy files from one container to another within the same storage account.
Take a look at the following example on how a copy should be performed (taken from Introducing Asynchronous Cross-Account Copy Blob):
public static void CopyBlobs(
    CloudBlobContainer srcContainer,
    string policyId,
    CloudBlobContainer destContainer)
{
    // get the SAS token to use for all blobs
    string blobToken = srcContainer.GetSharedAccessSignature(
        new SharedAccessBlobPolicy(), policyId);

    var srcBlobList = srcContainer.ListBlobs(true, BlobListingDetails.None);
    foreach (var src in srcBlobList)
    {
        var srcBlob = src as CloudBlob;

        // Create appropriate destination blob type to match the source blob
        CloudBlob destBlob;
        if (srcBlob.Properties.BlobType == BlobType.BlockBlob)
        {
            destBlob = destContainer.GetBlockBlobReference(srcBlob.Name);
        }
        else
        {
            destBlob = destContainer.GetPageBlobReference(srcBlob.Name);
        }

        // copy using src blob as SAS
        destBlob.StartCopyFromBlob(new Uri(srcBlob.Uri.AbsoluteUri + blobToken));
    }
}
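Applied to the snippet in the question, the key change is to append a read SAS generated with the source account's credentials to the source blob's URI, so the destination account can actually fetch it; a sketch:
// SAS created against the *source* account; without it the destination service gets a 404
string sasToken = sourceBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
});
targetBlob.StartCopy(new Uri(sourceBlob.Uri.AbsoluteUri + sasToken));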
Hope it helps!
Here is another way to do this, using the TransferManager.CopyAsync method:
CloudStorageAccount sourceStorageAccount = CloudStorageAccount.Parse(@"source storage account connection string");
CloudStorageAccount destStorageAccount = CloudStorageAccount.Parse(@"destination storage account connection string");
CloudBlobClient sourceBlobClient = sourceStorageAccount.CreateCloudBlobClient();
CloudBlobClient destBlobClient = destStorageAccount.CreateCloudBlobClient();
var sourceContainer = sourceBlobClient.GetContainerReference("sourceContainer");
var destContainer = destBlobClient.GetContainerReference("destContainer");
CloudBlockBlob sourceBlob = sourceContainer.GetBlockBlobReference("copy.txt");
CloudBlockBlob targetBlob = destContainer.GetBlockBlobReference("copy.txt");
TransferManager.CopyAsync(sourceBlob, targetBlob, true).Wait();
TransferManager is under the namespace Microsoft.WindowsAzure.Storage.DataMovement. To get the reference, install Microsoft.Azure.Storage.DataMovement via the NuGet package manager.
I recently ran into this error trying to copy from /uploads to /raw within a single blob account.
The issue was that the container /raw didn't exist on the destination side in the test environment.
(i.e., this error is actually thrown by the destination, not the source.)
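In that situation a one-line guard before the copy (a sketch, using the same SDK as the question) avoids the 404:
// Make sure the destination container exists before starting the copy
destContainer.CreateIfNotExists();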

Azure Blob Storage - How to determine if a specified container contains any blobs?

I'm using the Azure Storage SDK and trying to determine whether there's a way to specify a container and find the number of blobs it contains. The posts I've seen so far only mention checking by blob name, which doesn't suit my needs.
If I do the following:
CloudBlobContainer blobContainer = blobClient.GetContainerReference("my-container");
var blobCount = blobContainer.ListBlobs().Count();
Then I'm hit with an HTTP 404 exception.
Is there any way to go about this?
You can check the count by using this code:
CloudBlobContainer blobContainer = blobClient.GetContainerReference("my-container");
blobContainer.FetchAttributes();
string count = blobContainer.Metadata["ItemCount"];

int ItemCount;
if (int.TryParse(count, out ItemCount))
{
    if (ItemCount > 0)
    {
        // Container is not empty
    }
    else
    {
        // Container is empty
    }
}
else
{
    // Conversion failed
}
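If that metadata key isn't set on your container, another way to check for emptiness (a sketch with the same SDK, using System.Linq) is to list at most one blob and see whether anything comes back:
// Fetch at most one blob; an empty segment means the container has no blobs
var segment = blobContainer.ListBlobsSegmented(null, true, BlobListingDetails.None, 1, null, null, null);
bool isEmpty = !segment.Results.Any();
Note that the 404 in the question typically means the container itself doesn't exist, so it may be worth checking blobContainer.Exists() first.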

The specified container does not exist

I am stuck with this error: The specified container does not exist.
Let me explain.
CloudBlobClient blobStorage = GetBlobStorage("upload");
CloudBlockBlob blob = BlobPropertySetting(blobStorage, Guid.NewGuid().ToString().ToLower() + Path.GetExtension(file.FileName));
blob.UploadFromStream(file.InputStream);
public static CloudBlobClient GetBlobStorage(string cloudBlobContainerName)
{
    CloudBlobClient blobStorage;
    try
    {
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("StorageConnectionString");
        blobStorage = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobStorage.GetContainerReference(cloudBlobContainerName);
        container.CreateIfNotExist();
        var permissions = container.GetPermissions();
        permissions.PublicAccess = BlobContainerPublicAccessType.Container;
        container.SetPermissions(permissions);
    }
    catch (Exception ex)
    {
        Logger.LogError(Log4NetLogger.Category.Exception, "Error in : BlobHandler.GetBlobStorage :>> Exception message: " + ex.Message);
        throw;
    }
    return blobStorage;
}

public static CloudBlockBlob BlobPropertySetting(CloudBlobClient cloudBlobClientReferenceName, string blobContentName)
{
    return cloudBlobClientReferenceName.GetBlockBlobReference(blobContentName);
}
and my StorageConnectionString is
<Setting name="StorageConnectionString" value="DefaultEndpointsProtocol=https;AccountName=duw;AccountKey=bla bla" />
The container 'upload' and the storage account 'duw' exist.
Executing the blob.UploadFromStream(file.InputStream); statement causes the error.
Stack trace:
at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.get_Result()
at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.ExecuteAndWait()
at Microsoft.WindowsAzure.StorageClient.TaskImplHelper.ExecuteImpl(Func`1 impl)
at Microsoft.WindowsAzure.StorageClient.CloudBlob.UploadFromStream(Stream source, BlobRequestOptions options)
at Microsoft.WindowsAzure.StorageClient.CloudBlob.UploadFromStream(Stream source)
at DAL.Handlers.BlobHandler.CreateAd(HttpPostedFileBase file, Advertisement model) in D:\DU Server\trunk\Du Server\DAL\Handlers\BlobHandler.cs:line 151
Inner exception:
{"The remote server returned an error: (404) Not Found."}
Can anybody help me sort this out?
Short version:
Try the following code for the BlobPropertySetting function:
public static CloudBlockBlob BlobPropertySetting(CloudBlobClient cloudBlobClientReferenceName, string blobContentName)
{
    CloudBlockBlob blob = cloudBlobClientReferenceName.GetBlockBlobReference("upload/" + blobContentName);
    return blob;
}
Now for the longer version :)
The reason you're getting this error is the way you are constructing the CloudBlockBlob object in the BlobPropertySetting method. When you use your code, it creates a blob object with a URI like https://duw.blob.core.windows.net/blobContentName. If you notice, there's no container name there. Since there's no container name, the storage client library assumes you're trying to create the blob in the $root blob container, which is a special blob container. You can read more about it here: http://msdn.microsoft.com/en-us/library/windowsazure/hh488356.aspx. Since your storage account does not have this container, you get a 404 (Resource Not Found) error.
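An equivalent alternative (just a sketch) is to hand the method the CloudBlobContainer for 'upload' instead of the CloudBlobClient, so the container name never has to be prefixed by hand:
public static CloudBlockBlob BlobPropertySetting(CloudBlobContainer container, string blobContentName)
{
    // The blob reference is created inside the given container, e.g. "upload"
    return container.GetBlockBlobReference(blobContentName);
}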
I am very late, but still thought my answer might be useful for someone.
I resolved this error by putting in the correct container name. It was different by default.
I have cloned this Git project: https://github.com/Azure-Samples/storage-blob-upload-from-webapp-node
const
    express = require('express')
    , router = express.Router()
    , azureStorage = require('azure-storage')
    , blobService = azureStorage.createBlobService()
    , containerName = 'container' // added container name here, as my container name
    , config = require('../config')
    ;
