The specified container does not exist - C#

I'm stuck with this error: "The specified container does not exist."
Let me explain:
CloudBlobClient blobStorage = GetBlobStorage("upload");
CloudBlockBlob blob = BlobPropertySetting(blobStorage, Guid.NewGuid().ToString().ToLower() + Path.GetExtension(file.FileName));
blob.UploadFromStream(file.InputStream);
public static CloudBlobClient GetBlobStorage(string cloudBlobContainerName)
{
    CloudBlobClient blobStorage;
    try
    {
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("StorageConnectionString");
        blobStorage = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobStorage.GetContainerReference(cloudBlobContainerName);
        container.CreateIfNotExist();
        var permissions = container.GetPermissions();
        permissions.PublicAccess = BlobContainerPublicAccessType.Container;
        container.SetPermissions(permissions);
    }
    catch (Exception ex)
    {
        Logger.LogError(Log4NetLogger.Category.Exception, "Error in : BlobHandler.GetBlobStorage :>> Exception message: " + ex.Message);
        throw;
    }
    return blobStorage;
}

public static CloudBlockBlob BlobPropertySetting(CloudBlobClient cloudBlobClientReferenceName, string blobContentName)
{
    return cloudBlobClientReferenceName.GetBlockBlobReference(blobContentName);
}
My StorageConnectionString is:
<Setting name="StorageConnectionString" value="DefaultEndpointsProtocol=https;AccountName=duw;AccountKey=bla bla" />
The container 'upload' and the storage account 'duw' exist.
Executing the blob.UploadFromStream(file.InputStream); statement causes the error.
Stack trace:
at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.get_Result()
at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.ExecuteAndWait()
at Microsoft.WindowsAzure.StorageClient.TaskImplHelper.ExecuteImpl(Func`1 impl)
at Microsoft.WindowsAzure.StorageClient.CloudBlob.UploadFromStream(Stream source, BlobRequestOptions options)
at Microsoft.WindowsAzure.StorageClient.CloudBlob.UploadFromStream(Stream source)
at DAL.Handlers.BlobHandler.CreateAd(HttpPostedFileBase file, Advertisement model) in D:\DU Server\trunk\Du Server\DAL\Handlers\BlobHandler.cs:line 151
Inner exception:
{"The remote server returned an error: (404) Not Found."}
Can anybody help me sort this out?

Short version
Try the following code for the BlobPropertySetting function:
public static CloudBlockBlob BlobPropertySetting(CloudBlobClient cloudBlobClientReferenceName, string blobContentName)
{
    CloudBlockBlob blob = cloudBlobClientReferenceName.GetBlockBlobReference("upload/" + blobContentName);
    return blob;
}
Now for the longer version :)
The reason you're getting this error is the way you are constructing the CloudBlockBlob object in the BlobPropertySetting method. With your code, it creates a blob object with the following URI: https://duw.blob.core.windows.net/blobContentName. If you notice, there's no container name there. Since there's no container name, the storage client library assumes that you're trying to create a blob in the $root blob container, which is a special blob container. You can read more about it here: http://msdn.microsoft.com/en-us/library/windowsazure/hh488356.aspx. Since your storage account does not have this container, you get a 404 - Resource Not Found error.
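Another way to get the same result, shown here only as a rough sketch: resolve the CloudBlobContainer first and ask it (not the client) for the blob reference, so the container segment is always part of the blob URI. This assumes the same "upload" container and the same classic StorageClient types as the question's code:

CloudBlobClient blobClient = GetBlobStorage("upload");
// Get the container reference explicitly, then build the blob reference from it.
CloudBlobContainer uploadContainer = blobClient.GetContainerReference("upload");
CloudBlockBlob blob = uploadContainer.GetBlockBlobReference(
    Guid.NewGuid().ToString().ToLower() + Path.GetExtension(file.FileName));
blob.UploadFromStream(file.InputStream);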

I am very late, but I still thought my answer might be useful for someone.
I resolved this error by putting in the correct container name; it was different by default.
I had cloned this Git project: https://github.com/Azure-Samples/storage-blob-upload-from-webapp-node
const
express = require('express')
, router = express.Router()
, azureStorage = require('azure-storage')
, blobService = azureStorage.createBlobService()
, containerName = 'container' // added container name here, as my container name
, config = require('../config')
;

Related

Trying to delete an Azure blob - getting blob does not exist exception

I have an Azure storage container with blobs (/images/filename). The filename (URI) is stored in the database at creation time and comes from the file upload save function:
blob.UploadFromStream(filestream);
string uri = blob.Uri.AbsoluteUri;
return uri;
The file upload works fine, and when the URI is passed to the client with a SAS key the download works fine too.
When it comes to deleting the images, I have a helper function that was taken from an MS example here: MS GitHub example
Here is the function:
internal bool DeleteFile(string fileURI)
{
    try
    {
        Uri uri = new Uri(fileURI);
        string filename = Path.GetFileName(uri.LocalPath);
        CloudBlockBlob fileblob = container.GetBlockBlobReference(filename);
        fileblob.Delete();
        bool res = fileblob.DeleteIfExists();
        return res; //Ok
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex);
        return false;
    }
}
This is all in a helper class which starts as follows:
public class AzureHelpers
{
    private string connection;
    private CloudStorageAccount storageAccount;
    private CloudBlobClient blobClient;
    private CloudBlobContainer container;

    public AzureHelpers()
    {
        connection = CloudConfigurationManager.GetSetting("myproject_AzureStorageConnectionString");
        storageAccount = CloudStorageAccount.Parse(connection);
        blobClient = storageAccount.CreateCloudBlobClient();
        container = blobClient.GetContainerReference(Resources.DataStoreRoot);
        container.CreateIfNotExists();
    }
....
I deliberately added the Delete before the DeleteIfExists to cause the exception and prove what I suspected: that it wasn't finding the file/blob.
As I step through the code, however, the CloudBlockBlob is definitely there and has the correct URI etc.
I am wondering if this could be a permissions thing? Or am I missing something else?
I think there is a directory in your container. Assume that you have a container named container_1 and your files are stored in a directory like /images/a.jpg. Here you should remember that in this case your blob name is images/a.jpg, not a.jpg.
In your code, Path.GetFileName only gets the file name, like a.jpg, so it does not match the real blob name images/a.jpg, which causes the "does not exist" error.
So in your DeleteFile(string fileURI) method, try the code below; it works fine on my side:
Uri uri = new Uri(fileURI);
var temp = uri.LocalPath;
string filename = temp.Remove(0, temp.IndexOf('/', 1)+1);
CloudBlockBlob fileblob = container.GetBlockBlobReference(filename);
//fileblob.Delete();
bool res = fileblob.DeleteIfExists();
or use this code snippet:
Uri uri = new Uri(fileURI);
//use this line of code just to get the blob name correctly
CloudBlockBlob blob_temp = new CloudBlockBlob(uri);
var myblob = cloudBlobContainer.GetBlockBlobReference(blob_temp.Name);
bool res = myblob.DeleteIfExists();
This seems to be a permission issue. Can you go to the portal and edit the container metadata in Azure Blob Storage? Change the access level from private to public.
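If you prefer to make that change from code rather than the portal, a minimal sketch using the same SDK types as the question (and the container field from the AzureHelpers class above) might look like this; treat it as an illustration, not a recommendation to open the container up:

var permissions = container.GetPermissions();
// BlobContainerPublicAccessType.Blob allows anonymous read of blobs only;
// .Container also allows anonymous listing of the container's contents.
permissions.PublicAccess = BlobContainerPublicAccessType.Blob;
container.SetPermissions(permissions);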

How to fix "request error" from Google Cloud Storage?

I used google.cloud.storage for .mp3 files. I tried it before and it worked.
Now I wrote another new function that uses the existing storage function.
Everything was supposed to work, but I get this weird error:
"Google.Apis.Requests.RequestError
text-to-speach#XXXXXXXXXXX.iam.gserviceaccount.com does not have storage.objects.create access to objectsound/voiceAnimals20.mp3. [403]"
I don't know where to begin. Can anybody help me?
The working storage function is here:
public static string VoiceStorage(int catId, string URL, Dictionary<string, int> voicesCounter)
{
    Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", @"C:\wordproject-XXXXXXXXXX.json");

    // upload the voice file to storage
    //----------------
    string voiceName;
    voiceName = "voice" + BLLcategory.GetCategoryById(catId).CategoryName
        + voicesCounter[BLLcategory.GetCategoryById(catId).CategoryName]++ + ".mp3";

    string bucketName = "XXXXXXXX";
    var storage = StorageClient.Create();
    using (var f = File.OpenRead(URL))
    {
        try
        {
            var res = storage.UploadObject(bucketName, voiceName, null, f);
            URL = "https://storage.cloud.google.com/" + bucketName + "/" + voiceName;
        }
        catch (Exception e)
        {
            throw e;
        }
    }
    return URL;
}
The new, non-working function is:
private void button12_Click(object sender, EventArgs e)
{
    foreach (COMimageObject obj in BLLobject.GetObjects())
    {
        if (obj.VoiceURL == null)
        {
            try
            {
                string url = BLLtextToSpeach.TextToSpeach(obj.Name);
                url = BLLtextToSpeach.VoiceStorage(BLLimage.GetImageById(obj.ImageID).CategoryID, url, voicesCounter);
                BLLobject.UpdateVoiceURL(obj.ObjectId, url);
            }
            catch (Exception)
            {
                throw;
            }
        }
    }
}
The exception is caught after the line with url = BLLtextToSpeach.VoiceStorage.
Thanks!
What the error means is that the service account of the text-to-speach API doesn't have create access on the objectsound bucket. Go to the bucket permissions and add the text-to-speach service account with Storage Object Creator rights.
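Separately, it can help to be explicit about which service account the client is actually using, so you know which account needs the storage.objects.create permission. A small sketch, assuming the Google.Cloud.Storage.V1 client from the question and a placeholder key-file path:

using Google.Apis.Auth.OAuth2;
using Google.Cloud.Storage.V1;

// Load the service-account key file explicitly instead of relying on the
// GOOGLE_APPLICATION_CREDENTIALS environment variable.
var credential = GoogleCredential.FromFile(@"C:\path\to\service-account.json");
var storage = StorageClient.Create(credential);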
When you get this error:
[403] Errors [ Message[.................iam.gserviceaccount.com does not have storage.buckets.list access to the Google Cloud project.]
you need to grant the service account permissions in IAM to read the buckets:
storage.buckets.get - Read bucket metadata, excluding IAM policies.
storage.buckets.list - List buckets in a project. Also read bucket metadata.
https://cloud.google.com/storage/docs/access-control/iam-permissions
The "Firebase Admin" role will enable all necessary permissions.

Copy file from one Azure storage account to another

I am trying to copy a file from one storage account to another using the StartCopy method. Check the code below.
CloudStorageAccount sourceStorageAccount = CloudStorageAccount.Parse(@"source storage account connection string");
CloudStorageAccount destStorageAccount = CloudStorageAccount.Parse(@"destination storage account connection string");
CloudBlobClient sourceBlobClient = sourceStorageAccount.CreateCloudBlobClient();
CloudBlobClient destBlobClient = destStorageAccount.CreateCloudBlobClient();
var sourceContainer = sourceBlobClient.GetContainerReference("sourceContainer");
var destContainer = destBlobClient.GetContainerReference("destContainer");
CloudBlockBlob sourceBlob = sourceContainer.GetBlockBlobReference("copy.txt");
CloudBlockBlob targetBlob = destContainer.GetBlockBlobReference("copy.txt");
targetBlob.StartCopy(sourceBlob);
But it always returns the following error:
Microsoft.WindowsAzure.Storage.StorageException: 'The remote server returned an error: (404) Not Found.'
What am I missing here?
Note: the same code works perfectly if I copy files from one container to another within the same storage account.
Take a look at the following example on how a copy should be performed (taken from Introducing Asynchronous Cross-Account Copy Blob):
public static void CopyBlobs(
    CloudBlobContainer srcContainer,
    string policyId,
    CloudBlobContainer destContainer)
{
    // get the SAS token to use for all blobs
    string blobToken = srcContainer.GetSharedAccessSignature(
        new SharedAccessBlobPolicy(), policyId);

    var srcBlobList = srcContainer.ListBlobs(true, BlobListingDetails.None);
    foreach (var src in srcBlobList)
    {
        var srcBlob = src as CloudBlob;

        // Create appropriate destination blob type to match the source blob
        CloudBlob destBlob;
        if (srcBlob.Properties.BlobType == BlobType.BlockBlob)
        {
            destBlob = destContainer.GetBlockBlobReference(srcBlob.Name);
        }
        else
        {
            destBlob = destContainer.GetPageBlobReference(srcBlob.Name);
        }

        // copy using src blob as SAS
        destBlob.StartCopyFromBlob(new Uri(srcBlob.Uri.AbsoluteUri + blobToken));
    }
}
Hope it helps!
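Applied to the single blob in the question, a rough sketch of the same idea would be the following: generate a read SAS on the source blob and pass the full URI to StartCopy. The names reuse the question's code and the one-hour expiry is an arbitrary choice:

// Give the destination account temporary read access to the source blob via SAS,
// then start the server-side copy from that URI.
string sasToken = sourceBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
});
targetBlob.StartCopy(new Uri(sourceBlob.Uri.AbsoluteUri + sasToken));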
Here is another way to do this, using the TransferManager.CopyAsync method:
CloudStorageAccount sourceStorageAccount = CloudStorageAccount.Parse(@"source storage account connection string");
CloudStorageAccount destStorageAccount = CloudStorageAccount.Parse(@"destination storage account connection string");
CloudBlobClient sourceBlobClient = sourceStorageAccount.CreateCloudBlobClient();
CloudBlobClient destBlobClient = destStorageAccount.CreateCloudBlobClient();
var sourceContainer = sourceBlobClient.GetContainerReference("sourceContainer");
var destContainer = destBlobClient.GetContainerReference("destContainer");
CloudBlockBlob sourceBlob = sourceContainer.GetBlockBlobReference("copy.txt");
CloudBlockBlob targetBlob = destContainer.GetBlockBlobReference("copy.txt");
TransferManager.CopyAsync(sourceBlob, targetBlob, true).Wait();
TransferManager is under the namespace Microsoft.WindowsAzure.Storage.DataMovement. To get the reference, install the Microsoft.Azure.Storage.DataMovement package from NuGet.
I recently ran into this error trying to copy from /uploads to /raw within a single blob account.
The issue was that the /raw container didn't exist on the destination side in the test environment.
(i.e., this error is actually thrown by the destination, not the source)
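In that situation, a one-line guard before the copy (assuming the destContainer reference from the question's code) is enough as a sketch:

// Make sure the destination container exists before starting the copy.
destContainer.CreateIfNotExists();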

Azure Blob Storage - How to determine if a specified container contains any blobs?

I'm using the Azure Storage SDK and trying to determine whether there's a way to specify a container and find the number of blobs it contains. The posts I've seen thus far only mention checking by blob name, which doesn't suit my needs.
If I do the following:
CloudBlobContainer blobContainer = blobClient.GetContainerReference("my-container");
var blobCount = blobContainer.ListBlobs().Count();
Then I'm hit with an HTTP 404 exception.
Is there any way to go about this?
You can check the count by using this code:
CloudBlobContainer blobContainer = blobClient.GetContainerReference("my-container");
blobContainer.FetchAttributes();
string count = blobContainer.Metadata["ItemCount"];
int ItemCount;
if (int.TryParse(count, out ItemCount))
{
    if (ItemCount > 0)
    {
        // Container is not empty
    }
    else
    {
        // Container is empty
    }
}
else
{
    // Conversion failed
}
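Note that both ListBlobs and FetchAttributes return 404 when the container reference itself does not exist, so it may be worth guarding for that first. A small sketch, assuming the same blobClient and System.Linq for Any():

CloudBlobContainer blobContainer = blobClient.GetContainerReference("my-container");
if (blobContainer.Exists())
{
    // The container is there; check whether it holds any blobs at all.
    bool hasBlobs = blobContainer.ListBlobs().Any();
}
else
{
    // A 404 on listing or FetchAttributes usually means the container itself is missing.
}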

Not able to mount VHD drive on Azure server

Please help me. I am writing the following code to mount the VHD file, but I am not able to mount it. It works fine locally, but when I deploy it to the Azure server the web role remains offline. I tried removing the foreach block below, but in vain. When I removed the line "Global.driveLetter = drive.Mount(localCache.MaximumSizeInMegabytes - 20, DriveMountOptions.Force);" the role got ready on the server, but I can't do that because it is the key statement that mounts the drive.
What would be the problem?
private static void MountAzureDrive()
{
    string connectionStringSettingName = "AzureConnectionString";
    string azureDriveContainerName = "azuredrives";
    string azureDrivePageBlobName = Guid.NewGuid().ToString("N").ToLowerInvariant();
    string azureDriveCacheDirName = Path.Combine(Environment.CurrentDirectory, "cache");

    CloudStorageAccount.SetConfigurationSettingPublisher((a, b) =>
    {
        b(RoleEnvironment.GetConfigurationSettingValue(connectionStringSettingName));
    });

    //CloudStorageAccount storageAccount = CloudStorageAccount.FromConfigurationSetting(connectionStringSettingName);
    CloudStorageAccount storageAccount = CloudStorageAccount.DevelopmentStorageAccount;

    LocalResource localCache = RoleEnvironment.GetLocalResource("InstanceDriveCache");
    CloudDrive.InitializeCache(localCache.RootPath + "cache", localCache.MaximumSizeInMegabytes);

    // Just checking: make sure the container exists
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    blobClient.GetContainerReference("drives").CreateIfNotExist();

    // Create cloud drive
    //WebRole.drive = storageAccount.CreateCloudDrive(blobClient.GetContainerReference("drives").GetPageBlobReference("Test.VHD").Uri.ToString());
    WebRole.drive = storageAccount.CreateCloudDrive("drives/Test.VHD");
    try
    {
        WebRole.drive.CreateIfNotExist(512);
    }
    catch (CloudDriveException ex)
    {
        // handle exception here
        // exception is also thrown if all is well but the drive already exists
    }

    foreach (var d in CloudDrive.GetMountedDrives())
    {
        var mountedDrive = storageAccount.CreateCloudDrive(d.Value.PathAndQuery);
        mountedDrive.Unmount();
    }

    //Global.driveLetter = drive.Mount(25, DriveMountOptions.Force);
    Global.driveLetter = drive.Mount(localCache.MaximumSizeInMegabytes - 20, DriveMountOptions.Force);
}
Thanks in advance.
Maybe this is stating the obvious, but... when you deploy to Windows Azure, did you change the storage account from dev storage? You have the dev storage emulator hard-coded:
CloudStorageAccount storageAccount=CloudStorageAccount.DevelopmentStorageAccount;
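If that is the case, a minimal sketch of the change (essentially the commented-out line from the question's own code) would be:

// Read the real connection string from configuration instead of
// hard-coding the development storage emulator.
CloudStorageAccount storageAccount =
    CloudStorageAccount.FromConfigurationSetting(connectionStringSettingName);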
