I'm trying to upload an image to Firebase Storage, but I'm running into some problems.
var storage = StorageClient.Create();
string studentImageRef = bucket + 33333 + ".jpg";
var obj = storage.GetObject(studentImageRef, studentImageRef);
var downloadUrl = obj.MediaLink;
await storage.UploadObjectAsync(obj, null, new UploadObjectOptions { PredefinedAcl = PredefinedObjectAcl.PublicRead });
// Add the download URL to the student's information
studentsInfo.ImageUrl = downloadUrl;
This is the error I encountered.
Error: The service storage has thrown an exception. HttpStatusCode is NotFound. The specified bucket does not exist
And yes, I tried several times and made sure the bucket name is correct, but the error still persists.
I figured out how to fix this problem with the following code:
// Convert image to stream
var stream = new MemoryStream();
imageFile.Save(stream, System.Drawing.Imaging.ImageFormat.Jpeg);
stream.Position = 0;
// Initialize Google Cloud Storage
var storage = StorageClient.Create();
string studentImageRef = "my-bucket.appspot.com/" + Lname_FnameGetter + ".jpg";
await storage.UploadObjectAsync("my-bucket.appspot.com", studentImageRef, "image/jpeg", stream, new UploadObjectOptions { PredefinedAcl = PredefinedObjectAcl.PublicRead });
// Get the download URL for the image
var obj = storage.GetObject("my-bucket.appspot.com", studentImageRef);
var downloadUrl = obj.MediaLink;
// Add the download url to the student's information
studentsInfo.ImageUrl = downloadUrl;
The code above solved the problem for me.
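As a side note, MediaLink is the JSON API's media-download endpoint. Because the object is uploaded with PredefinedObjectAcl.PublicRead, a plain public URL can also be composed by hand; a minimal sketch, assuming bucketName and objectName hold the exact values passed to UploadObjectAsync:
// Public objects are served at this well-known URL form.
// bucketName and objectName must match the values used in the upload call;
// object names with special characters would need URL-encoding.
string publicUrl = "https://storage.googleapis.com/" + bucketName + "/" + objectName;
studentsInfo.ImageUrl = publicUrl;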
Related
I created a bucket on Google Cloud Storage and programmatically uploaded some files to it. When I try to download them, I get this exception:
The specified data could not be decrypted
Here is the code I wrote:
GoogleCredential credential = null;
var jsonFileBytes = Properties.Resources.stitcherautoupdate_55bd51f48cf0;
var jsonFileString = Encoding.UTF8.GetString(jsonFileBytes, 0, jsonFileBytes.Length);
var json = Newtonsoft.Json.JsonConvert.DeserializeObject<System.Object>(jsonFileString);
var jsonString = json.ToString();
credential = GoogleCredential.FromJson(jsonString);
StorageClient = StorageClient.Create(credential);
StorageClient.DownloadObject(bucketName, fileName, fileStream);
My recommendation regarding your issue is to try following the methods for upload and download mentioned in the documentation. Once you get it working, you can start slowly changing the code so you know which part is the one causing the issue.
This documentation describes how to first configure your Cloud Storage client library and set up authentication.
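Before the samples, here is a minimal authentication sketch using a service-account key file; the file name is a placeholder, and Application Default Credentials work just as well:
using Google.Apis.Auth.OAuth2;
using Google.Cloud.Storage.V1;
// "service-account.json" is a placeholder path to your downloaded key file;
// note there is no need to round-trip the JSON through a deserializer.
GoogleCredential credential = GoogleCredential.FromFile("service-account.json");
StorageClient storage = StorageClient.Create(credential);
// Alternatively, set the GOOGLE_APPLICATION_CREDENTIALS environment variable
// and call StorageClient.Create() with no arguments.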
Sample code for uploading an object:
private void UploadFile(string bucketName, string localPath, string objectName = null)
{
    var storage = StorageClient.Create();
    using (var f = File.OpenRead(localPath))
    {
        objectName = objectName ?? Path.GetFileName(localPath);
        storage.UploadObject(bucketName, objectName, null, f);
        Console.WriteLine($"Uploaded {objectName}.");
    }
}
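Hypothetical usage, with a placeholder bucket and path:
UploadFile("my-bucket", @"C:\images\photo.jpg"); // uploads as object "photo.jpg"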
For downloading an object:
private void DownloadObject(string bucketName, string objectName, string localPath = null)
{
    var storage = StorageClient.Create();
    localPath = localPath ?? Path.GetFileName(objectName);
    using (var outputFile = File.OpenWrite(localPath))
    {
        storage.DownloadObject(bucketName, objectName, outputFile);
    }
    Console.WriteLine($"downloaded {objectName} to {localPath}.");
}
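And a matching hypothetical call for the download helper:
DownloadObject("my-bucket", "photo.jpg"); // saves to ./photo.jpg in the current directory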
I am trying to insert an image into an Amazon S3 bucket, but I am getting the following error:
AWSClientFactory.cs is not found.
I am using Amazon SDK version 2.3.55.2
Here is my C# code.
public string InsertFileToBucket(string fileName, Stream FileStream)
{
    using (s3 = Amazon.AWSClientFactory.CreateAmazonS3Client(_awsAccessKey, _awsSecretKey, RegionEndpoint.USEast1))
    {
        PutObjectRequest request = new PutObjectRequest();
        request.BucketName = _s3ImageBucketName;
        request.CannedACL = S3CannedACL.PublicRead;
        request.Key = fileName;
        request.InputStream = FileStream;
        PutObjectResponse response = s3.PutObject(request);
    }
    return fileName;
}
In this code, I am converting the image to a stream and passing it to this class. Can anyone help me with this?
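For what it is worth, AWSClientFactory was removed in version 3 of the AWS SDK for .NET; constructing the client directly works across SDK versions and sidesteps the factory entirely. A minimal sketch under that assumption, reusing the field names from the question:
public string InsertFileToBucket(string fileName, Stream fileStream)
{
    // Construct the client directly instead of going through AWSClientFactory.
    using (var s3 = new AmazonS3Client(_awsAccessKey, _awsSecretKey, RegionEndpoint.USEast1))
    {
        var request = new PutObjectRequest
        {
            BucketName = _s3ImageBucketName,
            CannedACL = S3CannedACL.PublicRead,
            Key = fileName,
            InputStream = fileStream
        };
        s3.PutObject(request);
    }
    return fileName;
}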
I have created a hierarchical structure for managing files in a bucket, planning to create a folder for each month, e.g. dec-2017. There will be more than 10k PDF files in each folder.
I have written C# code for getting objects from the bucket. It works fine for files at the root of the bucket, but I am having issues accessing files inside a folder, i.e. my-bucket/dec-2017/test.pdf cannot be accessed by the code.
See my bucket structure here.
I am using the following code. Has anyone done this before?
if (_storageService == null)
{
    _storageService = CreateStorageClient();
}
ObjectsResource.GetRequest downloadRequest = null;
//string objectName = "dec-2017%2Ftest.pdf"; // URL-encoded attempt
string objectName = "dec-2017/test.pdf";
downloadRequest = new ObjectsResource.GetRequest(_storageService, bucketName, objectName);
MemoryStream stream = new MemoryStream();
downloadRequest.Download(stream);
byte[] bytes = stream.ToArray();
Please check the sample code below:
using Google.Apis.Auth.OAuth2;
using Google.Cloud.Storage.V1;
string localPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Test.csv");
File.WriteAllText(localPath, "test");
GoogleCredential credential = null;
BucketConnector bucketConnector = new BucketConnector();
credential = bucketConnector.ConnectStream();
var storageClient = StorageClient.Create(credential);
string folderPath = ConfigurationParameters.FOLDER_NAME_IN_BUCKET;
using (FileStream fileStream = File.OpenRead(localPath))
{
    // The folder prefix plus the file name forms the full object name.
    string objectName = folderPath + Path.GetFileName(localPath);
    storageClient.UploadObject(bucketName, objectName, null, fileStream);
}
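One thing worth spelling out: Cloud Storage has a flat namespace, so dec-2017/test.pdf is simply an object name that happens to contain a slash, and it should be passed un-encoded (no %2F). A download sketch using the higher-level Google.Cloud.Storage.V1 client, assuming the same bucketName and object name as in the question:
// Download my-bucket/dec-2017/test.pdf; the slash needs no special handling.
var client = StorageClient.Create();
using (var ms = new MemoryStream())
{
    client.DownloadObject(bucketName, "dec-2017/test.pdf", ms);
    byte[] pdfBytes = ms.ToArray();
}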
I am trying to set the thumbnail for a pull video upload done through the Vimeo API. I am developing this for a C# Windows service, and please note that there are no official libraries for this. For the moment I use this library. I am able to successfully upload the video by following the Vimeo documentation; however, when I try to upload an image to be the thumbnail of a video, I get an issue. According to the Vimeo picture upload documentation, in step 2, I need to upload my thumbnail image via a PUT request. It says that I need to do the following:
PUT https://i.cloud.vimeo.com/video/518016424
.... binary data of your file in the body ....
I can't figure out how to do this. I can get the binary data of the image by using
byte[] byte_array_of_image = File.ReadAllBytes(file);
but how can I send this data to the API and get a response (with or without using the library)? In case it helps, here is my code so far for uploading the video and thumbnail.
var vc = VimeoClient.ReAuthorize(
accessToken: ConfigurationManager.AppSettings["ACCESS_TOKEN"],
cid: ConfigurationManager.AppSettings["API_KEY"],
secret: ConfigurationManager.AppSettings["API_SECRET"]
);
string temporary_video_dir = ConfigurationManager.AppSettings["TEMP_VIDEO_URL"];
Dictionary<string,string> automatic_pull_parameters = new Dictionary<string, string>();
automatic_pull_parameters.Add("type", "pull");
automatic_pull_parameters.Add("link", temporary_video_dir);
var video_upload_request = vc.Request("/me/videos", automatic_pull_parameters, "POST");
string uploaded_URI = video_upload_request["uri"].ToString();
string video_id = uploaded_URI.Split('/')[2];
Library.WriteErrorLog("Succesfully uploaded Video in test folder. Returned Vimeo ID for video: "+ video_id);
var picture_resource_request = vc.Request("/videos/" + video_id + "/pictures", null, "POST");
string picture_resource_link = picture_resource_request["uri"].ToString();
//Library.WriteErrorLog("uri: " + picture_resource_link);
byte[] binary_image_data = File.ReadAllBytes("http://testclient.xitech.com.au/Videos/Images/Closing_2051.jpg"); // fails: File.ReadAllBytes cannot read from a URL
string thumbnail_upload_link = picture_resource_link.Split('/')[4];
Please help! Stuck for hours now.
WebClient has a method called UploadData that fits like a glove. Below is an example of what you can do.
WebClient wb = new WebClient();
wb.Headers.Add("Authorization", "Bearer " + AccessToken);
var file = wb.DownloadData(new Uri("http://testclient.xitech.com.au/Videos/Images/Closing_2051.jpg"));
var asByteArrayContent = wb.UploadData(new Uri(picture_resource_link), "PUT", file);
var asStringContent = Encoding.UTF8.GetString(asByteArrayContent);
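If you would rather use HttpClient than WebClient, a rough equivalent looks like this; it assumes an async calling context and reuses AccessToken and picture_resource_link from the code above:
// Requires: using System.Net.Http;
using (var http = new HttpClient())
{
    http.DefaultRequestHeaders.Authorization =
        new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", AccessToken);
    // Fetch the image bytes, then PUT them to the picture upload link.
    byte[] imageBytes = await http.GetByteArrayAsync(
        "http://testclient.xitech.com.au/Videos/Images/Closing_2051.jpg");
    var response = await http.PutAsync(picture_resource_link, new ByteArrayContent(imageBytes));
    string body = await response.Content.ReadAsStringAsync();
}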
Reference post: Vimeo API C# - Uploading a video
The answer there is not upvoted, but it is worth trying; it worked well in my case.
See the code below:
public ActionResult UploadChapterVideoVimeo(HttpPostedFileBase file, string productID = "")
{
    try
    {
        if (file != null)
        {
            var authCheck = Task.Run(async () => await vimeoClient.GetAccountInformationAsync()).Result;
            if (authCheck.Name != null)
            {
                BinaryContent binaryContent = new BinaryContent(file.InputStream, file.ContentType);
                // Derive an upload chunk size (in bytes) from the file length.
                int chunkSize = 0;
                int contentLength = file.ContentLength;
                int temp1 = contentLength / 1024;
                if (temp1 > 1)
                {
                    chunkSize = temp1 / 1024;
                    chunkSize = chunkSize * 1048576;
                }
                else
                {
                    chunkSize = chunkSize * 1048576;
                }
                binaryContent.OriginalFileName = file.FileName;
                var d = Task.Run(async () => await vimeoClient.UploadEntireFileAsync(binaryContent, chunkSize, null)).Result;
                vmodel.chapter_vimeo_url = "VIMEO-" + d.ClipUri;
            }
            return RedirectToAction("ProductBuilder", "Products", new { productId = EncryptedProductID, message = "Successfully Uploaded video", type = 1 });
        }
    }
    catch (Exception exc)
    {
        return RedirectToAction("ProductBuilder", "Products", new { productId = EncryptedProductID, message = "Failed to Uploaded video " + exc.Message, type = 0 });
    }
    return null;
}
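One caveat with the chunk-size arithmetic above: because of the integer divisions, chunkSize works out to 0 for any file smaller than about 1 MB (in both branches), so you may want to enforce a sensible minimum before calling UploadEntireFileAsync.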
I have an application where I want to merge two files present on S3 into a third file. I thought of using Copy Object with multipart upload. Below is the code.
AmazonS3Config config = new AmazonS3Config();
AmazonS3 s3Client = new AmazonS3Client(accessKeyID, secretAccessKey, config);
// List to store upload part responses.
List<UploadPartResponse> uploadResponses =
new List<UploadPartResponse>();
List<CopyPartResponse> copyResponses =
new List<CopyPartResponse>();
InitiateMultipartUploadRequest initiateRequest =
new InitiateMultipartUploadRequest()
.WithBucketName(targetBucket)
.WithKey(targetObjectKey);
InitiateMultipartUploadResponse initResponse =
s3Client.InitiateMultipartUpload(initiateRequest);
String uploadId = initResponse.UploadId;
try
{
// Get object size.
GetObjectMetadataRequest metadataRequest = new GetObjectMetadataRequest();
metadataRequest.BucketName = sourceBucket;
metadataRequest.Key = sourceObjectKey1;
GetObjectMetadataResponse metadataResponse = s3Client.GetObjectMetadata(metadataRequest);
long objectSize1 = metadataResponse.ContentLength; // in bytes
// Get object size.
GetObjectMetadataRequest metadataRequest2 = new GetObjectMetadataRequest();
metadataRequest2.BucketName = sourceBucket;
metadataRequest2.Key = sourceObjectKey2;
GetObjectMetadataResponse metadataResponse2 = s3Client.GetObjectMetadata(metadataRequest2);
long objectSize2 = metadataResponse2.ContentLength; // in bytes
long bytePosition = 0;
CopyPartRequest copyRequest1 = new CopyPartRequest()
.WithDestinationBucket(targetBucket)
.WithDestinationKey(targetObjectKey)
.WithSourceBucket(sourceBucket)
.WithSourceKey(sourceObjectKey1)
.WithUploadID(uploadId)
.WithFirstByte(bytePosition)
.WithLastByte( objectSize1 - 1 )
.WithPartNumber(1);
copyResponses.Add(s3Client.CopyPart(copyRequest1));
CopyPartRequest copyRequest2 = new CopyPartRequest()
.WithDestinationBucket(targetBucket)
.WithDestinationKey(targetObjectKey)
.WithSourceBucket(sourceBucket)
.WithSourceKey(sourceObjectKey2)
.WithUploadID(uploadId)
.WithFirstByte(bytePosition)
.WithLastByte(objectSize2 - 1)
.WithPartNumber(2);
copyResponses.Add(s3Client.CopyPart(copyRequest2));
////
CompleteMultipartUploadRequest completeRequest =
new CompleteMultipartUploadRequest()
.WithBucketName(targetBucket)
.WithKey(targetObjectKey)
.WithUploadId(initResponse.UploadId)
.WithPartETags(GetETags(copyResponses));
CompleteMultipartUploadResponse completeUploadResponse =
s3Client.CompleteMultipartUpload(completeRequest);
}
catch (Exception e)
{
Console.WriteLine(e.Message);
}
But it is throwing an exception at the last line, CompleteMultipartUpload. Below is the S3 exception: Your proposed upload is smaller than the minimum allowed size
Whereas if I only upload copyRequest1, it works fine.
Any help is appreciated!!
Regards,
Haseena
Did you manage to solve the problem? It seems that it can't be done using the S3 API.
It is not possible to merge the uploaded files using the S3 API, so I am using FTP to download and merge them.
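For context on the original error: S3's multipart rules require every part except the last to be at least 5 MB, and that minimum applies to parts created with CopyPart as well. So the merge in the question only succeeds when the first source object is at least 5 MB; copyRequest1 alone works because a sole part is also the last part. A hedged guard sketch, reusing objectSize1 from the question's code:
const long MinPartSize = 5L * 1024 * 1024; // S3 minimum for every part except the last
// objectSize1 becomes part 1 (not the last part), so it must meet the minimum.
if (objectSize1 < MinPartSize)
{
    // Too small for a server-side CopyPart merge; fall back to downloading
    // both objects and re-uploading the concatenation.
    throw new InvalidOperationException(
        "First source object is below the 5 MB multipart minimum.");
}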