I have an API endpoint that creates files in Google Cloud Storage, and in the end it returns the URI in the gs://bucket_name/file_name format.
How can I write tests in C# to check that the file was indeed created in the cloud? I can download the file with gsutil, but I have not found a C# library that can handle the gs:// URI to check or download the file.
The official documentation shows how to download objects from buckets:
using Google.Cloud.Storage.V1;
using System;
using System.IO;

public class DownloadFileSample
{
    public void DownloadFile(
        string bucketName = "your-unique-bucket-name",
        string objectName = "my-file-name",
        string localPath = "my-local-path/my-file-name")
    {
        var storage = StorageClient.Create();
        using var outputFile = File.OpenWrite(localPath);
        storage.DownloadObject(bucketName, objectName, outputFile);
        Console.WriteLine($"Downloaded {objectName} to {localPath}.");
    }
}
Alternatively:

string bucketName = "your-unique-bucket-name";
string fileDownloadPath = @"C:\pathWhereYouWantToSaveFile\file_Out.pdf";
string objectBlobName = "fileName.pdf";

var gcsStorage = StorageClient.Create();
using var outputFile = File.OpenWrite(fileDownloadPath);
gcsStorage.DownloadObject(bucketName, objectBlobName, outputFile);
Console.WriteLine($"Downloaded {objectBlobName} to {fileDownloadPath}.");
Here you can find more about this implementation
Finally, the official documentation also shows how to do it with gsutil:
gsutil cp gs://BUCKET_NAME/OBJECT_NAME SAVE_TO_LOCATION
You can try to download the object and check for a successful response in order to see if the object was created.
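As a sketch of that idea (the test framework is xUnit; CallYourApiEndpoint and the gs:// parsing helper are hypothetical names, since your endpoint is not shown), a test could parse the returned URI and use StorageClient.GetObject to assert the object exists, without downloading its contents:

```csharp
using System;
using Google.Cloud.Storage.V1;
using Xunit;

public class GcsFileCreationTests
{
    // Hypothetical helper: splits "gs://bucket_name/file_name" into its parts.
    // The "gs" scheme parses like any other URI: Host is the bucket,
    // AbsolutePath (minus the leading slash) is the object name.
    private static (string Bucket, string ObjectName) ParseGsUri(string gsUri)
    {
        var uri = new Uri(gsUri);
        return (uri.Host, uri.AbsolutePath.TrimStart('/'));
    }

    [Fact]
    public void Endpoint_CreatesObjectInBucket()
    {
        string gsUri = CallYourApiEndpoint(); // assumed to return "gs://bucket_name/file_name"
        var (bucket, objectName) = ParseGsUri(gsUri);

        var storage = StorageClient.Create();
        // GetObject throws a GoogleApiException (404) if the object does not
        // exist, which fails the test; no download is needed to verify creation.
        var obj = storage.GetObject(bucket, objectName);
        Assert.Equal(objectName, obj.Name);
    }
}
```

This keeps the test fast because only the object metadata is fetched; use DownloadObject instead if you also want to verify the file contents.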
Related
I am trying to convert doc to pdf using Aspose.Words-Cloud but the conversion API uploads the converted file to cloud and I can't find a clear way to download it to local via code.
This is the code provided by Aspose.Cloud,
private static void downloadToLocal(Configuration config, string path)
{
    StorageApi storageApi = new StorageApi(config);
    GetDownloadRequest request = new GetDownloadRequest();
    request.Path = path;
    request.Storage = null;
    request.VersionId = null;
    var response = storageApi.GetDownload(request);
}
but it is not clear which library StorageApi and GetDownloadRequest are a part of.
It seems you are using an old SDK version of Aspose.Words Cloud. Please note that since the 19.4 release, the Aspose.Words Cloud API has its own storage methods for storage operations. Please use the latest version of the Aspose.Words Cloud SDK for .NET from NuGet to download a file from cloud storage to a local drive, as follows.
P.S: I am a developer evangelist at aspose.cloud.
public static async Task DownloadStorageFile_Word()
{
    Aspose.Words.Cloud.Sdk.Configuration config = new Aspose.Words.Cloud.Sdk.Configuration();
    config.ClientId = ClientId;
    config.ClientSecret = ClientSecret;
    Aspose.Words.Cloud.Sdk.WordsApi wordsApi = new Aspose.Words.Cloud.Sdk.WordsApi(config);

    // Download a file from cloud storage
    var downloadRequest = new Aspose.Words.Cloud.Sdk.Model.Requests.DownloadFileRequest("Output.pdf", null, null);
    var actual = await wordsApi.DownloadFile(downloadRequest);

    var fileStream = System.IO.File.Create(@"C:\Temp\Output.pdf");
    actual.CopyTo(fileStream);
    fileStream.Close();
}
My app requires copying file using SFTP from a location directly to Azure storage.
Our app is using C# with .NET 4.6 and our WinSCP version is 5.21.1.
My old code works using the Session.GetFileToDirectory() method, but the problem is it needs to store the file in a temp folder inside our hosting.
using (Session session = new Session())
{
    session.Open(sessionOptions);

    TransferOptions transferOptions = new TransferOptions();
    transferOptions.TransferMode = TransferMode.Binary;

    var transfer = session.GetFileToDirectory(FilePath, fullPath);
    using (Stream stream = File.OpenRead(transfer.Destination))
    {
        UploadToAzure(stream, Filename, Foldername);
    }
}
As we plan to use Azure storage exclusively, I changed my code to this:
using (Session session = new Session())
{
    session.Open(sessionOptions);

    TransferOptions transferOptions = new TransferOptions();
    transferOptions.TransferMode = TransferMode.Binary;

    using (Stream stream = session.GetFile(FilePath, transferOptions))
    {
        UploadToAzure(stream, Filename, Foldername);
    }
}
Here is my library method that uploads the file to Azure using a Stream.
It works fine with my old code, which still saves the file to a temp folder before sending it to Azure.
public static string UploadToAzure(Stream attachment, string Filename, string Foldername)
{
    System.Net.ServicePointManager.SecurityProtocol = System.Net.SecurityProtocolType.Tls12;

    var connectionString = $"{ConfigurationManager.AppSettings["AzureFileShareConnectionString"]}";
    string shareName = $"{ConfigurationManager.AppSettings["AzureFileShareFolderName"]}";
    string dirName = $"files\\{Foldername}";
    string fileName = Filename;

    try
    {
        ShareClient share = new ShareClient(connectionString, shareName);
        share.CreateIfNotExists();

        ShareDirectoryClient directory = share.GetDirectoryClient(dirName);
        directory.CreateIfNotExists();

        // Get a reference to a file and upload it
        ShareFileClient file = directory.GetFileClient(fileName);
        file.Create(attachment.Length);
        file.UploadRange(new HttpRange(0, attachment.Length), attachment);
    }
    catch (Exception e)
    {
        return $"Uploaded {Filename} failed : {e.ToString()}";
    }
    return $"{Filename} Uploaded";
}
But my new code does not work, failing with this error message:
'((WinSCP.PipeStream)stream).Length' threw an exception of type 'System.NotSupportedException'.
This is the object description when creating the stream using the Session.GetFile method.
This is the exception stack trace from sending the empty stream to Azure.
The Stream returned by WinSCP Session.GetFile does not implement the Stream.Length property, because WinSCP cannot guarantee that the size of the file is fixed. The remote file might be changing while you are downloading the file. Not to mention ASCII transfer mode, when the file is converted while being transferred, with unpredictable impact on the final size.
You use the size (Stream.Length) in two places:
When creating the file:
file.Create(attachment.Length);
The parameter of ShareFileClient.Create is maxSize, so it does not look like it has to be the exact size; you can possibly just put an arbitrarily large number here.
Or if you prefer (and know that the file is not changing), read the current size of the remote file using Session.GetFileInfo and RemoteFileInfo.Length:
file.Create(session.GetFileInfo(FilePath).Length);
When uploading the contents:
file.UploadRange(new HttpRange(0, attachment.Length), attachment);
The above can be replaced with simple ShareFileClient.Upload:
file.Upload(attachment);
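Putting both fixes together, the SFTP-to-Azure transfer might look like the sketch below (it assumes your existing sessionOptions, FilePath, connectionString, shareName, dirName, and fileName variables from the code above):

```csharp
using (Session session = new Session())
{
    session.Open(sessionOptions);

    TransferOptions transferOptions = new TransferOptions();
    transferOptions.TransferMode = TransferMode.Binary;

    // Read the remote size up front, since the WinSCP stream has no Length.
    long remoteLength = session.GetFileInfo(FilePath).Length;

    using (Stream stream = session.GetFile(FilePath, transferOptions))
    {
        ShareClient share = new ShareClient(connectionString, shareName);
        share.CreateIfNotExists();

        ShareDirectoryClient directory = share.GetDirectoryClient(dirName);
        directory.CreateIfNotExists();

        ShareFileClient file = directory.GetFileClient(fileName);
        file.Create(remoteLength);
        // Upload reads the stream to its end, so it never touches Stream.Length.
        file.Upload(stream);
    }
}
```

Note this assumes the remote file does not change between the GetFileInfo call and the transfer.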
I am trying to transfer a file from a local machine to a cloud server using an Amazon S3 bucket.
I am stuck on the steps, as I am very new to the S3 bucket concept. I had a code snippet example for listing files in an S3 bucket, but no luck transferring files (using C#). I am stuck on passing values to UploadObjectFromFilePathAsync(); I am not sure what the object key (2nd parameter in the method) is.
AmazonS3Config ClientConfig = new AmazonS3Config();
ClientConfig.ServiceURL = S3_HOST_ENDPOINT;
IAmazonS3 s3Client = new AmazonS3Client(S3_ACCESS_KEY, S3_SECRET_KEY, ClientConfig);
var ObjectList = s3Client.UploadObjectFromFilePathAsync(S3_BUCKET_NAME,<objectkey>,localfilepath);
You may use the Amazon.S3.Transfer library. Here is an example:
using System;
using System.IO;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Transfer;

public class Uploader
{
    readonly IAmazonS3 s3Client;
    readonly string bucketName;

    public Uploader(IAmazonS3 s3Client, string bucketName)
    {
        this.s3Client = s3Client;
        this.bucketName = bucketName;
    }

    public async Task<string> Upload(string prefix, byte[] data, string ext)
    {
        var uuid = Guid.NewGuid();
        var filename = $"{uuid}.{ext}";
        var key = $"{prefix}/{filename}";
        var stream = new MemoryStream(data);
        using (var transferUtil = new TransferUtility(s3Client))
        {
            await transferUtil.UploadAsync(stream, bucketName, key);
        }
        return filename;
    }
}
This is a snippet from the code I use. prefix and ext could be optional in your use case; here they are used to determine the filename (aka key) under which the file is saved in the S3 bucket. MemoryStream is a class that serves the given data as a stream. UploadAsync also accepts a variety of sources, such as a file path. You should check for the overload most suitable for your application.
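For completeness, calling the snippet above might look like this (the constructor wiring is an assumption, since the original omits how s3Client and bucketName are set; the S3_* constants and localfilepath come from your question):

```csharp
// Configure the client against your endpoint and credentials.
var clientConfig = new AmazonS3Config { ServiceURL = S3_HOST_ENDPOINT };
IAmazonS3 s3Client = new AmazonS3Client(S3_ACCESS_KEY, S3_SECRET_KEY, clientConfig);

// Assumed constructor taking the client and target bucket.
var uploader = new Uploader(s3Client, S3_BUCKET_NAME);

// Read the local file and upload it under "uploads/<guid>.pdf".
byte[] data = File.ReadAllBytes(localfilepath);
string savedName = await uploader.Upload("uploads", data, "pdf");
Console.WriteLine($"Stored as uploads/{savedName}");
```

The object key is simply the full name of the object inside the bucket, slashes included; there is no separate "folder" concept in S3.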
I'm looking to at a way to create a folder inside a bucket using the following client library:
https://cloud.google.com/storage/docs/json_api/v1/json-api-dotnet-samples
I've looked at the following thread, and it appears I need to use the simple upload method.
Creating folder in bucket google cloud storage using php
I can't see any way of specifying an uploadType; all requests appear to use uploadType=resumable.
Looking at the above post and a Fiddler trace, I need to use uploadType=media. Is there a way to accomplish this?
It is a bit more straightforward to do this in the latest version of the API.
You still have to make sure that you check for the "/", as Salem has suggested.
public void AddFolder(string folder)
{
    StorageClient storageClient = StorageClient.Create();
    if (!folder.EndsWith("/"))
        folder += "/";
    var content = Encoding.UTF8.GetBytes("");
    storageClient.UploadObject(bucketName, folder, "application/x-directory", new MemoryStream(content));
}
This worked for me! I don't know if it's the right way to do it, but there isn't enough documentation on this.
public void CreateDir(string FolderName)
{
    if (!FolderName.EndsWith("/"))
        FolderName += "/";
    var uploadStream = new MemoryStream(Encoding.UTF8.GetBytes(""));
    Storage.Objects.Insert(
        bucket: BucketName,
        stream: uploadStream,
        contentType: "application/x-directory",
        body: new Google.Apis.Storage.v1.Data.Object() { Name = FolderName }
    ).Upload();
}
EDIT: I just found out that you can upload a file directly to its destination path, and if the directories/sub-directories don't exist, the upload function will create them for you.
All you need to do is put the folder(s) you want before the file name:
using (MemoryStream ms = new MemoryStream(System.Text.Encoding.ASCII.GetBytes(json)))
{
    string fileName = $"/test/somefolder/{sourceId}_{DateTime.Now.ToString("yyyyMMddHHmmss")}.json";
    await gcpClient.UploadObjectAsync(gcpBucketName, fileName, null, ms);
}
I am creating a class library in which I embedded a stored procedure script file, so I need to read the script as a string and create the stored procedure from C#. The GetManifestResourceStream method needs the full name of the assembly and the script file, which I supplied, but I can't figure out why my stream object is null.
string strNameSpace = System.Reflection.Assembly.GetExecutingAssembly().GetName().Name;

using (Stream stream = Assembly.GetExecutingAssembly()
    .GetManifestResourceStream(strNameSpace + "GP_SOP_AdjustTax.sql"))
{
    // Here stream is null.
    using (StreamReader reader = new StreamReader(stream))
    {
        string result = reader.ReadToEnd();
    }
}
It is sort of strange to build a constant string value from the assembly name... But you are missing the "." when constructing the name:
Assembly.GetExecutingAssembly()
.GetManifestResourceStream(strNameSpace + ".GP_SOP_AdjustTax.sql"))
It would likely be safer and easier to simply hardcode the name:
Assembly.GetExecutingAssembly()
.GetManifestResourceStream("WhateverYoourNamespaceIs.GP_SOP_AdjustTax.sql"))
Note "How to embed and access resources" is available on Micorsoft support site and covers this topic.