How to download a file from Aspose.Cloud to a local drive? - C#

I am trying to convert a doc to pdf using Aspose.Words Cloud, but the conversion API uploads the converted file to the cloud and I can't find a clear way to download it locally via code.
This is the code provided by Aspose.Cloud:
private static void downloadToLocal(Configuration config, string path)
{
    StorageApi storageApi = new StorageApi(config);
    GetDownloadRequest request = new GetDownloadRequest();
    request.Path = path;
    request.Storage = null;
    request.VersionId = null;
    var response = storageApi.GetDownload(request);
}
but it is not clear which library StorageApi and GetDownloadRequest are a part of.

It seems you are using an old version of the Aspose.Words Cloud SDK. Please note that since the 19.4 release, the Aspose.Words Cloud API has its own methods for storage operations. Please use the latest version of the Aspose.Words Cloud SDK for .NET from NuGet to download a file from cloud storage to a local drive, as follows.
P.S.: I am a developer evangelist at aspose.cloud.
public static async Task DownloadStorageFile_Word()
{
    Aspose.Words.Cloud.Sdk.Configuration config = new Aspose.Words.Cloud.Sdk.Configuration();
    config.ClientId = ClientId;
    config.ClientSecret = ClientSecret;
    Aspose.Words.Cloud.Sdk.WordsApi wordsApi = new Aspose.Words.Cloud.Sdk.WordsApi(config);

    // Download a file from cloud storage
    var downloadRequest = new Aspose.Words.Cloud.Sdk.Model.Requests.DownloadFileRequest("Output.pdf", null, null);
    var actual = await wordsApi.DownloadFile(downloadRequest);

    // Save the returned stream to a local file
    using (var fileStream = System.IO.File.Create(@"C:\Temp\Output.pdf"))
    {
        actual.CopyTo(fileStream);
    }
}

Related

Save App service log into Azure Blob Storage, using Block blobs

How can I save our application log information to an Azure Storage blob container as .csv (block blob type)?
My application is developed in ASP.NET Core 6.0.
Please check whether using BlobStream helps to resolve the issue:
_container.CreateIfNotExist();
CloudBlob inputBlob = _container.GetBlobReference(inputBlobUri);
CloudBlob outputBlob = _container.GetBlobReference(outputBlobUri);
using (BlobStream input = inputBlob.OpenRead())
using (BlobStream output = outputBlob.OpenWrite())
{
    ProcessImage(input, output);
    output.Commit();
    outputBlob.Properties.ContentType = "text/csv";
    outputBlob.SetProperties();
    AppRepository<Post> postRepository = new AppRepository<Post>();
    Post post = postRepository.Find(partitionKey, rowkey);
    post.PostImage = outputBlobUri;
    post.State = true;
    postRepository.Update(post);
    postRepository.SubmitChange();
    _queue.DeleteMessage(message);
}
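Note that the snippet above uses the legacy CloudBlob/BlobStream types, which are deprecated. For ASP.NET Core 6.0, a minimal sketch with the current Azure.Storage.Blobs package might look like the following; the connection string, container name, and blob name are placeholders, not values from the question.

// Minimal sketch, assuming the Azure.Storage.Blobs (v12) NuGet package.
// Connection string, container name, and blob name are placeholders.
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public class BlobLogWriter
{
    private readonly BlobContainerClient _container;

    public BlobLogWriter(string connectionString, string containerName)
    {
        _container = new BlobContainerClient(connectionString, containerName);
        _container.CreateIfNotExists();
    }

    // Uploads the given CSV text as a block blob, overwriting any existing blob.
    public async Task UploadCsvLogAsync(string blobName, string csvContent)
    {
        BlobClient blob = _container.GetBlobClient(blobName);
        using var stream = new MemoryStream(Encoding.UTF8.GetBytes(csvContent));
        await blob.UploadAsync(stream, overwrite: true);
    }
}

UploadAsync writes a block blob, which matches the block blob type mentioned in the question.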

Check Google Cloud Storage URI (gs://) contains a file from C#

I have an API endpoint that creates files in Google Cloud Storage, and in the end it returns the URI in the gs://bucket_name/file_name format.
How can I write tests in C# to check whether the file is indeed created in the cloud? I can download the file with gsutil, but I have not found any library in C# that can handle the gs:// URI to check/download the file.
The official documentation shows how to download objects from buckets:
using Google.Cloud.Storage.V1;
using System;
using System.IO;

public class DownloadFileSample
{
    public void DownloadFile(
        string bucketName = "your-unique-bucket-name",
        string objectName = "my-file-name",
        string localPath = "my-local-path/my-file-name")
    {
        var storage = StorageClient.Create();
        using var outputFile = File.OpenWrite(localPath);
        storage.DownloadObject(bucketName, objectName, outputFile);
        Console.WriteLine($"Downloaded {objectName} to {localPath}.");
    }
}
Also you can do it like this:
string fileDownloadPath = @"C:\pathwereyouwanttosavefile\file_Out.pdf";
string objectBlobName = "fileName.pdf";
var gcsStorage = StorageClient.Create();
using var outputFile = File.OpenWrite(fileDownloadPath);
gcsStorage.DownloadObject(bucketName, objectBlobName, outputFile);
Console.WriteLine($"Downloaded {objectBlobName} to {fileDownloadPath}.");
Here you can find more about this implementation.
Finally, the official documentation also shows how to do it with gsutil:
gsutil cp gs://BUCKET_NAME/OBJECT_NAME SAVE_TO_LOCATION
You can try to download the object and check for a successful response in order to see whether the object was created.
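If you only need to verify that the object exists (rather than download it), a minimal sketch along these lines should work with Google.Cloud.Storage.V1; the ParseGsUri helper below is hypothetical glue for splitting the gs://bucket_name/file_name format, not part of the library.

using System;
using System.Net;
using Google;
using Google.Cloud.Storage.V1;

public static class GcsChecker
{
    // Hypothetical helper: splits "gs://bucket_name/path/to/file" into bucket and object name.
    public static (string Bucket, string ObjectName) ParseGsUri(string gsUri)
    {
        var uri = new Uri(gsUri);                       // scheme is "gs"
        return (uri.Host, uri.AbsolutePath.TrimStart('/'));
    }

    // Returns true if the object exists, false if the service reports 404.
    public static bool ObjectExists(string gsUri)
    {
        var (bucket, objectName) = ParseGsUri(gsUri);
        var storage = StorageClient.Create();
        try
        {
            storage.GetObject(bucket, objectName);      // fetches metadata only
            return true;
        }
        catch (GoogleApiException e) when (e.HttpStatusCode == HttpStatusCode.NotFound)
        {
            return false;
        }
    }
}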

C# How to Upload image from local to cloud server using amazon s3 bucket

I am trying to transfer a file from my local machine to a cloud server using an Amazon S3 bucket.
I am stuck with the steps as I am very new to the S3 bucket concept. I had a code snippet example for listing files in an S3 bucket, but no luck on transferring files (using C#). I am stuck on passing values to UploadObjectFromFilePathAsync(); I am not sure what the object key is (the 2nd parameter in the method).
AmazonS3Config ClientConfig = new AmazonS3Config();
ClientConfig.ServiceURL = S3_HOST_ENDPOINT;
IAmazonS3 s3Client = new AmazonS3Client(S3_ACCESS_KEY, S3_SECRET_KEY, ClientConfig);
var ObjectList = s3Client.UploadObjectFromFilePathAsync(S3_BUCKET_NAME,<objectkey>,localfilepath);
You may use the Amazon.S3.Transfer library. Here is an example:
using System;
using System.IO;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Transfer;

public class Uploader
{
    readonly IAmazonS3 s3Client;
    readonly string bucketName;

    public async Task<string> Upload(string prefix, byte[] data, string ext)
    {
        var uuid = Guid.NewGuid();
        var filename = $"{uuid}.{ext}";
        var key = $"{prefix}/{filename}";
        using (var stream = new MemoryStream(data))
        using (var transferUtil = new TransferUtility(s3Client))
        {
            await transferUtil.UploadAsync(stream, bucketName, key);
        }
        return filename;
    }
}
This is a snippet from the code I use. prefix and ext could be optional in your use case; here they are used to build the object key, i.e. the name the file is saved under in the S3 bucket. MemoryStream is a class that serves the given data as a stream. UploadAsync also accepts a variety of sources, such as a file path. You should check for the overload most suitable for your application.
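To address the original question directly: the object key is simply the name (including any "folder" prefix) the file will have inside the bucket. A minimal sketch uploading straight from a local file path with TransferUtility might look like this; the bucket name, key, and local path below are placeholders.

using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Transfer;

public static class S3FileUploader
{
    public static async Task UploadLocalFileAsync(IAmazonS3 s3Client)
    {
        // Placeholder values for illustration only.
        string bucketName = "my-bucket";
        string objectKey = "uploads/photo.jpg";      // the name the object gets in S3
        string localFilePath = @"C:\images\photo.jpg";

        using (var transferUtil = new TransferUtility(s3Client))
        {
            // TransferUtility also accepts a file path directly.
            await transferUtil.UploadAsync(localFilePath, bucketName, objectKey);
        }
    }
}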

file saved path on Azure

I'm new to .NET Core and Azure. I have created an API with SQL Server, and I used Dapper to save the path to the database for POST form-data with an image, like this:
private async Task<string> WriteFile(Image image)
{
    string fileName;
    IFormFile file = image.image;
    long Id = image.ShopId;
    try
    {
        var extension = "." + file.FileName.Split('.')[file.FileName.Split('.').Length - 1];
        fileName = Guid.NewGuid().ToString() + extension;
        var path = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot\\cccc", fileName);
        using (var bits = new FileStream(path, FileMode.Create))
        {
            await file.CopyToAsync(bits);
        }
        Image imageupload = new Image(path, Id);
        toDb(imageupload);
    }
    catch (Exception e)
    {
        return e.Message;
    }
    return fileName;
}

public void toDb(Image imageUpload)
{
    string path = imageUpload.path;
    long ShopId = unchecked((int)imageUpload.ShopId);
    using (IDbConnection dbConnection = Connection)
    {
        string sQuery = "UPDATE shop SET path = @path WHERE ShopId = @ShopId;";
        dbConnection.Open();
        dbConnection.Execute(sQuery, new { path = path, ShopId = ShopId });
    }
}
Before I deployed to Azure, it returned the image path "F:\\xxxx\\yyyyy\\zzzzzz\\aaaaaa\\wwwroot\\bbbbbb\\5d665cbc-679d-4926-862b-4e10f9358e8a.png".
After I deployed, it returns the image path
D:\\home\\site\\wwwroot\\wwwroot\\Shops\\a81c757e-df7e-4cf6-b778-20fc5fcf922d.png
Can I view the image by using this path? If it is possible, how do I view it?
If the error is in the path the file is saved to, how can I fix it? If I change the saved path to wwwroot\\bbbbbb\\5d665cbc-679d-4926-862b-4e10f9358e8a.png, can I view the file from the client app? If that is also not possible, how can I fix this?
Can I view the image by using this path? If it is possible, how do I view it?
Yes, in an Azure WebApp, D:\home is persistent storage shared across instances. You can get more information about the Azure WebApp sandbox from this tutorial.
You can use Kudu (to access your Kudu console, using your deployment credentials, navigate to https://*****.scm.azurewebsites.net, where ***** is the name of your Web App) to view, upload or download the files.
You can also use an FTP tool to download files from or upload files to the Azure WebApp site.
Can I view the file from the client app? If that is also not possible, how can I fix this?
I recommend that you store the image in Azure Storage. It is easy to access from the client side. For more information about how to use Azure Storage, please refer to this document.
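As a rough illustration of that recommendation, a minimal sketch with the Azure.Storage.Blobs package could upload the posted image and return the blob URL, which you would then store with Dapper instead of a server file path; the connection string and container name below are placeholders.

using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Http;

public class ImageBlobService
{
    private readonly BlobContainerClient _container;

    public ImageBlobService(string connectionString)
    {
        // Placeholder container name for illustration.
        _container = new BlobContainerClient(connectionString, "shop-images");
        _container.CreateIfNotExists();
    }

    // Uploads the posted image and returns the blob URL to store in the database
    // instead of a local file-system path.
    public async Task<string> UploadImageAsync(IFormFile file)
    {
        string extension = Path.GetExtension(file.FileName);
        string blobName = Guid.NewGuid().ToString() + extension;

        BlobClient blob = _container.GetBlobClient(blobName);
        using (Stream stream = file.OpenReadStream())
        {
            await blob.UploadAsync(stream, overwrite: true);
        }
        return blob.Uri.ToString();   // the client app can load the image from this URL
    }
}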

Create folder google cloud storage bucket .NET Client Library

I'm looking for a way to create a folder inside a bucket using the following client library:
https://cloud.google.com/storage/docs/json_api/v1/json-api-dotnet-samples
I've looked at the following thread, and it appears I need to use the simple upload method:
Creating folder in bucket google cloud storage using php
I can't see any way of specifying an uploadType; all requests appear to be uploadType=resumable.
Looking at the above post and a Fiddler trace, I need to use uploadType=media. Is there a way to accomplish this?
It is a bit more straightforward to do this in the latest version of the API.
You still have to make sure that you check for the "/", as Salem has suggested.
public void AddFolder(string folderName)
{
    StorageClient storageClient = StorageClient.Create();
    if (!folderName.EndsWith("/"))
        folderName += "/";
    var content = Encoding.UTF8.GetBytes("");
    storageClient.UploadObject(bucketName, folderName, "application/x-directory", new MemoryStream(content));
}
This worked for me! I don't know if it's the right way to do it, but there is not enough documentation on this.
public void CreateDir(string FolderName)
{
    if (!FolderName.EndsWith("/"))
        FolderName += "/";
    var uploadStream = new MemoryStream(Encoding.UTF8.GetBytes(""));
    Storage.Objects.Insert(
        bucket: BucketName,
        stream: uploadStream,
        contentType: "application/x-directory",
        body: new Google.Apis.Storage.v1.Data.Object() { Name = FolderName }
    ).Upload();
}
EDIT: I just found out that you can upload the file directly to its destination object, and if the directories/sub-directories don't exist, the upload function will create them for you.
All you need to do is put the folder(s) you want before the file name:
using (MemoryStream ms = new MemoryStream(System.Text.Encoding.ASCII.GetBytes(json)))
{
    string fileName = $"/test/somefolder/{sourceId}_{DateTime.Now.ToString("yyyyMMddHHmmss")}.json";
    await gcpClient.UploadObjectAsync(gcpBucketName, fileName, null, ms);
}
