I want to upload an image to a specific folder; if that folder does not exist, I want to create it and share it with another email address. I am using the code below:
MetadataChangeSet changeSetFile = new MetadataChangeSet.Builder()
    .SetTitle("Test.jpg")
    .SetMimeType("image/jpeg")
    .Build();
MetadataChangeSet changeSetFolder = new MetadataChangeSet.Builder()
    .SetTitle("New folder")
    .SetMimeType(DriveFolder.MimeType)
    .SetStarred(true)
    .Build();
// Create the folder with the folder change set...
DriveClass.DriveApi
    .GetRootFolder(_googleApiClient)
    .CreateFolder(_googleApiClient, changeSetFolder);
// ...and the file with the file change set.
DriveClass.DriveApi
    .GetRootFolder(_googleApiClient)
    .CreateFile(_googleApiClient, changeSetFile, contentResults.DriveContents);
First, you have to check whether the folder exists. According to this tutorial - Search files on Google Drive with C# - you can search:
By file name
If you know the name of the file, or part of it, you can search on name. Fulltext is also useful: it searches the "full text of the file including name, description, content, and indexable text."
List only directories
It is also possible to search for just one type of file - say you want to return all the Google Sheets in your account. Folders have the MIME type "application/vnd.google-apps.folder", so you can search on just that.
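As a minimal sketch of such a search with the .NET client library (Google.Apis.Drive.v3), assuming driveService is an already-authorized DriveService and "New folder" is the folder name from the question:

var listRequest = driveService.Files.List();
// Restrict to non-trashed folders with the given name.
listRequest.Q = "mimeType = 'application/vnd.google-apps.folder' and name = 'New folder' and trashed = false";
listRequest.Fields = "files(id, name)";
var searchResult = listRequest.Execute();
var existingFolder = searchResult.Files.FirstOrDefault();
// existingFolder?.Id, if non-null, is the ID to use as the parent below.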
Once you have confirmed that the folder exists, get its ID and place it in the Parents property:
var folderId = "0BwwA4oUTeiV1TGRPeTVjaWRDY1E";
// Google.Apis.Drive.v3.Data.File, not System.IO.File.
var fileMetadata = new Google.Apis.Drive.v3.Data.File()
{
    Name = "photo.jpg",
    Parents = new List<string> { folderId }
};
FilesResource.CreateMediaUpload request;
using (var stream = new System.IO.FileStream("files/photo.jpg", System.IO.FileMode.Open))
{
    request = driveService.Files.Create(fileMetadata, stream, "image/jpeg");
    request.Fields = "id";
    request.Upload();
}
var file = request.ResponseBody;
Console.WriteLine("File ID: " + file.Id);
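The question also asks to create the folder when it is missing and to share it with another email address, which the code above does not cover. A hedged sketch using the same Google.Apis.Drive.v3 client; the folder name and email address are placeholders:

// Sketch, assuming an authorized Google.Apis.Drive.v3 DriveService.
// Create the folder (normally you would search for it first, as shown earlier).
var folderMetadata = new Google.Apis.Drive.v3.Data.File
{
    Name = "New folder",
    MimeType = "application/vnd.google-apps.folder"
};
var createRequest = driveService.Files.Create(folderMetadata);
createRequest.Fields = "id";
var newFolderId = createRequest.Execute().Id;
// Share the folder with another account ("writer" grants edit rights).
var permission = new Google.Apis.Drive.v3.Data.Permission
{
    Type = "user",
    Role = "writer",
    EmailAddress = "other.user@example.com" // placeholder address
};
driveService.Permissions.Create(permission, newFolderId).Execute();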
Hope this helps.
I have a folder structure like:
(Container)
└── School
    └── Staffs
        ├── OfficeStaffs
        │   ├── Admin → (blobs)
        │   └── Clerk → (blobs)
        └── Teachers
            ├── SeniorStudents → (Year) → AttendanceReport.xlx, ExamReport.xlx
            └── JuniorStudents → (Year) → AttendanceReport.xlx, ExamReport.xlx
School is my parent folder and all the other folders are subfolders. I need to find blobs by folder name, where the folder may sit anywhere in the middle of the path. For example, the UI lets the user search by staff type, teacher/student type, or year; there is no requirement to drill down folder by folder. If the user selects Teachers, I need to display all teacher and student folders with their blobs. If the user selects a year, I need all blobs belonging to that year folder; I only receive the 'Year' value, without knowing its parent folders. If the user selects both OfficeStaffs and Teachers, I need all subfolders and blobs from both folders.
I tried using a blob prefix to match a middle folder, but had no luck. The prefix always expects the initial folder path followed by the next folders in order, so I cannot match a folder in the middle of the path.
BlobContainerClient client = new BlobContainerClient(connectionString, containerName);
List<FileData> files = new List<FileData>();
await foreach (BlobItem file in client.GetBlobsAsync(prefix: "SeniorStudents"))
{
    files.Add(new FileData
    {
        FileName = file.Name
    });
}
This does not return the blobs under the SeniorStudents folder; it returns nothing. Please help me with this. Thanks.
I have a folder structure like
No, you don't. Unless hierarchical namespace is enabled, all the (sub)folders you see are virtual. The folders are defined by the blob names: each / is shown as a virtual folder in the storage explorer.
In your case you have multiple blobs in a container:
School/Staffs/OfficeStaffs/Admin/Blob1.ext
School/Staffs/OfficeStaffs/Clerk/Blob2.ext
School/Staffs/Teachers/SeniorStudents/2022/AttendanceReport.xlx
School/Staffs/Teachers/SeniorStudents/2022/ExamReports.xlx
School/Staffs/Teachers/JuniorStudents/2022/AttendanceReport.xlx
School/Staffs/Teachers/JuniorStudents/2022/ExamReports.xlx
As you can see, it is a flat list. When you search for blobs by prefix, remember that it works like the C# string.StartsWith method.
So with the prefix School/Staffs/OfficeStaffs/Admin/ you will find blob 1, and the prefix School/Staffs/Teachers will give you blobs 3 to 6. The prefix Staffs does not list any blobs, because no blob name starts with the text Staffs.
In your case, that means you will have to get all blobs and split their names using, for example, string.Split(). The code below finds all blobs that are somewhere inside a folder named SeniorStudents, no matter at what level that virtual folder appears:
class FileData
{
    public string FileName { get; set; }
    // Every path segment except the last (the file name) is a virtual folder.
    public IEnumerable<string> Folders => FileName.Split('/').SkipLast(1);
}
...
await foreach (BlobItem file in client.GetBlobsAsync())
{
    files.Add(new FileData
    {
        FileName = file.Name
    });
}
var targetFiles = files.Where(f => f.Folders.Contains("SeniorStudents"));
In the above example, if you want all 2022 files for all teachers you can do:
var targetFiles = files.Where(f => f.Folders.Contains("Teachers") && f.Folders.Contains("2022"));
Alternative
If you have lots of blobs, the method above forces you to run an inefficient query: you might fetch 2000 blobs to find the 5 that match, because you need all blob names before you can apply the criteria.
As an alternative, you could add tags to your blobs, each tag representing a folder or category. It is then easy to find all blobs carrying a specific tag. Beware of the limits: a blob may have at most 10 tags defined on it; see the docs.
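A sketch of the tag-based lookup, assuming Azure.Storage.Blobs 12.x and that each blob was uploaded with a tag per folder level (the tag key "folder" and its value are assumptions, not something from the question):

// Sketch, assuming blobs were uploaded with tags, e.g. via
// new BlobUploadOptions { Tags = new Dictionary<string, string> { ["folder"] = "SeniorStudents" } }.
BlobServiceClient service = new BlobServiceClient(connectionString);
await foreach (TaggedBlobItem match in service.FindBlobsByTagsAsync("\"folder\" = 'SeniorStudents'"))
{
    Console.WriteLine($"{match.BlobContainerName}/{match.BlobName}");
}

This pushes the filtering to the service side, so only matching blobs cross the network.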
Fetching the folders from a container
string connectionString = "Connection String";
List<string> dir = new List<string>();
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
string containerName = "containerName";
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
var blobitem = containerClient.GetBlobs(prefix: "test");
List<FileData> files = new List<FileData>();
foreach (var file in blobitem)
{
    Console.WriteLine(file.Name);
    string[] sub_names = file.Name.Split('/');
    Console.WriteLine(sub_names.Length);
    files.Add(new FileData
    {
        FileName = file.Name
    });
    // Every segment except the last (the file name itself) is a virtual folder.
    foreach (var folder in sub_names.SkipLast(1))
    {
        if (!dir.Contains(folder))
        {
            dir.Add(folder);
        }
    }
}
foreach (var item in dir)
{
    Console.WriteLine(item);
}
(Screenshots: sample output from the C# code, the file and folder structure in Storage Explorer, and the same container in the Azure portal.)
Update
Fetching blobs from storage, and uploading files to a specific folder, can also be done with the legacy WindowsAzure.Storage SDK, as in the code below.
var blob_Container = GetBlobContainer(containerName);
var blob_Directory = blob_Container.GetDirectoryReference(directoryName);
var blob_Infos = new List<BlobFileInfo>();
var blobs = blob_Directory.ListBlobs().ToList();
foreach (var blob in blobs)
{
    if (blob is CloudBlockBlob)
    {
        var blob_FileName = blob.Uri.Segments.Last().Replace("%20", " ");
        var blob_FilePath = blob.Uri.AbsolutePath.Replace(blob.Container.Uri.AbsolutePath + "/", "").Replace("%20", " ");
        var blob_Path = blob_FilePath.Replace("/" + blob_FileName, "");
        blob_Infos.Add(new BlobFileInfo
        {
            File = blob_FileName,
            Path = blob_Path,
            Blob_FilePath = blob_FilePath,
            Blob = blob
        });
    }
    if (blob is CloudBlobDirectory)
    {
        // Recurse into the virtual sub-directory.
        var blob_Dir = blob.Uri.OriginalString.Replace(blob.Container.Uri.OriginalString + "/", "");
        blob_Dir = blob_Dir.Remove(blob_Dir.Length - 1);
        var subBlobs = ListFolderBlobs(containerName, blob_Dir);
        blob_Infos.AddRange(subBlobs);
    }
}
return blob_Infos;
Question: I want to get file details from an FTP server based on a specific datetime, without using any third-party library.
Problem: My FTP server contains thousands of files, so fetching them all and filtering afterwards takes time.
Is there any quicker way to do this?
string ftpPath = "ftp://directory/";
// Some expression to match against the files... do they have a consistent
// name? This example matches XML files whose names start with 'test'.
Regex matchExpression = new Regex(@"^test.+\.xml$", RegexOptions.IgnoreCase);
// Date filter
DateTime cutOff = DateTime.Now.AddDays(-10);
List<FTPLineResult> results = FTPHelper.GetFilesListSortedByDate(ftpPath, matchExpression, cutOff);
public static List<FTPLineResult> GetFilesListSortedByDate(string ftpPath, Regex nameRegex, DateTime cutoff)
{
    List<FTPLineResult> output = new List<FTPLineResult>();
    FtpWebRequest request = FtpWebRequest.Create(ftpPath) as FtpWebRequest;
    ConfigureProxy(request);
    request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    using (FtpWebResponse response = request.GetResponse() as FtpWebResponse)
    using (StreamReader directoryReader = new StreamReader(response.GetResponseStream(), System.Text.Encoding.ASCII))
    {
        var parser = new FTPLineParser();
        while (!directoryReader.EndOfStream)
        {
            var result = parser.Parse(directoryReader.ReadLine());
            if (!result.IsDirectory && result.DateTime > cutoff && nameRegex.IsMatch(result.Name))
            {
                output.Add(result);
            }
        }
    }
    // Ensure the files are sorted in ascending date order.
    output.Sort((res1, res2) => res1.DateTime.CompareTo(res2.DateTime));
    return output;
}
Problem: My FTP server contains thousands of files, so getting them all and filtering afterwards takes time.
Is there any quicker way to do this?
No.
The only standard FTP API is the LIST command and its companions. All of these give you the list of all files in a folder; there is no standard FTP command that returns files filtered by timestamp.
Some servers support non-standard file masks in the LIST command, so they will let you return only the *.xml files.
See How to get list of files based on pattern matching using FTP?
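For example, a server that honours wildcards in the path would let you do something like the sketch below (this is server-dependent behaviour, not part of RFC 959; the host name is a placeholder):

// Sketch: append a wildcard to the directory URL; only works on servers
// that support non-standard file masks in LIST/NLST.
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://myftp/somedir/*.xml");
request.Method = WebRequestMethods.Ftp.ListDirectory;
using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    while (!reader.EndOfStream)
    {
        // Only names matching *.xml, if the server supports the mask.
        Console.WriteLine(reader.ReadLine());
    }
}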
Similar questions:
Download files from FTP if they are created within the last hour
C# - Download files from FTP which have higher last-modified date
I found an alternative solution for my requirement using FluentFTP.
Explanation:
I download the files from the FTP server (read permission required) into the same folder structure.
Every time the job/service runs, it checks whether the same file (full path) already exists on disk. If not, the file is considered new; I can act on it and download it as well.
It is just an alternative solution.
Code Changes:
private static void GetFiles()
{
    using (FtpClient conn = new FtpClient())
    {
        string ftpPath = "ftp://myftp/";
        string downloadFileName = @"C:\temp\FTPTest\";
        conn.Host = ftpPath;
        //conn.Credentials = new NetworkCredential("ftptest", "ftptest");
        conn.Connect();
        // List everything recursively, including the modification time.
        foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
            FtpListOption.Modify | FtpListOption.Recursive))
        {
            // If this is a file...
            if (item.Type == FtpFileSystemObjectType.File)
            {
                string localFilePath = downloadFileName + item.FullName;
                // ...only newly created files will be downloaded.
                if (!File.Exists(localFilePath))
                {
                    conn.DownloadFile(localFilePath, item.FullName);
                    // Do any action here.
                    Console.WriteLine(item.FullName);
                }
            }
        }
    }
}
I'll explain the problem right away, but first of all... is this achievable?
I have a Document Type in Umbraco where I store data from a form. I can store everything except the file.
...
content.SetValue("notes", item.Notes);
content.SetValue("curriculum", item.Curriculum); /*this is the file*/
...
I'm adding items like this, where SetValue comes from the Umbraco.Core.Models namespace and has the signature void SetValue(string propertyTypeAlias, object value).
The error returned is the following:
"String or binary data would be truncated. The statement has been terminated."
Did I misunderstand something? Shouldn't I be sending the base64? I'm adding the image to a media folder, where Umbraco creates a sub-folder with a sequential number. If I point to an existing folder it appends the file just fine, but if I point to a new media sub-folder it also returns an error. Any ideas on how I should approach this?
Thanks in advance
Edit 1: After Cryothic's answer I updated my code to the following:
byte[] tempByte = Convert.FromBase64String(item.Curriculum);
var mediaFile = _mediaService.CreateMedia(item.cvExtension, -1, Constants.Conventions.MediaTypes.File);
Stream fileStream = new MemoryStream(tempByte);
var fileName = Path.GetFileNameWithoutExtension(item.cvExtension);
mediaFile.SetValue("umbracoFile", fileName, fileStream);
_mediaService.Save(mediaFile);
and the error happens at mediaFile.SetValue(...).
If I upload a file from Umbraco it goes to "http://localhost:3295/media/1679/test.txt", and the next one would go to "http://localhost:3295/media/1680/test.txt". Where in my request do I say that the file has to go into the /media folder with an incremented number? Do I just point to the media folder and Umbraco handles the incrementing?
If I change the SetValue call to mediaFile.SetValue("curriculum", fileName, fileStream), the request succeeds, but the file is not attached to the content itself and is added under "http://localhost:3295/umbraco/media" instead of "http://localhost:3295/media".
If I try content.SetValue("curriculum", item.cvExtension) instead, the file is added to the content, but with the path "http://localhost:3295/umbraco/test.txt".
I don't quite understand how Umbraco inserts files into the media folder (outside /umbraco) and how you attach the media service path to the content service.
Do you need to save base64?
I have done something like that, but using the MediaService.
My project had the option to upload multiple images across multiple wizard steps, and I needed to save them all at once, so I looped through the uploaded files (HttpFileCollection) per step. acceptedFiletypes is a string list with the MIME types I allow.
for (int i = 0; i < files.Count; i++)
{
    byte[] fileData = null;
    UploadedFile uf = null;
    try
    {
        if (acceptedFiletypes.Contains(files[i].ContentType))
        {
            using (var binaryReader = new BinaryReader(files[i].InputStream))
            {
                fileData = binaryReader.ReadBytes(files[i].ContentLength);
            }
            if (fileData.Length > 0)
            {
                uf = new UploadedFile
                {
                    FileName = files[i].FileName,
                    FileType = fileType,
                    FileData = fileData
                };
            }
        }
    }
    catch { } // ignore files that fail to read
    if (uf != null)
    {
        projectData.UploadedFiles.Add(uf);
    }
}
After the last step, I would loop through my projectData.UploadedFiles and do the following:
var service = Umbraco.Core.ApplicationContext.Current.Services.MediaService;
var mediaTypeAlias = "Image";
var mediaItem = service.CreateMedia(fileName, parentFolderID, mediaTypeAlias);
Stream fileStream = new MemoryStream(file.FileData);
mediaItem.SetValue("umbracoFile", fileName, fileStream);
service.Save(mediaItem);
I also had a check to see whether the uploaded filename ended in ".pdf"; in that case I'd change the mediaTypeAlias to "File".
I hope this helps.
I'm displaying files from a SharePoint 2010 document library on my web page. I'm successfully showing a list of the files and their icons. However, some of the files are in subfolders, and instead of showing the file icon I'd like to show a folder icon.
How do I detect when a file is in a subfolder?
I could parse each file's ServerRelativeUrl to determine the folder structure, but I'm hoping there is another way.
(Screenshot: the SharePoint library.)
And here is the code which produces the list of files in that library:
using (ClientContext clientContext = new ClientContext(SharepointSite))
{
    // FSObjType = 0 restricts the query to files (1 would be folders).
    var query = new CamlQuery
    {
        ViewXml = "<View Scope='RecursiveAll'>" +
                      "<Query>" +
                          "<Where>" +
                              "<Eq>" +
                                  "<FieldRef Name='FSObjType' />" +
                                  "<Value Type='Integer'>0</Value>" +
                              "</Eq>" +
                          "</Where>" +
                      "</Query>" +
                  "</View>"
    };
    var sourceList = clientContext.Web.Lists.GetByTitle("Test Library");
    var files = sourceList.GetItems(query);
    clientContext.Load(files);
    clientContext.ExecuteQuery();
    foreach (var file in files)
    {
        var id = file.Id;
        var filename = file["FileLeafRef"].ToString();
        var iconName = clientContext.Web.MapToIcon(filename, string.Empty, IconSize.Size16);
        clientContext.ExecuteQuery();
        var imgUrl = "http://sharepointsite/_layouts/images/" + iconName.Value;
        Image iconImage = new Image { ImageUrl = imgUrl };
        clientContext.Load(file.ParentList);
        clientContext.ExecuteQuery();
        var listUrl = file.ParentList.DefaultDisplayFormUrl;
        HyperLink docLink = new HyperLink
        {
            Text = filename,
            NavigateUrl = listUrl + "?ID=" + id //ToDo: fix
        };
        HtmlTableRow row = new HtmlTableRow();
        HtmlTableCell cell1 = new HtmlTableCell();
        cell1.Controls.Add(iconImage);
        HtmlTableCell cell2 = new HtmlTableCell();
        cell2.Controls.Add(docLink);
        row.Cells.Add(cell1);
        row.Cells.Add(cell2);
        tbFiles.Rows.Add(row);
    }
}
which results in:
(Screenshot: the rendered list of files.)
Only the first file in the list is actually in the top-level library; the rest are in "Test Folder".
It might help to understand the available options for the view scope property of your CAML query:
Default: Gets files and subfolders from a specific folder
RecursiveAll: Gets files and subfolders from all folders
FilesOnly: Gets only files (no folders) from a specific folder
Recursive: Gets only files (no folders) from all folders
If you want to maintain a hierarchical folder structure, you have two options:
Get all files in the library (using a view scope of Recursive or RecursiveAll) and try to reconstruct the file structure yourself by post-processing the results
Get only files and folders from one folder at a time (using a view scope of Default), executing a new query whenever you want to drill down into the contents of a subfolder
Both approaches are equally valid, but in general I'd recommend the second. They both have their downsides: the first requires a larger up-front network request and more post-processing logic to assemble the hierarchy, while the second requires multiple network requests to retrieve all the data.
If taking the second approach, you can limit your CAML query to a specific folder by setting the CamlQuery's FolderServerRelativeUrl property to the URL of the desired folder. When working with the items retrieved from a specific folder, you can check their FileSystemObjectType property to determine if they are files or folders; if they are folders you can access their Folder property to get the associated folder object, from which you can get the ServerRelativeUrl property to use in your next query to get items from that folder.
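A minimal sketch of the second approach, using the properties named above; the folder URL is a placeholder, and sourceList/clientContext are assumed to be set up as in the question:

// Sketch: list one folder at a time and branch on FileSystemObjectType.
var folderQuery = new CamlQuery
{
    ViewXml = "<View><Query /></View>",
    FolderServerRelativeUrl = "/sites/mysite/Test Library/Test Folder" // placeholder
};
var items = sourceList.GetItems(folderQuery);
clientContext.Load(items);
clientContext.ExecuteQuery();
foreach (var item in items)
{
    if (item.FileSystemObjectType == FileSystemObjectType.Folder)
    {
        // Show a folder icon; item["FileRef"] is the server-relative URL
        // you can plug into FolderServerRelativeUrl for the next query.
        Console.WriteLine("Folder: " + item["FileRef"]);
    }
    else
    {
        Console.WriteLine("File: " + item["FileLeafRef"]);
    }
}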
I have this folder and I want to get a specific file from it. How can I do that, given that there could be more than one file in the folder?
I tried to use StartsWith but wasn't able to; is there any way this can be done?
The file is in the CustomerWorkSheet folder and its name starts with the customer id field value. What should I use if I have this path: .../uploads/attachments/CustomerWorkSheet/**File Name Here**
IWordDocument doc = new WordDocument();
doc.Open(
    System.Web.HttpContext.Current.Server.MapPath
        ("~/Content/uploads/attachments/CustomerWorkSheet/"),
    FormatType.Doc);
I need something like this:
if (file.StartsWith(customerId))
but couldn't get the file.
var directoryPath = System.Web.HttpContext.Current.Server.MapPath("~/Content/uploads/attachments/CustomerWorkSheet");
// EnumerateFiles returns full paths, so match on the file name itself.
var file = Directory.EnumerateFiles(directoryPath).SingleOrDefault(f => Path.GetFileName(f).StartsWith(myCustomerId));
if (file != null)
{
    IWordDocument doc = new WordDocument();
    doc.Open(file, FormatType.Doc);
}
Based on your comments, you are looking for this:
var filename = myDirectoryFiles.SingleOrDefault(f => Path.GetFileName(f).StartsWith(myCustomerId + "."));