GitLab API - How to list all files in subfolder - C#

I'm having quite a time trying to get this seemingly simple piece of code to work in C#. I'm trying to get a list of all the files in my repo.
File structure below:
MyRepo
-directory_01
--myFile_01.jpg
--myFile_02.jpg
This line of code lists all my directories and works great:
var streamTask = client.GetStringAsync("https://myURL/api/v4/projects/myRepo/repository/tree?private_token=myToken");
According to the API docs, to get my files I should use the files/:file_path endpoint, or archive.<file extension> to download the files as an archive.
//Get all .jpg files
var streamTask = client.GetStringAsync("https://myURL/api/v4/projects/myRepo/repository/archive.jpg?private_token=myToken");
OR
//Get one file
var streamTask = client.GetStringAsync("https://myURL/api/v4/projects/myRepo/repository/files/directory_01%2FmyFile_01%2Ejpg?private_token=myToken");
//result: 404

When you use this API you also need to pass the branch name.
Change
var streamTask = client.GetStringAsync("https://myURL/api/v4/projects/myRepo/repository/files/directory_01%2FmyFile_01%2Ejpg?private_token=myToken");
To
var streamTask = client.GetStringAsync("https://myURL/api/v4/projects/myRepo/repository/files/directory_01%2FmyFile_01%2Ejpg?private_token=myToken&ref=<branch_name>");
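For the original goal (listing every file in a subfolder), the same tree endpoint also accepts `recursive=true` and `path=<subfolder>` query parameters, which avoids the files/archive endpoints entirely. A minimal sketch, reusing the question's placeholder host and token; the `BuildTreeUrl` helper is mine, not part of any API:

```csharp
// Sketch built around the question's placeholder host/token. GitLab's tree
// endpoint accepts recursive=true and path=<subfolder>; every returned entry
// with "type": "blob" is a file.
using System;

class GitLabTree
{
    // Hypothetical helper that assembles the tree-listing URL.
    public static string BuildTreeUrl(string baseUrl, string project, string token, string path, string branch)
        => $"{baseUrl}/api/v4/projects/{Uri.EscapeDataString(project)}/repository/tree"
         + $"?private_token={token}&ref={branch}&recursive=true&path={Uri.EscapeDataString(path)}";

    static void Main()
    {
        string url = BuildTreeUrl("https://myURL", "myRepo", "myToken", "directory_01", "master");
        Console.WriteLine(url);
        // To actually fetch the listing (requires an async context):
        // using var client = new System.Net.Http.HttpClient();
        // string json = await client.GetStringAsync(url);
    }
}
```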

Just in case anyone else runs into this issue, here's a way to do this:
// Get list of files
string treeUri = $"{project}tree?private_token={projectToken}&ref=master";
string treeData = await client.GetStringAsync(treeUri);
List<AssetEntry> json = System.Text.Json.JsonSerializer.Deserialize<List<AssetEntry>>(treeData);
Console.WriteLine("FILE LIST:\n");
foreach (var item in json)
{
    Console.WriteLine(item.name);
}
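The snippet above deserializes into an `AssetEntry` type that is never shown. A minimal sketch of such a class, matching the lowercase keys GitLab returns (System.Text.Json matches property names case-sensitively by default, so lowercase property names line up with the JSON without extra attributes):

```csharp
// Minimal AssetEntry for the GitLab tree response. Property names are
// lowercase on purpose so they match the JSON keys exactly.
using System;
using System.Collections.Generic;
using System.Text.Json;

public class AssetEntry
{
    public string id { get; set; }
    public string name { get; set; }
    public string type { get; set; }  // "blob" = file, "tree" = folder
    public string path { get; set; }
}

class AssetEntryDemo
{
    static void Main()
    {
        // Shape of one entry as returned by /repository/tree.
        string treeData = "[{\"id\":\"abc123\",\"name\":\"myFile_01.jpg\",\"type\":\"blob\",\"path\":\"directory_01/myFile_01.jpg\"}]";
        List<AssetEntry> json = JsonSerializer.Deserialize<List<AssetEntry>>(treeData);
        Console.WriteLine(json[0].name); // myFile_01.jpg
    }
}
```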


Find Blobs in Azure Storage Container in multi level folder structure using C#

I am having folder structure like
(Container)
  (1) School
    (2) Staffs
      (2.a) OfficeStaffs
        (2.a.i) Admin -> (Blobs)
        (2.a.ii) Clerk -> (Blobs)
      (2.b) Teachers
        (2.b.i) SeniorStudents -> (Year) -> AttendanceReport.xlx, ExamReport.xlx
        (2.b.ii) JuniorStudents -> (Year) -> AttendanceReport.xlx, ExamReport.xlx
School is my parent folder and all the other folders are subfolders. I need to find blobs by folder name, and that folder name may sit in the middle of the path. For example, the UI gives the user search options by staff type, by teacher/student type, or by year; there is no requirement to search the folder levels one by one. If the user selects Teachers, I need to display all teacher and student folders with their respective blobs. If the user selects a year, I need all the blobs belonging to that particular year folder; I only receive the 'Year' value and will not know its parent folders. If the user selects both OfficeStaffs and Teachers, I need to retrieve all the subfolders and blobs from both folders.
I tried using a blob prefix to get a middle folder, but no luck. It always expects the initial folder path followed by the next folders in order; I could not match a folder in the middle.
BlobContainerClient client = new BlobContainerClient(connectionString, containerName);
List<FileData> files = new List<FileData>();
await foreach (BlobItem file in client.GetBlobsAsync(prefix: "SeniorStudents"))
{
    files.Add(new FileData
    {
        FileName = file.Name
    });
}
This is not returning the blobs under the SeniorStudents folder; it comes back empty. Please help me with this. Thanks.
I am having folder structure like
No, you don't. Unless hierarchical namespace is enabled, all the (sub)folders you see are virtual; they are defined by the names of the blobs. Each / is shown as a virtual folder in the storage explorer.
In your case you have multiple blobs in a container:
School/Staffs/OfficeStaffs/Admin/Blob1.ext
School/Staffs/OfficeStaffs/Clerk/Blob2.ext
School/Staffs/Teachers/SeniorStudents/2022/AttendanceReport.xlx
School/Staffs/Teachers/SeniorStudents/2022/ExamReports.xlx
School/Staffs/Teachers/JuniorStudents/2022/AttendanceReport.xlx
School/Staffs/Teachers/JuniorStudents/2022/ExamReports.xlx
As you can see, it is a flat list. When you try to find blobs based on a prefix, remember that it is the equivalent of matching the blob name with C#'s string.StartsWith method.
So with the prefix School/Staffs/OfficeStaffs/Admin/ you will find blob 1, and the prefix School/Staffs/Teachers will give you blobs 3 to 6. The prefix Staffs does not list any blobs, as there are no blobs whose names start with the text Staffs.
In your case, that means you will have to get all blobs and split their names using, for example, string.Split(). The code below finds all blobs that are somewhere inside a folder named SeniorStudents, no matter at what level that virtual folder sits:
class FileData
{
    public string FileName { get; set; }
    public IEnumerable<string> Folders => FileName.Split('/').SkipLast(1);
}
...
await foreach (BlobItem file in client.GetBlobsAsync())
{
    files.Add(new FileData
    {
        FileName = file.Name
    });
}
var targetFiles = files.Where(f => f.Folders.Contains("SeniorStudents"));
In the above example, if you want all 2022 files for all teachers you can do:
var targetFiles = files.Where(f => f.Folders.Contains("Teachers") && f.Folders.Contains("2022"));
Alternative
If you have lots of blobs, the above method forces you to run an inefficient query: you might fetch 2000 blobs to find 5 matches, because you need to retrieve every blob before you can determine whether it matches the criteria.
As an alternative you might want to add tags to your blobs, each tag representing a folder or category. It is then easy to find all blobs carrying a specific tag. Beware of the limits: at most 10 tags may be defined on a given blob; see the docs.
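A sketch of that tag-based approach, assuming the Azure.Storage.Blobs v12 SDK; the tag names (`level`, `year`) and the blob path are invented for illustration:

```csharp
// Sketch: tag blobs with their "folder" names at upload time, then query by
// tag server-side instead of listing every blob. Tag names are made up here.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

class TaggedBlobLookup
{
    // Builds a blob index tag filter, e.g.  "level" = 'SeniorStudents'
    public static string BuildTagFilter(string tagName, string value)
        => $"\"{tagName}\" = '{value}'";

    static async Task Demo(BlobContainerClient container)
    {
        // Upload side: record the interesting folder levels as index tags.
        BlobClient blob = container.GetBlobClient("School/Staffs/Teachers/SeniorStudents/2022/ExamReports.xlx");
        await blob.SetTagsAsync(new Dictionary<string, string>
        {
            ["level"] = "SeniorStudents",
            ["year"] = "2022"
        });

        // Query side: the service does the filtering, so only matches come back.
        await foreach (var item in container.FindBlobsByTagsAsync(BuildTagFilter("level", "SeniorStudents")))
        {
            Console.WriteLine(item.BlobName);
        }
    }
}
```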
Fetching the folders from container
string connectionString = "Connection String";
List<string> dir = new List<string>();
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
string containerName = "containerName";
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
var blobItems = containerClient.GetBlobs(prefix: "test");
List<FileData> files = new List<FileData>();
foreach (var file in blobItems)
{
    Console.WriteLine(file.Name);
    string[] sub_names = file.Name.Split('/');
    Console.WriteLine(sub_names.Length);
    files.Add(new FileData
    {
        FileName = file.Name
    });
    // Collect every folder segment (all but the last, which is the file name).
    for (int i = 0; i < sub_names.Length - 1; i++)
    {
        if (!dir.Contains(sub_names[i]))
        {
            dir.Add(sub_names[i]);
        }
    }
}
foreach (var item in dir)
{
    Console.WriteLine(item);
}
foreach (var item in dir)
{
Console.WriteLine(item);
}
(Screenshots: sample output of the C# code fetching the files and folder structure, and the same blob layout in Storage Explorer and the Azure Portal.)
Update
Fetching blobs from a specific folder, and uploading files to one, can also be done with the older storage SDK's directory API (CloudBlobDirectory/CloudBlockBlob), as below.
var blob_Container = GetBlobContainer(containerName);
var blob_Directory = blob_Container.GetDirectoryReference(directoryName);
var blob_Infos = new List<BlobFileInfo>();
var blobs = blob_Directory.ListBlobs().ToList();
foreach (var blob in blobs)
{
    if (blob is CloudBlockBlob)
    {
        var blob_FileName = blob.Uri.Segments.Last().Replace("%20", " ");
        var blob_FilePath = blob.Uri.AbsolutePath.Replace(blob.Container.Uri.AbsolutePath + "/", "").Replace("%20", " ");
        var blob_Path = blob_FilePath.Replace("/" + blob_FileName, "");
        blob_Infos.Add(new BlobFileInfo
        {
            File = blob_FileName,
            Path = blob_Path,
            Blob_FilePath = blob_FilePath,
            Blob = blob
        });
    }
    if (blob is CloudBlobDirectory)
    {
        var blob_Dir = blob.Uri.OriginalString.Replace(blob.Container.Uri.OriginalString + "/", "");
        blob_Dir = blob_Dir.Remove(blob_Dir.Length - 1);
        var subBlobs = ListFolderBlobs(containerName, blob_Dir);
        blob_Infos.AddRange(subBlobs);
    }
}
return blob_Infos;

SharePoint 2016 FileNotFound Exception

Through C#, I'm trying to download a file from SharePoint (SharePoint 2016). Below is the code I'm using:
ClientContext site = new ClientContext(url);
// Credential setting has no issues, so I am skipping it. I am using NetworkCredential.
Web web = site.Web;
site.Load(web);
site.ExecuteQuery();
List list = web.Lists.GetByTitle("Documents");
site.Load(list);
site.ExecuteQuery();
site.Load(list.RootFolder);
site.ExecuteQuery();
site.Load(list.RootFolder.Folders);
site.ExecuteQuery();
Folder folder = web.GetFolderByServerRelativeUrl(sharePointPath);
site.Load(folder);
site.ExecuteQuery();
site.Load(folder.Files);
site.ExecuteQuery();
While the last "site.ExecuteQuery()" is being executed, an exception is thrown:
ExceptionMessage: File not found
at Microsoft.SharePoint.Client.ClientRequest.ProcessResponseStream
But there are files in that path, and we are able to upload and download manually with the same credentials. URLs, paths, etc. have been double-checked and there are no issues with them.
When I print "folder.ItemCount", it prints the correct number of files in the folder. Only the ExecuteQuery that loads the files throws the exception.
Build settings: .NET framework 4.5 and x64
In other posts, people advised changing to .NET 3.5, but that was for SharePoint 2010. Besides, changing to 3.5 causes a lot of build errors for me.
Please help in fixing this.
Here is a code snippet that downloads a file from the SharePoint default document library and saves it into a local folder:
static void Main(string[] args)
{
    string siteUrl = "http://sp2016/sites/dev";
    ClientContext clientContext = new ClientContext(siteUrl);
    var list = clientContext.Web.Lists.GetByTitle("Documents");
    var listItem = list.GetItemById(5);
    clientContext.Load(list);
    clientContext.Load(listItem, i => i.File);
    clientContext.ExecuteQuery();
    var fileRef = listItem.File.ServerRelativeUrl;
    var fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(clientContext, fileRef);
    var fileName = Path.Combine(@"C:\Test", (string)listItem.File.Name);
    using (var fileStream = System.IO.File.Create(fileName))
    {
        fileInfo.Stream.CopyTo(fileStream);
    }
}
I was working with a collection and the above solution did not work for me. Here is how I did it, in case it helps someone:
List list = web.Lists.GetByTitle("Events");
CamlQuery cmlqry = CamlQuery.CreateAllItemsQuery(); // the original query was defined elsewhere; CreateAllItemsQuery used as a stand-in
ListItemCollection listItems = list.GetItems(cmlqry);
context.Load(listItems);
context.ExecuteQuery();
if (listItems != null)
{
    foreach (var listItem in listItems)
    {
        Console.WriteLine("Id: {0}, Title: {1}", listItem["ID"].ToString(), listItem["Title"].ToString());
        context.Load(listItem.AttachmentFiles);
        context.ExecuteQuery();
        foreach (var file in listItem.AttachmentFiles)
        {
            Console.WriteLine("File: {0}", file.FileName);
            var fileRef = file.ServerRelativeUrl;
            var fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(context, fileRef);
            var fileName = Path.Combine(@"C:\temp\Events\", String.Format("{0}_{1}", listItem.Id, file.FileName));
            using (var fileStream = System.IO.File.Create(fileName))
            {
                fileInfo.Stream.CopyTo(fileStream);
            }
        }
    }
}

Get FTP file details based on datetime in C#

Question: I want to get file details from an FTP server based on a specific datetime, without using any 3rd-party library.
Problem: My FTP server contains 1000s of files, so getting all of them and filtering afterwards takes time.
Is there a quicker way to do this?
string ftpPath = "ftp://directory/";
// Some expression to match against the files... do they have a consistent
// name? This example finds XML files whose names start with 'test'.
Regex matchExpression = new Regex(@"^test.+\.xml$", RegexOptions.IgnoreCase);
// Date filter
DateTime cutOff = DateTime.Now.AddDays(-10);
List<FTPLineResult> results = FTPHelper.GetFilesListSortedByDate(ftpPath, matchExpression, cutOff);
public static List<FTPLineResult> GetFilesListSortedByDate(string ftpPath, Regex nameRegex, DateTime cutoff)
{
    List<FTPLineResult> output = new List<FTPLineResult>();
    FtpWebRequest request = WebRequest.Create(ftpPath) as FtpWebRequest;
    ConfigureProxy(request);
    request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    FtpWebResponse response = request.GetResponse() as FtpWebResponse;
    StreamReader directoryReader = new StreamReader(response.GetResponseStream(), System.Text.Encoding.ASCII);
    var parser = new FTPLineParser();
    while (!directoryReader.EndOfStream)
    {
        var result = parser.Parse(directoryReader.ReadLine());
        if (!result.IsDirectory && result.DateTime > cutoff && nameRegex.IsMatch(result.Name))
        {
            output.Add(result);
        }
    }
    // Ensure the files are sorted in ascending date order.
    output.Sort((res1, res2) => res1.DateTime.CompareTo(res2.DateTime));
    return output;
}
Problem: My FTP server contains 1000s of files, so getting all files and filtering afterwards takes time.
Is there any Quicker way to do this ?
No.
The only standard FTP listing API is the LIST command and its companions, and all of these give you the list of all files in a folder. There is no FTP API that returns files filtered by timestamp.
Some servers support non-standard file masks in the LIST command.
So they will allow you to return only the *.xml files.
See How to get list of files based on pattern matching using FTP?
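A sketch of that non-standard mask approach, using a placeholder host; whether the wildcard works at all depends on the server, so test against yours first:

```csharp
// Sketch: some FTP servers accept a wildcard appended to the listing URL
// (non-standard; many servers ignore or reject it). The host is a placeholder.
using System;

class FtpMaskListing
{
    // Appends the wildcard mask to the directory URL.
    public static string BuildListingUrl(string ftpDirectory, string mask)
        => ftpDirectory.TrimEnd('/') + "/" + mask;

    static void Main()
    {
        string listUrl = BuildListingUrl("ftp://example.com/reports", "test*.xml");
        Console.WriteLine(listUrl); // ftp://example.com/reports/test*.xml

        // Against a real server:
        // var request = (System.Net.FtpWebRequest)System.Net.WebRequest.Create(listUrl);
        // request.Method = System.Net.WebRequestMethods.Ftp.ListDirectory;
        // using var response = (System.Net.FtpWebResponse)request.GetResponse();
        // using var reader = new System.IO.StreamReader(response.GetResponseStream());
        // while (!reader.EndOfStream) Console.WriteLine(reader.ReadLine());
    }
}
```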
Similar questions:
Download files from FTP if they are created within the last hour
C# - Download files from FTP which have higher last-modified date
I found an alternative way to implement my functionality using FluentFTP.
Explanation:
I download the files from FTP (read permission required) into the same folder structure.
So every time the job/service runs, I can check whether the same file (full path) already exists in the physical path. If it does not exist, it can be considered a new file; I can then take some action on it and download it as well.
It's just an alternative solution.
Code Changes:
private static void GetFiles()
{
    using (FtpClient conn = new FtpClient())
    {
        string ftpHost = "myftp"; // FluentFTP expects the host name, not an ftp:// URL
        string downloadFileName = @"C:\temp\FTPTest\";
        conn.Host = ftpHost;
        //conn.Credentials = new NetworkCredential("ftptest", "ftptest");
        conn.Connect();
        // Walk the listing recursively, including modification dates.
        foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
            FtpListOption.Modify | FtpListOption.Recursive))
        {
            // If this is a file...
            if (item.Type == FtpFileSystemObjectType.File)
            {
                string localFilePath = downloadFileName + item.FullName;
                // Only newly created files will be downloaded.
                if (!File.Exists(localFilePath))
                {
                    conn.DownloadFile(localFilePath, item.FullName);
                    // Do any action here.
                    Console.WriteLine(item.FullName);
                }
            }
        }
    }
}

Umbraco Adding base64 File with SetValue

I'll explain the problem right away, but first of all...is this achievable?
I have a Document Type in Umbraco where I store data from a Form. I can store everything except the file.
...
content.SetValue("notes", item.Notes);
content.SetValue("curriculum", item.Curriculum); /*this is the file*/
...
I'm adding items like this, where SetValue comes from the namespace Umbraco.Core.Models and has the signature void SetValue(string propertyTypeAlias, object value).
And the returned error is the following:
"String or binary data would be truncated.
The statement has been terminated."
Did I misunderstand something? Shouldn't I be sending the base64? I'm adding the image to a media file, where it creates a sub-folder with a sequential number. If I point to an existing folder it appends the file just fine, but if I point to a new media sub-folder it also returns an error. Any ideas on how I should approach this?
Thanks in advance
Edit 1: After Cryothic's answer I've updated my code to the following:
byte[] tempByte = Convert.FromBase64String(item.Curriculum);
var mediaFile = _mediaService.CreateMedia(item.cvExtension, -1, Constants.Conventions.MediaTypes.File);
Stream fileStream = new MemoryStream(tempByte);
var fileName = Path.GetFileNameWithoutExtension(item.cvExtension);
mediaFile.SetValue("umbracoFile", fileName, fileStream);
_mediaService.Save(mediaFile);
and the error happens at mediaFile.SetValue(...).
If I upload a file from Umbraco it goes to "http://localhost:3295/media/1679/test.txt", and the next one would go to "http://localhost:3295/media/1680/test.txt". Where in my request do I say that it has to go into the /media folder with the incremented number? Or do I just point to the media folder and Umbraco handles the incrementing?
If I change the SetValue call to mediaFile.SetValue("curriculum", fileName, fileStream), the request succeeds, but the file is not added to the content itself and ends up at "http://localhost:3295/umbraco/media" instead of "http://localhost:3295/media".
If I try the following - content.SetValue("curriculum", item.cvExtension); - the file is added to the content, but with the path "http://localhost:3295/umbraco/test.txt".
I don't quite understand how Umbraco inserts files into the media folder (outside Umbraco) and how you hand the media service path over to the content service.
Do you need to save base64?
I have done something like that, but using the MediaService.
My project had the option to upload multiple images across multiple wizard steps, and I needed to save them all at once. So I looped through the uploaded files (HttpFileCollection) per step. acceptedFiletypes is a string list with the MIME types I'd allow.
for (int i = 0; i < files.Count; i++)
{
    byte[] fileData = null;
    UploadedFile uf = null;
    try
    {
        if (acceptedFiletypes.Contains(files[i].ContentType))
        {
            using (var binaryReader = new BinaryReader(files[i].InputStream))
            {
                fileData = binaryReader.ReadBytes(files[i].ContentLength);
            }
            if (fileData.Length > 0)
            {
                uf = new UploadedFile
                {
                    FileName = files[i].FileName,
                    FileType = fileType,
                    FileData = fileData
                };
            }
        }
    }
    catch { }
    if (uf != null)
    {
        projectData.UploadedFiles.Add(uf);
    }
}
After the last step, I would loop through my projectData.UploadedFiles and do the following:
var service = Umbraco.Core.ApplicationContext.Current.Services.MediaService;
var mediaTypeAlias = "Image";
var mediaItem = service.CreateMedia(fileName, parentFolderID, mediaTypeAlias);
Stream fileStream = new MemoryStream(file.FileData);
mediaItem.SetValue("umbracoFile", fileName, fileStream);
service.Save(mediaItem);
I also had a check to see whether the uploaded filename ended in ".pdf"; in that case I'd change the mediaTypeAlias to "File".
I hope this helps.

Get a specific file from folder

I have this folder and I want to get a specific file from it. How can I do that, given that there could be more than one file in there?
I've tried to use StartsWith but wasn't able to make it work. Is there any way this could be done?
The file is in the CustomerWorkSheet folder and its name starts with the customer id field value. What should I use if I have this path: .../uploads/attachments/CustomerWorkSheet/**File Name Here**
IWordDocument doc = new WordDocument();
doc.Open(
System.Web.HttpContext.Current.Server.MapPath
("~/Content/uploads/attachments/CustomerWorkSheet/"),
FormatType.Doc);
I need something like this:
if (file.StartsWith(customerId))
but couldn't get the file.
var directoryPath = System.Web.HttpContext.Current.Server.MapPath("~/Content/uploads/attachments/CustomerWorkSheet");
var file = Directory.EnumerateFiles(directoryPath).SingleOrDefault(f => f.Contains("CustomerWorkSheet/" + myCustomerId));
if (file != null)
{
    IWordDocument doc = new WordDocument();
    doc.Open(file, FormatType.Doc);
}
Based on your comments, you are looking for this:
var filename = myDirectoryFiles.SingleOrDefault(f => f.Contains("CustomerWorkSheet/" + myCustomerId + "."));
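One caveat with matching on `"CustomerWorkSheet/" + myCustomerId`: `Server.MapPath` returns a Windows path with backslashes, so a forward-slash `Contains` check may never hit. Matching on the file name alone avoids that. A small self-contained sketch; the directory layout and customer id are made up:

```csharp
// Sketch: match on the file name itself rather than the full path, so the
// path separator never matters. Directory and customer id are placeholders.
using System;
using System.IO;
using System.Linq;

class FindCustomerFile
{
    // Returns the first file whose name starts with "<customerId>.", or null.
    public static string FindByCustomerId(string directoryPath, string customerId)
        => Directory.EnumerateFiles(directoryPath)
                    .FirstOrDefault(f => Path.GetFileName(f).StartsWith(customerId + "."));

    static void Main()
    {
        string dir = Path.Combine(Path.GetTempPath(), "CustomerWorkSheet");
        Directory.CreateDirectory(dir);
        File.WriteAllText(Path.Combine(dir, "42.docx"), "");
        File.WriteAllText(Path.Combine(dir, "421.docx"), ""); // must NOT match id "42"
        Console.WriteLine(Path.GetFileName(FindByCustomerId(dir, "42"))); // 42.docx
    }
}
```

The trailing "." in the prefix keeps customer 42 from matching customer 421's file.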
