How can I download files from SharePoint? - C#

I'm browsing my instance, https://instance.sharepoint.com/1234/abc
That page contains a list of several folders and files. How do I download files from that path?
ClientContext cxt = new ClientContext(fullWebUrl);
cxt.Credentials = new SharePointOnlineCredentials(username, new NetworkCredential("", password).SecurePassword);

List list = cxt.Web.Lists.GetByTitle("Documents");
cxt.Load(list);
cxt.ExecuteQuery();

FolderCollection fcol = list.RootFolder.Folders;
List<string> lstFile = new List<string>();
foreach (Folder f in fcol)
{
    if (f.Name == "filename")
    {
        cxt.Load(f.Files);
        cxt.ExecuteQuery();

        FileCollection fileCol = f.Files;
        foreach (Microsoft.SharePoint.Client.File file in fileCol)
        {
            lstFile.Add(file.Name);
        }
    }
}
This fails at the foreach with the error:
Microsoft.SharePoint.Client.CollectionNotInitializedException: 'The collection has not been initialized.'
Shouldn't cxt.ExecuteQuery have taken care of that?

You can do a mass download from a SharePoint library without writing any code. Just do the following:
1. Go to your SharePoint library.
2. Copy the full path, for example:
https://sharepointdomain.com/sites/yoursite/yourLibraryName/forms/Allitems.aspx
Now remove /forms/allitems.aspx from the address and you are left with:
https://sharepointdomain.com/sites/yoursite/yourLibraryName
Copy that, open Windows File Explorer, and paste it into the address bar. The library opens as if it were a local folder, so you can select all the files and copy them to any destination; you can even move them to another SharePoint library. You may be prompted for your SharePoint credentials; enter them as follows:
username: domain\your SharePoint user name
password: your SharePoint login password
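If you later want to script the same trick, the library can sometimes be reached over WebDAV as a UNC path and copied with plain System.IO. This is only a sketch under assumptions that are not part of the answer above: the Windows WebClient service is running, the machine already has an authenticated session with the site, and the \\host@SSL\DavWWWRoot\... form maps to your library.
using System;
using System.IO;

// Hypothetical UNC form of the library (https://... becomes \\host@SSL\DavWWWRoot\...).
var libraryUnc = @"\\sharepointdomain.com@SSL\DavWWWRoot\sites\yoursite\yourLibraryName";
var target = @"C:\Temp\LibraryCopy";

Directory.CreateDirectory(target);

// Copy every file in the library root; use Directory.EnumerateFiles(libraryUnc, "*", SearchOption.AllDirectories) to include subfolders.
foreach (var source in Directory.EnumerateFiles(libraryUnc))
{
    var destination = Path.Combine(target, Path.GetFileName(source));
    File.Copy(source, destination, overwrite: true);
    Console.WriteLine("Copied " + source);
}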

You need to load the FolderCollection fcol first (cxt.Load(fcol)):
List list = cxt.Web.Lists.GetByTitle("Documents");
cxt.Load(list);
cxt.ExecuteQuery();

FolderCollection fcol = list.RootFolder.Folders;
cxt.Load(fcol);
cxt.ExecuteQuery();

List<string> lstFile = new List<string>();
foreach (Folder f in fcol)
{
    if (f.Name == "filename")
    {
        cxt.Load(f.Files);
        cxt.ExecuteQuery();

        FileCollection fileCol = f.Files;
        foreach (Microsoft.SharePoint.Client.File file in fileCol)
        {
            lstFile.Add(file.Name);
        }
    }
}
To download the files:
foreach (Microsoft.SharePoint.Client.File file in fileCol)
{
    var fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(cxt, file.ServerRelativeUrl);

    // Writing to the drive root usually needs elevation; pick a writable folder.
    using (var localStream = System.IO.File.Open(@"c:\" + file.Name, System.IO.FileMode.CreateNew))
    using (var spStream = fileInfo.Stream)
    {
        spStream.CopyTo(localStream);
    }
}
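OpenBinaryDirect bypasses the client object model and needs the full server-relative URL. An alternative that stays within CSOM is File.OpenBinaryStream; here is a minimal sketch using the same cxt and fileCol as above (the C:\Temp download folder is my assumption):
foreach (Microsoft.SharePoint.Client.File file in fileCol)
{
    // OpenBinaryStream queues a request for the file contents...
    ClientResult<System.IO.Stream> data = file.OpenBinaryStream();
    cxt.ExecuteQuery(); // ...and ExecuteQuery actually fetches them.

    var localPath = System.IO.Path.Combine(@"C:\Temp", file.Name);
    using (var localStream = System.IO.File.Create(localPath))
    {
        data.Value.CopyTo(localStream);
    }
}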

Related

C# SharePoint: loop through all files in a folder and all subfolders

I am trying to loop over all files in a folder and all of its subfolders, as deep as the hierarchy goes from the first folder.
I have found a way, but I think it's clumsy and there is probably a much better method.
The code loops through the first folder and its files, then does the same for the subfolders and their files, and then a third time for the next level.
Is there some other way to do this, where I just pick one folder and it walks down the hierarchy automatically?
static void ReadAllSubs(string siteUrl, string siteFolderPath, string localTempLocation)
{
    ClientContext ctx = new ClientContext(siteUrl);
    ctx.AuthenticationMode = ClientAuthenticationMode.Default;

    SecureString passWord = new SecureString();
    string pwd = "xxx";
    foreach (char c in pwd.ToCharArray()) passWord.AppendChar(c);
    ctx.Credentials = new SharePointOnlineCredentials("test@test.com", passWord);

    FolderCollection folderCollection = ctx.Web.GetFolderByServerRelativeUrl("Delte%20dokumenter/07 - Detaljprosjekt").Folders;
    // Don't just load the folder collection, but the property on each folder too
    ctx.Load(folderCollection, fs => fs.Include(f => f.ListItemAllFields));
    // Actually fetch the data
    ctx.ExecuteQuery();

    foreach (Folder folder in folderCollection)
    {
        // LOOP MAIN FOLDER
        Console.WriteLine("---------------FIRST LAYER FOLDER---------------------");
        var item = folder.ListItemAllFields;
        var folderpath = item["FileRef"];

        FolderCollection LoopFolder = ctx.Web.GetFolderByServerRelativeUrl(folderpath.ToString()).Folders;
        ctx.Load(LoopFolder, fs => fs.Include(f => f.ListItemAllFields));
        ctx.ExecuteQuery();
        Console.WriteLine(folderpath);

        // LOOP ALL FILES IN FIRST MAIN FOLDER
        FileCollection mainfiles = ctx.Web.GetFolderByServerRelativeUrl(folderpath.ToString()).Files;
        ctx.Load(mainfiles);
        ctx.ExecuteQuery();

        Console.WriteLine("---------------FIRST LAYER FILES---------------------");
        foreach (File mainfile in mainfiles)
        {
            Console.WriteLine(mainfile.Name);
            Console.WriteLine(mainfile.MajorVersion);
        }

        // LOOP SUBFOLDERS
        Console.WriteLine("---------------SECOND LAYER FOLDER---------------------");
        foreach (Folder ff in LoopFolder)
        {
            var subitem = ff.ListItemAllFields;
            var folderpathsub = subitem["FileRef"];
            Console.WriteLine(folderpathsub);

            // LOOP ALL FILES IN FIRST SUBFOLDER
            FileCollection files = ctx.Web.GetFolderByServerRelativeUrl(folderpathsub.ToString()).Files;
            ctx.Load(files);
            ctx.ExecuteQuery();

            Console.WriteLine("---------------SECOND LAYER FILES---------------------");
            foreach (File file in files)
            {
                Console.WriteLine(file.Name);
                Console.WriteLine(file.MajorVersion);
            }

            var created = (DateTime)item["Created"];
            var modified = (DateTime)item["Modified"];

            Console.WriteLine("---------------THIRD LAYER FOLDER---------------------");
            FolderCollection ThirdLoopFolder = ctx.Web.GetFolderByServerRelativeUrl(folderpathsub.ToString()).Folders;
            ctx.Load(ThirdLoopFolder, fs => fs.Include(f => f.ListItemAllFields));
            ctx.ExecuteQuery();

            foreach (Folder fff in ThirdLoopFolder)
            {
                var item3 = fff.ListItemAllFields;
                var folderpath3 = item3["FileRef"];
                Console.WriteLine(folderpath3);

                // LOOP ALL FILES IN THIRD SUBFOLDER
                FileCollection thirdfiles = ctx.Web.GetFolderByServerRelativeUrl(folderpath3.ToString()).Files;
                ctx.Load(thirdfiles);
                ctx.ExecuteQuery();

                Console.WriteLine("---------------THIRD LAYER FILES---------------------");
                foreach (File file in thirdfiles)
                {
                    Console.WriteLine(file.Name);
                    Console.WriteLine(file.MajorVersion);
                }
            }
        }
    }
}
I can propose two solutions.
First method
The first is a recursive approach similar to your solution.
private static void UseRecursiveMethodToGetAllItems()
{
    using (var context = new ClientContext(WebUrl))
    {
        context.Credentials = new SharePointOnlineCredentials(UserName, Password);

        var rootFolders = context.Web.GetFolderByServerRelativeUrl(LibName).Folders;
        context.Load(rootFolders, folders => folders.Include(f => f.ListItemAllFields));
        context.ExecuteQuery();

        foreach (var folder in rootFolders)
        {
            GetFilesAndFolders(context, folder);
        }

        Console.ReadLine();
    }
}

private static void GetFilesAndFolders(ClientContext context, Folder folder)
{
    if (folder != null && folder.ListItemAllFields.FieldValues.Count > 0)
    {
        Console.WriteLine($"Folder - {folder.ListItemAllFields.FieldValues["FileLeafRef"]}");

        var fileCollection = folder.Files;
        context.Load(fileCollection, files => files.Include(f => f.ListItemAllFields));
        context.ExecuteQuery();

        foreach (var file in fileCollection)
        {
            Console.WriteLine($" -> {file.ListItemAllFields.FieldValues["FileLeafRef"]}");
        }

        var subFolderCollection = folder.Folders;
        context.Load(subFolderCollection, folders => folders.Include(f => f.ListItemAllFields));
        context.ExecuteQuery();

        foreach (var subFolder in subFolderCollection)
        {
            GetFilesAndFolders(context, subFolder);
        }
    }
}
The first function authenticates against the given WebUrl and gets the folders from the root folder (which is the name of the library). The second method is recursive: it first gets all files from the current folder and prints them to the console, then queries the subfolders of that folder and applies the same method to each of them.
I created a sample library with folders and files and verified that the method lists every folder and file in the hierarchy.
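For completeness, the WebUrl, LibName, UserName, and Password values that the snippets in this answer refer to are not defined anywhere in it; they could be declared along these lines (all values are placeholders):
// Placeholders for the values the snippets assume (not part of the original answer).
private const string WebUrl = "https://tenant.sharepoint.com/sites/yoursite";
private const string LibName = "Documents";
private const string UserName = "user@tenant.onmicrosoft.com";

// SharePointOnlineCredentials expects a SecureString (System.Security), so build one from the plain password.
private static readonly SecureString Password = BuildSecureString("xxx");

private static SecureString BuildSecureString(string plain)
{
    var secure = new SecureString();
    foreach (char c in plain) secure.AppendChar(c);
    return secure;
}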
Second method
The second method is a bit more 'flat'. It is possible to create a CAML query that gets all items from the library recursively and then check whether each item is a file or a folder. Every item has a path property, so the hierarchy can still be determined.
private static void UseQueryToGetAllItems()
{
    using (var context = new ClientContext(WebUrl))
    {
        context.Credentials = new SharePointOnlineCredentials(UserName, Password);

        List<ListItem> result = new List<ListItem>();
        try
        {
            ListItemCollectionPosition position = null;
            int page = 1;
            do
            {
                List list = context.Web.Lists.GetByTitle(LibName);

                CamlQuery query = new CamlQuery();
                query.ViewXml = new StringBuilder()
                    .Append("<View Scope=\"RecursiveAll\">")
                    .Append("<Query>")
                    .Append("")
                    .Append("</Query>")
                    .Append("<RowLimit>5000</RowLimit>")
                    .Append("</View>")
                    .ToString();
                query.ListItemCollectionPosition = position;

                ListItemCollection items = list.GetItems(query);
                context.Load(items);
                context.ExecuteQuery();

                position = items.ListItemCollectionPosition;
                if (items.Count > 0)
                    result.AddRange(items);

                page++;
            }
            while (position != null);

            result.ForEach(item =>
            {
                Console.WriteLine($"{item["ID"]}) Path: {item["FileDirRef"]} - Name: {item["FileLeafRef"]} - Type: {item.FileSystemObjectType}");
            });
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex);
        }

        Console.ReadLine();
    }
}
This method authenticates against the same library and then executes a query to get all the items in the list (the query is paged so it can return more than the 5000-item threshold that caps a single query). After the method has the full list, it prints each item's path, name, and type (file or folder; if I remember correctly, the FileSystemObjectType enum also has Web and a couple of other values).
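Since the flat result mixes files and folders, a one-liner can pull out just the files; this assumes the same result list as above and System.Linq in scope:
// Keep only real files; Folder (and Web) entries are skipped.
var filesOnly = result.Where(i => i.FileSystemObjectType == FileSystemObjectType.File).ToList();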
Running it against the same library as the first method produces the same folders and files, just listed flat.
hope it helps :)

Uploading a single SharePoint document with metadata

I defined terms in the Term Store Management Tool and added them as "Managed Metadata" columns in a document library.
I want to upload a document and to update its "Managed Metadata" columns.
In order to do so, I wrote the following code:
void UploadDocument(Document document)
{
    try
    {
        using (ClientContext context = SPHelper.GetClientContext())
        {
            List library = context.Web.Lists.GetByTitle("MyDocumentLibrary");
            FileCreationInformation fileInfo = new FileCreationInformation
            {
                Url = "MyFileTarget",
                Content = document.Content,
                Overwrite = true
            };

            File file = library.RootFolder.Files.Add(fileInfo);
            ListItem item = file.ListItemAllFields;
            item["RegularColumn"] = "some data";
            item["Metadata"] = "some other data";
            item.Update();
            context.ExecuteQuery(); // "The given guid does not exist in the term store." Exception thrown
        }
    }
    catch (Exception ex)
    {
        LogHelper.RecordError("Failed to upload document", ex, System.Reflection.MethodInfo.GetCurrentMethod().Name);
    }
}
I can upload a file and update its regular columns, but I can't update the managed metadata columns.
Is there a way to specify the GUID for item["Metadata"]?
The term GUID can be found in the Term Store Management Tool.
Add a reference to Microsoft.SharePoint.Client.Taxonomy.dll.
Here is a code snippet that sets a managed metadata field value with the TaxonomyFieldValue class:
using (ClientContext context = new ClientContext(sharePointSite))
{
    FileCreationInformation FCInfo = new FileCreationInformation();
    FCInfo.Url = "http://sp2016/sites/dev/Shared%20Documents/Test.txt";
    FCInfo.Overwrite = true;
    FCInfo.Content = System.IO.File.ReadAllBytes(fileToUpload);

    Web web = context.Web;
    List library = web.Lists.GetByTitle(libraryName);
    Microsoft.SharePoint.Client.File uploadfile = library.RootFolder.Files.Add(FCInfo);
    ListItem item = uploadfile.ListItemAllFields;
    item["Title"] = "some data";

    var fields = library.Fields;
    var field = fields.GetByInternalNameOrTitle("managedcolumn");
    context.Load(fields);
    context.Load(field);
    context.ExecuteQuery();

    var taxKeywordField = context.CastTo<TaxonomyField>(field);
    TaxonomyFieldValue termValue = new TaxonomyFieldValue();
    termValue.Label = "TermC";
    termValue.TermGuid = "045830f1-f51e-4bac-b631-5815a7b6125f";
    termValue.WssId = 3; // list-specific lookup id of the term; -1 lets SharePoint resolve it
    taxKeywordField.SetFieldValueByValue(item, termValue);
    item.Update();
    context.ExecuteQuery();

    uploadfile.CheckIn("testcomment", CheckinType.MajorCheckIn);
    context.ExecuteQuery();
}
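If the managed metadata column allows multiple values, the same pattern works with TaxonomyFieldValueCollection instead. A sketch, assuming a multi-value column and using the same context, item, and taxKeywordField as above (labels and GUIDs are placeholders):
// Format: "<WssId>;#<Label>|<TermGuid>", values separated by ";#". -1 lets SharePoint resolve the WssId.
string termsValue = "-1;#TermA|11111111-1111-1111-1111-111111111111;#TermB|22222222-2222-2222-2222-222222222222";

var termValues = new TaxonomyFieldValueCollection(context, termsValue, taxKeywordField);
taxKeywordField.SetFieldValueByValueCollection(item, termValues);
item.Update();
context.ExecuteQuery();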

CSOM Set Sharing on OneDrive Folder

I'd like to set sharing rights on a folder in OneDrive. I know there is a post out there about ListItems, but I need it at a folder level. First, is this possible or am I wasting my time? I've tried the following:
I'm able to get the site object, but I'm not able to get the folder in order to share it. The web object doesn't have the folders available to enumerate through; it says they are not initialized. The code below runs successfully, but the folder object is not usable:
static void Main(string[] args)
{
    var webUrl = "https://tenant-my.sharepoint.com/personal/me_tenant_com";
    var userName = "me";
    string securePassword = "mypassword";

    SecureString sec_pass = new SecureString();
    Array.ForEach(securePassword.ToArray(), sec_pass.AppendChar);

    using (var ctx = new ClientContext(webUrl))
    {
        ctx.Credentials = new SharePointOnlineCredentials(userName, sec_pass);
        var web = ctx.Web;

        ClientResult<Microsoft.SharePoint.Client.Utilities.PrincipalInfo> persons =
            Microsoft.SharePoint.Client.Utilities.Utility.ResolvePrincipal(
                ctx, ctx.Web, "dpunchak@AvvenireInc.com",
                Microsoft.SharePoint.Client.Utilities.PrincipalType.User,
                Microsoft.SharePoint.Client.Utilities.PrincipalSource.All,
                null, true);
        ctx.ExecuteQuery();

        var folder = ctx.Web.GetFolderByServerRelativeUrl("/documents/Test Folder");
        Microsoft.SharePoint.Client.Utilities.PrincipalInfo person = persons.Value;
        //ShareListItem(folder, person, "Read");
    }
}

public static void ShareListItem(ListItem listItem, Principal principal, string permissionLevelName)
{
    var ctx = listItem.Context as ClientContext;
    var roleDefinition = ctx.Site.RootWeb.RoleDefinitions.GetByName(permissionLevelName);

    listItem.BreakRoleInheritance(true, false);
    var roleBindings = new RoleDefinitionBindingCollection(ctx) { roleDefinition };
    listItem.RoleAssignments.Add(principal, roleBindings);
    ctx.ExecuteQuery();
}
I think you have to pass the folder.ListItemAllFields property to ShareListItem().
To avoid the "collection has not been initialized" error, make sure RoleAssignments.Add() is called before ctx.ExecuteQuery():
listItem.RoleAssignments.Add(principal, roleBindings);
ctx.ExecuteQuery();
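Put together, the call site could look roughly like this. The ctx, persons, and ShareListItem pieces come from the question; EnsureUser is my addition, since ShareListItem expects a Principal rather than the PrincipalInfo that ResolvePrincipal returns:
// Resolve the PrincipalInfo to an actual User (a Principal) that RoleAssignments.Add accepts.
Microsoft.SharePoint.Client.Utilities.PrincipalInfo person = persons.Value;
User user = ctx.Web.EnsureUser(person.LoginName);
ctx.Load(user);

// Share the folder itself by sharing its backing list item.
Folder folder = ctx.Web.GetFolderByServerRelativeUrl("/documents/Test Folder");
ListItem folderItem = folder.ListItemAllFields;
ctx.Load(folderItem);
ctx.ExecuteQuery();

ShareListItem(folderItem, user, "Read");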

How to display all folders and subfolders from FTP using a C# Windows application?

I want to display all the files and subfolders from an FTP server using C#. How do I do that? Can anyone post some code?
I have tried to load the data from FTP into a TreeView, but I cannot get it to work.
My code:
private TreeNode CreateDirectoryNode(string root, string p)
{
    var directoryNode = new TreeNode("CGT");

    // List the entries of the current directory (GetDirectoryListing is my own helper).
    var directoryListing = GetDirectoryListing(root);
    var directories = directoryListing.Where(d => d.IsDirectory);
    var files = directoryListing.Where(d => !d.IsDirectory);

    foreach (var dir in directories)
    {
        directoryNode.Nodes.Add(CreateDirectoryNode(dir.FullPath, dir.Name));
    }
    foreach (var file in files)
    {
        directoryNode.Nodes.Add(new TreeNode(file.Name));
    }
    return directoryNode;
}
Treat this code as an example only. Please help me out here.
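GetDirectoryListing is not shown in the question, so here is one possible shape for it as a sketch. The FtpEntry type, the credentials, and the assumption that the server returns Unix-style LIST output (lines starting with "d" for directories) are all mine, not the asker's:
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

// Minimal holder for one FTP listing entry (hypothetical type used by CreateDirectoryNode).
public class FtpEntry
{
    public string Name { get; set; }
    public string FullPath { get; set; }
    public bool IsDirectory { get; set; }
}

public static List<FtpEntry> GetDirectoryListing(string path)
{
    var entries = new List<FtpEntry>();

    var request = (FtpWebRequest)WebRequest.Create(path);
    request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    request.Credentials = new NetworkCredential("ftpUser", "ftpPassword"); // placeholders

    using (var response = (FtpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            // Unix-style listing: permissions, links, owner, group, size, date (3 parts), name.
            var parts = line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
            if (parts.Length < 9) continue;

            var name = parts[8];
            entries.Add(new FtpEntry
            {
                Name = name,
                FullPath = path.TrimEnd('/') + "/" + name,
                IsDirectory = line.StartsWith("d")
            });
        }
    }

    return entries;
}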

Denied access to User Folder

I need to find my pictures in my user folder, but I get a runtime Access Denied error.
Here is my code:
static void Main(string[] args)
{
    string pic = "*.jpg";
    string b = Environment.GetFolderPath(Environment.SpecialFolder.UserProfile);
    string appdata = Path.Combine(b, "AppData");      // I don't want to search this folder.
    string data = Path.Combine(b, "Data aplikací");   // Not this one either.

    foreach (string d in Directory.GetDirectories(b))
    {
        try
        {
            if ((d == data) || (d == appdata))
            {
                continue;
            }
            else
            {
                foreach (string f in Directory.GetFiles(d, pic))
                {
                    //...
                }
            }
        }
        catch (System.Exception excpt)
        {
            Console.WriteLine(excpt.Message);
        }
    }
}
Running the application as admin doesn't help either. How can I avoid this?
Check whether the folder is read-only (in Windows); if it is, just clear the read-only flag.
If it isn't read-only, make sure the admin user has full rights on that folder. You can check this by right-clicking the folder --> Properties --> Security.
Check out this link for more information on how to set it programmatically:
C# - Set Directory Permissions for All Users in Windows 7
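For reference, granting a user full control on a folder from code looks roughly like this. It is only a sketch: it assumes .NET Framework (on .NET Core/.NET the same GetAccessControl/SetAccessControl calls come from the System.IO.FileSystem.AccessControl package) and the path and account name are placeholders:
using System.IO;
using System.Security.AccessControl;

static void GrantFullControl(string folderPath, string account)
{
    var directory = new DirectoryInfo(folderPath);
    DirectorySecurity security = directory.GetAccessControl();

    // Grant full control to the account, inherited by subfolders and files.
    security.AddAccessRule(new FileSystemAccessRule(
        account,
        FileSystemRights.FullControl,
        InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
        PropagationFlags.None,
        AccessControlType.Allow));

    directory.SetAccessControl(security);
}

// e.g. GrantFullControl(@"C:\Users\SomeUser\Pictures", @"DOMAIN\someUser");  (placeholder values)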
Oh, don't go changing your directory/folder permissions - that's just asking for future pain.
There's no "one-liner" solution here - basically, you need to recursively walk through the folder structure looking for the files you care about, and absorbing/eating the UnauthorizedAccessExceptions along the way (you could avoid the exception altogether by checking DirectoryInfo.GetAccessControl, but that's a whole different question)
Here's a blob o'code:
void Main()
{
    var profilePath = Environment
        .GetFolderPath(Environment.SpecialFolder.UserProfile);
    var imagePattern = "*.jpg";
    var dontLookHere = new[]
    {
        "AppData", "SomeOtherFolder"
    };

    var results = new List<string>();
    var searchStack = new Stack<string>();
    searchStack.Push(profilePath);

    while (searchStack.Count > 0)
    {
        var path = searchStack.Pop();
        var folderName = new DirectoryInfo(path).Name;
        if (dontLookHere.Any(verboten => folderName == verboten))
        {
            continue;
        }

        Console.WriteLine("Scanning path {0}", path);
        try
        {
            var images = Directory.EnumerateFiles(
                path,
                imagePattern,
                SearchOption.TopDirectoryOnly);
            foreach (var image in images)
            {
                Console.WriteLine("Found an image! {0}", image);
                results.Add(image);
            }

            var subpaths = Directory.EnumerateDirectories(
                path,
                "*.*",
                SearchOption.TopDirectoryOnly);
            foreach (var subpath in subpaths)
            {
                searchStack.Push(subpath);
            }
        }
        catch (UnauthorizedAccessException nope)
        {
            Console.WriteLine("Can't access path: {0}", path);
        }
    }
}
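On newer runtimes (.NET Core 2.1+ / .NET 5+), the manual stack and the exception eating can be avoided entirely with EnumerationOptions. A minimal sketch, assuming you still want to skip AppData by filtering the results afterwards:
using System;
using System.IO;
using System.Linq;

void Main()
{
    var profilePath = Environment.GetFolderPath(Environment.SpecialFolder.UserProfile);

    // IgnoreInaccessible skips folders that would otherwise throw UnauthorizedAccessException.
    var options = new EnumerationOptions
    {
        IgnoreInaccessible = true,
        RecurseSubdirectories = true
    };

    var images = Directory.EnumerateFiles(profilePath, "*.jpg", options)
        .Where(p => !p.Contains(Path.DirectorySeparatorChar + "AppData" + Path.DirectorySeparatorChar))
        .ToList();

    images.ForEach(Console.WriteLine);
}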
