I have a function (shown below, which works as expected) that creates a SharePoint folder. For simplicity's sake we can ignore the folder-creation operation and assume the folder already exists within the SharePoint site.
I am curious how to open a given folder using the API and add a PowerPoint file to it. The purpose of the PowerPoint is that each newly created folder will contain a template PowerPoint which the user can then copy and change, removing the need for the user to download the template themselves and add it to the folder manually.
As mentioned earlier, for simplicity we can assume the folder already exists, so I would just need to access it using
var sharepoint = await graphClient.Sites.GetByPath("/sites/SiteFolder", "localhost.sharepoint.com").Request().GetAsync();
Then perform an Add operation similar to the one used to create a new folder.
I know I'd either need to read the binary data from the PowerPoint and pass that to some File object, or, if there's a simpler way, use the direct link to the template PowerPoint, create a copy of it, and insert the copy into the SharePoint folder.
public async Task<string> Sharepoint_FolderCreate(string NewFolderName, string sharepoint_folder_path = "/SomeFolderPath")
{
    // Strip characters that SharePoint does not allow in folder names
    var item = new DriveItem
    {
        Name = NewFolderName.Replace("?", " ").Replace("/", " ").Replace("\\", " ").Replace("<", " ").Replace(">", " ").Replace("*", " ").Replace("\"", " ").Replace(":", " ").Replace("|", " "),
        Folder = new Folder { },
        AdditionalData = new Dictionary<string, object>()
        {
            { "@microsoft.graph.conflictBehavior", "rename" }
        }
    };
    var scopes = new[] { "https://graph.microsoft.com/.default" };
    var options = new TokenCredentialOptions
    {
        AuthorityHost = AzureAuthorityHosts.AzurePublicCloud
    };
    // https://docs.microsoft.com/dotnet/api/azure.identity.clientsecretcredential
    var clientSecretCredential = new ClientSecretCredential(tenantID, clientId, clientSecret, options);
    var graphClient = new GraphServiceClient(clientSecretCredential, scopes);
    var sharepoint = await graphClient.Sites.GetByPath("/sites/SiteFolder", "localhost.sharepoint.com").Request().GetAsync();
    await graphClient.Sites[sharepoint.Id].Drive.Root.ItemWithPath(sharepoint_folder_path).Children.Request().AddAsync(item);
    var NewFolder = await graphClient.Sites[sharepoint.Id].Drive.Root.ItemWithPath($"{sharepoint_folder_path}/{item.Name}").Request().GetAsync();
    return NewFolder.WebUrl;
}
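To address the question itself: with the same Graph client, the most direct route is to read the template's bytes and PUT them to a drive item path inside the target folder. Below is a minimal sketch under that assumption; the helper name, the siteId/localTemplatePath parameters, and the Template.pptx file name are placeholders of mine, and it uses the same Microsoft.Graph .Request()-style SDK as the function above. A single PUT is fine for small files (roughly under 4 MB); larger files need an upload session. If the template already lives in SharePoint, the Copy action on its drive item (.Copy(name, parentReference).Request().PostAsync()) avoids re-uploading the bytes.
// Hypothetical helper: uploads a local template .pptx into an existing SharePoint folder.
public async Task<string> Sharepoint_AddTemplate(GraphServiceClient graphClient, string siteId,
    string sharepoint_folder_path, string localTemplatePath)
{
    using var stream = System.IO.File.OpenRead(localTemplatePath);

    // PUT the file content to "<folder>/Template.pptx" inside the site's default drive
    var uploaded = await graphClient.Sites[siteId].Drive.Root
        .ItemWithPath($"{sharepoint_folder_path}/Template.pptx")
        .Content
        .Request()
        .PutAsync<DriveItem>(stream);

    return uploaded.WebUrl;
}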
I am trying to extract a list of items in a SharePoint Site below the root site at host.sharepoint.com/sites/mysite. I've tried a bunch of different methods, but only one seems to work:
var host = "host.sharepoint.com:/";
var siteName = "mysite";
var listName = "MyList";
// Generate the Client Connection
var graphHelper = new ApplicationAuthenticatedClient(ClientId, Tenant, ClientSecret);
await graphHelper.ConnectAsync().ConfigureAwait(false);
// Code: itemNotFound
//Message: The provided path does not exist, or does not represent a site
//var list = await graphHelper.GraphClient.Sites[$"{host}{siteName}"].Request().GetAsync();
// Returns a Site, no Lists.
//var list = await graphHelper.GraphClient.Sites[host].Sites[siteName].Request().GetAsync();
//Code: itemNotFound
//Message: The provided path does not exist, or does not represent a site
//var list = await graphHelper.GraphClient.Sites[host].Sites[siteName].Lists[listName].Request().GetAsync();
// List retrieved, but no Items
//var site = await graphHelper.GraphClient.Sites[host].Sites[siteName].Request().Expand("lists").GetAsync();
//var list = await graphHelper.GraphClient.Sites[site.Id].Lists[listName].Request().Expand("Items").GetAsync();
//Code: invalidRequest
//Message: Can only provide expand and select for expand options
//var queryOptions = new List<QueryOption>() { new QueryOption("expand", "fields") };
// This works
var site = await graphHelper.GraphClient.Sites[host].Sites[siteName].Request().GetAsync();
var list = await graphHelper.GraphClient.Sites[site.Id].Lists[listName].Items.Request().Expand("Fields").GetAsync();
I've finally managed to get it to connect, but I'm wondering if there's a better way to navigate to the list, rather than the two API calls? (Assuming that I don't know the Site ID beforehand)
Edit: Using the Graph Explorer, I can access the items using https://graph.microsoft.com/v1.0/sites/{host}.sharepoint.com:/sites/{siteName}:/lists/{listName}/items?expand=fields, but I don't know how (or whether it's possible) to make that call in a single request with the .NET API.
It appears that I was on the right track with var list = await graphHelper.GraphClient.Sites[$"{host}{siteName}"].Request().GetAsync(); but the URI was not formatted correctly.
The correct Site ID for https://host.sharepoint.com/sites/mysite/MyList is:
Sites["host.sharepoint.com:/sites/mysite:"]
Retrieving the list from the code in my original question would look like this:
var host = "host.sharepoint.com";
var siteName = "mysite";
var listName = "MyList";
// Generate the Client Connection
var graphHelper = new ApplicationAuthenticatedClient(ClientId, Tenant, ClientSecret);
await graphHelper.ConnectAsync().ConfigureAwait(false);
var list = await graphHelper.GraphClient.Sites[$"{host}:/sites/{siteName}:"].Lists[listName].Request().GetAsync();
It's possible in one API call:
GET https://graph.microsoft.com/v1.0/sites/{host}.sharepoint.com:/sites/{siteName}:/lists/{listTitle}/items?$expand=Fields
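For reference, the same one-call pattern through the .NET SDK looks roughly like the sketch below. It reuses host, siteName, listName and the graphHelper client from the question above; the field access at the end is purely illustrative.
// One round trip: address the site by path and expand the list item fields, mirroring the raw GET above.
var items = await graphHelper.GraphClient
    .Sites[$"{host}:/sites/{siteName}:"]
    .Lists[listName]
    .Items
    .Request()
    .Expand("fields")
    .GetAsync();
foreach (var listItem in items)
{
    // Column values are exposed through the expanded FieldValueSet
    Console.WriteLine(listItem.Fields.AdditionalData["Title"]);
}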
Someone has shared a Box.com folder with me using the link. I need to be able to use the C# SDK or REST API to download the documents from their folder.
I have tried all 3 authentication types and have attempted to access with both the C# SDK and REST API.
//SDK attempt
var findFolder = await client.SharedItemsManager.SharedItemsAsync("https://<userWhoSharedWithMe>.box.com/s/<folderHash>"); // notFound
var folder = await client.FoldersManager.GetInformationAsync(findFolder.Id);
var items = folder.ItemCollection;
//API Attempt
var client = new HttpClient
{
    BaseAddress = new Uri("https://api.box.com")
};
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", "<bearerToken>");
var response = await client.GetAsync("2.0/folders/<folderId>/items");
var content = await response.Content.ReadAsStringAsync();
Is there any way to programmatically download documents from a box folder that was shared with me via link?
-- Edited 06/04/2019
The folder owner and I have tried various things and it seems the API still will not allow me to see the content of the shared folder. Is there anything the folder owner needs to do to make it visible?
Based on the suggestion that I received from a Box employee, I made the following changes.
First the snippet that didn't work as expected:
// DOES NOT WORK
var reader = new StreamReader("box-config.json");
var json = reader.ReadToEnd();
var config = BoxConfig.CreateFromJsonString(json);
var sdk = new BoxJWTAuth(config);
var token = sdk.AdminToken();
var session = new OAuthSession(token, "N/A", 3600, "bearer");
boxClient = new BoxClient(config, session, asUser: boxUserId);
Secondly, the modified version that worked, allowing me to see the folder that was shared with me and to traverse its contents:
// THIS WORKS !!!!!!!!
var reader = new StreamReader("box-config.json");
var json = reader.ReadToEnd();
var config = BoxConfig.CreateFromJsonString(json);
var sdk = new BoxJWTAuth(config);
var token = sdk.UserToken(boxUserId);
boxClient = sdk.UserClient(token, boxUserId);
And for completeness' sake, here's a snippet of code that will allow you to programmatically access a Box folder and traverse its contents:
//folderId <-- You can find this ID by logging into your box account and navigating to the folder that you're interested in accessing programmatically.
var items = await boxClient.FoldersManager.GetFolderItemsAsync(folderId, limit: 5000, offset: 0, autoPaginate: false,
    sort: "name", direction: BoxSortDirection.DESC);
// How many items are in this folder?
Console.WriteLine($"TotalCount: {items.TotalCount}");
// Loop through those items
foreach (var item in items.Entries)
{
    // Get info on each item
    var file = await boxClient.FilesManager.GetInformationAsync(item.Id);
    // Print the filename
    Console.WriteLine($"file: {item.Name}");
}
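Since the original question was about downloading the documents, here is a rough sketch of pulling each file down once the folder can be traversed. It reuses boxClient and items from the snippet above; DownloadStreamAsync is the Box.V2 download call as far as I know (verify against your SDK version), and the local target directory is a placeholder of mine.
// Download every file in the folder to a local directory (illustrative only).
var targetDirectory = @"C:\temp\box-downloads"; // hypothetical local path
System.IO.Directory.CreateDirectory(targetDirectory);
foreach (var entry in items.Entries)
{
    if (entry.Type != "file")
        continue; // skip sub-folders and web links

    using (var remote = await boxClient.FilesManager.DownloadStreamAsync(entry.Id))
    using (var local = System.IO.File.Create(System.IO.Path.Combine(targetDirectory, entry.Name)))
    {
        await remote.CopyToAsync(local);
    }
}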
From the Azure DevOps portal, I can manually add a file or folder to a repository regardless of whether the source code has been cloned or not (image for illustration).
However, I want to programmatically create a folder, and a file inside that folder, within a repository from C# code in my ASP.NET Core application.
Is there an Azure DevOps Services REST API or any other way to do that? I'll use Basic authentication with a PAT token only.
Note: I'm restricted from cloning the source code to a local repository.
An early reply is really appreciated.
I tried HttpClient, GitHttpClient and LibGit2Sharp but failed.
Follow the steps below in your C# code.
Call the Get Refs REST API: https://dev.azure.com/{0}/{1}/_apis/git/repositories/{2}/refs{3}
This should return the ref object for your repository branch, which you can use to push your changes (a plain HttpClient sketch of both calls is shown after the push example below).
Next, call the Pushes REST API to create a folder or file in your repository:
https://dev.azure.com/{0}/{1}/_apis/git/repositories/{2}/pushes{3}
var changes = new List<ChangeToAdd>();
// Add files
// pnp_structure.yml
// Note: ChangeToAdd, ItemBase, Newcontent and CommitToAdd are simple DTOs mirroring the JSON body of the Pushes API
var jsonContent = File.ReadAllText(@"./static-files/somejsonfile.json");
ChangeToAdd changeJson = new ChangeToAdd()
{
    changeType = "add",
    item = new ItemBase() { path = string.Concat(path, "/[your-folder-name]/somejsonfile.json") },
    newContent = new Newcontent()
    {
        contentType = "rawtext",
        content = jsonContent
    }
};
changes.Add(changeJson);
CommitToAdd commit = new CommitToAdd();
commit.comment = "commit from code";
commit.changes = changes.ToArray();
var content = new List<CommitToAdd>() { commit };
var request = new
{
    refUpdates = refs, // ref object(s) returned by the Get Refs call in step 1
    commits = content
};
var personalaccesstoken = _configuration["azure-devOps-configuration-token"];
var authorization = Convert.ToBase64String(System.Text.ASCIIEncoding.ASCII.GetBytes(string.Format("{0}:{1}", "", personalaccesstoken)));
_logger.LogInformation($"[HTTP REQUEST] make a http call with uri: {uri} ");
// here I'm making the HTTP client call
// https://dev.azure.com/{organizationName}/{projectName}/_apis/git/repositories/{repositoryId}/pushes{?api-version}
var result = _httpClient.SendHttpWebRequest(uri, method, data, authorization);
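Because the snippet above depends on helper types (ChangeToAdd, CommitToAdd, SendHttpWebRequest) that aren't shown, here is a rough, self-contained sketch of the same two calls with a plain HttpClient. The organization, project, repository and branch names are placeholders of mine, the JSON body follows the documented shape of the Pushes API, and the ref parsing is left out for brevity.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class DevOpsPushSketch
{
    static async Task Main()
    {
        // Placeholders: organization, project, repository, branch, PAT
        var org = "myOrg";
        var project = "myProject";
        var repoId = "myRepo";
        var pat = Environment.GetEnvironmentVariable("AZDO_PAT");

        var client = new HttpClient { BaseAddress = new Uri($"https://dev.azure.com/{org}/{project}/") };
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes($":{pat}")));

        // Step 1: Get Refs - read the current commit id (objectId) of the target branch
        var refsJson = await client.GetStringAsync(
            $"_apis/git/repositories/{repoId}/refs?filter=heads/main&api-version=6.0");
        var oldObjectId = "<objectId parsed from refsJson>"; // JSON parsing omitted for brevity

        // Step 2: Pushes - add a file inside a new folder in a single commit
        var pushBody = new
        {
            refUpdates = new[] { new { name = "refs/heads/main", oldObjectId } },
            commits = new[]
            {
                new
                {
                    comment = "Add folder and file from code",
                    changes = new[]
                    {
                        new
                        {
                            changeType = "add",
                            item = new { path = "/new-folder/somejsonfile.json" },
                            newContent = new { content = "{ }", contentType = "rawtext" }
                        }
                    }
                }
            }
        };
        var json = System.Text.Json.JsonSerializer.Serialize(pushBody);
        var response = await client.PostAsync(
            $"_apis/git/repositories/{repoId}/pushes?api-version=6.0",
            new StringContent(json, Encoding.UTF8, "application/json"));
        Console.WriteLine(response.StatusCode);
    }
}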
I need to programmatically back up/export a SQL Database (either in Azure, or a compatible one on-premises) to Azure Storage, and restore it to another SQL Database. I would like to use only NuGet packages for code dependencies, since I cannot guarantee that either the build or production servers will have the Azure SDK installed. I cannot find any code examples for something that I assume would be a common action. The closest I found was this:
https://blog.hompus.nl/2013/03/13/backup-your-azure-sql-database-to-blob-storage-using-code/
But this code exports to a local .bacpac file (requiring RoleEnvironment, an SDK-only object). I would think there should be a way to export directly to Blob Storage, without the intermediate file. One thought was to create a Stream and then run:
services.ExportBacpac(stream, "dbnameToBackup")
And then write the stream to storage; however, a MemoryStream wouldn't work here, as this could be a massive database (100-200 GB).
What would be a better way to do this?
Based on my test, Microsoft Azure SQL Management Library 0.51.0-prerelease supports exporting the SQL database .bacpac file directly to Azure Storage.
We can use sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase, exportRequestParameters) to export the .bacpac file to Azure Storage.
However, ImportExport is not available in the latest version of the Microsoft Azure SQL Management Library SDK, so we can only use the 0.51.0-prerelease SDK.
For more details about how to use the Microsoft Azure SQL Management Library to export a SQL backup to Azure Blob Storage, refer to the steps and code below.
Prerequisites:
Register an app in Azure AD and create a service principal for it. For detailed steps about how to register the app and get an access token, please refer to the documentation.
Detailed code:
Notice: Replace clientId, tenantId, secretKey, and subscriptionId with your registered Azure AD information. Replace azureSqlDatabase, resourceGroup, azureSqlServer, adminLogin, adminPassword, storageKey, and storageAccount with your own SQL database and storage values.
static void Main(string[] args)
{
    var subscriptionId = "xxxxxxxx";
    var clientId = "xxxxxxxxx";
    var tenantId = "xxxxxxxx";
    var secretKey = "xxxxx";
    var azureSqlDatabase = "data base name";
    var resourceGroup = "Resource Group name";
    var azureSqlServer = "xxxxxxx"; //testsqlserver
    var adminLogin = "user";
    var adminPassword = "password";
    var storageKey = "storage key";
    var storageAccount = "storage account";
    var baseStorageUri = $"https://{storageAccount}.blob.core.windows.net/brandotest/"; // with container name, ending with "/"
    var backName = azureSqlDatabase + "-" + $"{DateTime.UtcNow:yyyyMMddHHmm}" + ".bacpac"; // backup .bacpac file name
    var backupUrl = baseStorageUri + backName;
    ImportExportOperationStatusResponse exportStatus = new ImportExportOperationStatusResponse();
    try
    {
        ExportRequestParameters exportRequestParameters = new ExportRequestParameters
        {
            AdministratorLogin = adminLogin,
            AdministratorLoginPassword = adminPassword,
            StorageKey = storageKey,
            StorageKeyType = "StorageAccessKey",
            StorageUri = new Uri(backupUrl)
        };
        SqlManagementClient sqlManagementClient = new SqlManagementClient(new Microsoft.Azure.TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
        var export = sqlManagementClient.ImportExport.Export(resourceGroup, azureSqlServer, azureSqlDatabase,
            exportRequestParameters); // do the export operation
        while (exportStatus.Status != Microsoft.Azure.OperationStatus.Succeeded) // poll until the operation succeeds
        {
            Thread.Sleep(1000 * 60);
            exportStatus = sqlManagementClient.ImportExport.GetImportExportOperationStatus(export.OperationStatusLink);
        }
        Console.WriteLine($"Export DataBase {azureSqlDatabase} to Storage {storageAccount} Successfully");
    }
    catch (Exception exception)
    {
        //todo
    }
}

private static string GetAccessToken(string tenantId, string clientId, string secretKey)
{
    var authenticationContext = new AuthenticationContext($"https://login.windows.net/{tenantId}");
    var credential = new ClientCredential(clientId, secretKey);
    var result = authenticationContext.AcquireTokenAsync("https://management.core.windows.net/",
        credential);
    if (result == null)
    {
        throw new InvalidOperationException("Failed to obtain the JWT token");
    }
    var token = result.Result.AccessToken;
    return token;
}
The result looks like this:
1. Send the request to tell SQL Server to start exporting to Azure Blob Storage.
2. Keep sending requests to monitor the export operation status.
3. The export operation finishes.
Here's an idea:
Pass the stream to the .ExportBacpac method, but hold a reference to it on a different thread where you regularly empty and reset the stream so that there's no memory overflow. I'm assuming here that DacServices doesn't need to read back from the stream while it is filling it.
The thing you have to take care of yourself, though, is thread safety: MemoryStreams are not thread-safe by default, so you'd have to write your own locking mechanisms around .Position and .CopyTo. I've not tested this, but if you handle locking correctly I'd assume the .ExportBacpac method won't throw any errors while the other thread accesses the stream.
Here's a very simple example as pseudo-code just outlining my idea:
ThreadSafeStream stream = new ThreadSafeStream();
Task task = new Task(async (exitToken) => {
    MemoryStream partialStream = new MemoryStream();
    // Check if backup completed
    if (...)
    {
        exitToken.Trigger();
    }
    stream.CopyToThreadSafe(partialStream);
    stream.PositionThreadSafe = 0;
    AzureService.UploadToStorage(partialStream);
    await Task.Delay(500); // Play around with this - it shouldn't take too long to copy the stream
});
services.ExportBacpac(stream, "dbnameToBackup");
await TimerService.RunTaskPeriodicallyAsync(task, 500);
It's similar to Brando's answer, but this one uses a stable package:
using Microsoft.WindowsAzure.Management.Sql;
NuGet
Using the same variables as in Brando's answer, the code will look like this:
var azureSqlServer = "xxxxxxx"+".database.windows.net";
var azureSqlServerName = "xxxxxxx";
SqlManagementClient managementClient = new SqlManagementClient(new TokenCloudCredentials(subscriptionId, GetAccessToken(tenantId, clientId, secretKey)));
var exportParams = new DacExportParameters()
{
    BlobCredentials = new DacExportParameters.BlobCredentialsParameter()
    {
        StorageAccessKey = storageKey,
        Uri = new Uri(baseStorageUri)
    },
    ConnectionInfo = new DacExportParameters.ConnectionInfoParameter()
    {
        ServerName = azureSqlServer,
        DatabaseName = azureSqlDatabase,
        UserName = adminLogin,
        Password = adminPassword
    }
};
var exportResult = managementClient.Dac.Export(azureSqlServerName, exportParams);
You can use Microsoft.Azure.Management.Fluent to export your database to a .bacpac file and store it in a blob. To do this, there are a few things you need to do.
Create an Azure AD (Azure Active Directory) application and service principal that can access resources. Follow this link for a comprehensive guide.
From the first step, you are going to need the "Application (client) ID", "Client Secret", and "Tenant ID".
Install the Microsoft.Azure.Management.Fluent NuGet package, and import the Microsoft.Azure.Management.Fluent, Microsoft.Azure.Management.ResourceManager.Fluent, and Microsoft.Azure.Management.ResourceManager.Fluent.Authentication namespaces.
Replace the placeholders in the code snippets below with proper values for your use case.
Enjoy!
var principalClientID = "<Application (Client) ID>";
var principalClientSecret = "<ClientSecret>";
var principalTenantID = "<TenantID>";
var sqlServerName = "<SQL Server Name (without '.database.windows.net')>";
var sqlServerResourceGroupName = "<SQL Server Resource Group>";
var databaseName = "<Database Name>";
var databaseLogin = "<Database Login>";
var databasePassword = "<Database Password>";
var storageResourceGroupName = "<Storage Resource Group>";
var storageName = "<Storage Account>";
var storageBlobName = "<Storage Blob Name>";
var bacpacFileName = "myBackup.bacpac";
var credentials = new AzureCredentialsFactory().FromServicePrincipal(principalClientID, principalClientSecret, principalTenantID, AzureEnvironment.AzureGlobalCloud);
var azure = await Azure.Authenticate(credentials).WithDefaultSubscriptionAsync();
var storageAccount = await azure.StorageAccounts.GetByResourceGroupAsync(storageResourceGroupName, storageName);
var sqlServer = await azure.SqlServers.GetByResourceGroupAsync(sqlServerResourceGroupName, sqlServerName);
var database = await sqlServer.Databases.GetAsync(databaseName);
await database.ExportTo(storageAccount, storageBlobName, bacpacFileName)
.WithSqlAdministratorLoginAndPassword(databaseLogin, databasePassword)
.ExecuteAsync();
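The question also asks about restoring to another SQL Database. Under the same Fluent setup, the import side would look roughly like the sketch below. I'm assuming the Fluent SDK's ImportBacpac(...) entry point on the database object here (verify the exact method and stage names against the package version you use); the target server and database names are placeholders of mine.
// Restore the exported .bacpac into another database (assumed ImportBacpac API; names are placeholders)
var targetServer = await azure.SqlServers.GetByResourceGroupAsync(sqlServerResourceGroupName, "<Target SQL Server Name>");
var targetDatabase = await targetServer.Databases.GetAsync("<Target Database Name>");
await targetDatabase.ImportBacpac(storageAccount, storageBlobName, bacpacFileName)
    .WithSqlAdministratorLoginAndPassword(databaseLogin, databasePassword)
    .ExecuteAsync();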
I'm trying to add permissions to specific folders within a document library using the SharePoint 2013 Client Object Model in C#. In effect, I'm trying to reproduce the behaviour you get when you "Share" a folder via the UI. This is the code I've got so far, but it's not giving the behaviour I'm after. In the code I'm trying to add a single user to the RoleAssignments collection of the folder. Note: The document library does not inherit permissions from the site level.
using (ClientContext ctx = new ClientContext(SPSiteURL))
{
    ctx.AuthenticationMode = ClientAuthenticationMode.Default;
    Web web = ctx.Web;
    Folder AccountFolder = web.GetFolderByServerRelativeUrl("account/" + OurFolderName);
    ctx.Load(AccountFolder);
    ctx.ExecuteQuery();
    ListItem AllFields = AccountFolder.ListItemAllFields;
    ctx.Load(AllFields);
    ctx.ExecuteQuery();
    // Add the user to SharePoint, if they have not already been added
    Principal AccountUser = ctx.Web.EnsureUser(UsersName);
    ctx.Load(AccountUser);
    ctx.ExecuteQuery();
    var info = Utility.ResolvePrincipal(ctx, ctx.Web, AccountUser.LoginName, PrincipalType.All, PrincipalSource.All, null, false);
    ctx.ExecuteQuery();
    Principal ResolvedUser = ctx.Web.EnsureUser(info.Value.LoginName);
    ctx.Load(ResolvedUser);
    ctx.ExecuteQuery();
    // Get the existing RoleAssignments collection for the folder
    RoleAssignmentCollection RoleAssignments = AllFields.RoleAssignments;
    // Create a new RoleDefinitionBindingCollection object
    RoleDefinitionBindingCollection collRDB = new RoleDefinitionBindingCollection(ctx);
    // Get the default "Contribute" role and add it to our RoleDefinitionBindingCollection
    RoleDefinition ContributeRoleDef = ctx.Web.RoleDefinitions.GetByName("Contribute");
    collRDB.Add(ContributeRoleDef);
    // Break the Role Inheritance, but copy the parent roles and propagate our roles down
    AllFields.BreakRoleInheritance(true, true);
    // Add our new RoleAssignment to the RoleAssignmentCollection for the folder
    RoleAssignments.Add(ResolvedUser, collRDB);
    // Push our permission update back to SharePoint
    ctx.ExecuteQuery();
}
The following example demonstrates how to share a folder using the CSOM API:
using (var ctx = new ClientContext(webUri))
{
    var folder = ctx.Web.GetFolderByServerRelativeUrl("/Shared Documents/Archive");
    var folderItem = folder.ListItemAllFields;
    // grant Read permissions to the 'Everyone' security group
    var everyoneSecGroup = ctx.Web.SiteUsers.GetById(4); // get the Everyone security group
    ShareListItem(folderItem, everyoneSecGroup, "Read");
}
where
public static void ShareListItem(ListItem listItem, Principal principal, string permissionLevelName)
{
    var ctx = listItem.Context as ClientContext;
    var roleDefinition = ctx.Site.RootWeb.RoleDefinitions.GetByName(permissionLevelName);
    listItem.BreakRoleInheritance(true, false);
    var roleBindings = new RoleDefinitionBindingCollection(ctx) { roleDefinition };
    listItem.RoleAssignments.Add(principal, roleBindings);
    ctx.ExecuteQuery();
}
Result