Is it possible to remotely upload files to a SharePoint server from a Windows application (C#)?
Thank you.
Yes, although you may need some of the SharePoint assemblies on the "remote" machine in order to achieve what you need.
The article Uploading files using Client Object Model in SharePoint 2010 is a pretty good starting point for SharePoint 2010.
In case you are using the 2007 version (WSS 3.0), you can find a great summary of the different ways to upload files at this link: http://vspug.com/smc750/2009/05/19/uploading-content-into-sharepoint-let-me-count-the-ways/
You must be very careful if your farm is 32-bit: in that case it is very easy to use up all the available memory in the w3wp.exe process if you're uploading large files or many files in parallel, especially if the farm is a busy one. If so, you might want to use the RPC interface described in the link above, since it is the only approach that lets you upload files in chunks. With all the other approaches, the entire file being uploaded must first be loaded into w3wp's memory before it's committed to the SharePoint list item.
For the approaches that involve the SharePoint object model, you might want to write your own web service facade so that clients without the SharePoint DLLs can upload files (plus metadata, if you need it).
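For example, here's a hedged sketch of such a facade: an ASMX web method running on the SharePoint server (so the server object model is available) that remote clients can call without any SharePoint DLLs. All names and the overwrite behavior are illustrative.
using System.Web.Services;
using Microsoft.SharePoint;

[WebMethod]
public void UploadFile(string siteUrl, string folderUrl, string fileName, byte[] content)
{
    using (SPSite site = new SPSite(siteUrl))
    using (SPWeb web = site.OpenWeb())
    {
        // add the uploaded bytes to the target document library folder
        SPFolder folder = web.GetFolder(folderUrl);
        folder.Files.Add(fileName, content, true); // true = overwrite existing files
    }
}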
You can use the client object model in sp2010, rather than talking to the web services directly.
Taken from my profile picture upload application:
http://spc3.codeplex.com/SourceControl/changeset/view/57957#1015709
using (ClientContext context = new ClientContext(siteurl)) {
    context.AuthenticationMode = ClientAuthenticationMode.Default;
    List list = context.Web.Lists.GetByTitle(listname);
    context.Load(list);
    context.Load(list.RootFolder);
    context.ExecuteQuery();

    // CombineUrl is an extension method from the linked project, not part of the CSOM;
    // it simply joins the URL segments
    string url = siteurl.CombineUrl(list.RootFolder.ServerRelativeUrl).CombineUrl(listfolder).CombineUrl(name);

    FileCreationInformation fci = new FileCreationInformation();
    fci.Content = data; // byte[] with the file contents
    fci.Overwrite = true;
    fci.Url = url;
    Microsoft.SharePoint.Client.File file = list.RootFolder.Files.Add(fci);
    context.ExecuteQuery();
}
I wrote a tool, available as a NuGet package in Visual Studio, called SharePoint.DesignFactory.ContentFiles. With this tool you can manage all files, with metadata, to be uploaded to the SharePoint content database. You can use it with SharePoint 2007 (currently you have to work on the SharePoint machine itself) or with SharePoint 2010 and Office 365, in which case you can work from a machine without SharePoint installed. See http://weblogs.asp.net/soever/archive/tags/SharePoint.DesignFactory.ContentFiles/default.aspx for blog posts on the tooling.
I have a SharePoint server and I want to open files directly from the server with the SharePoint CSOM.
The user clicks a button --> the file (Excel, Word, ...) opens on the client machine in its default application.
'Directly' means that if I change something in the file and click save, the file is saved straight back to the SharePoint server (or, if I click e.g. 'Save as' in Excel, the suggested path is 'https://sharpoint.url.com/folder').
Currently I have:
using System.Net;
using Microsoft.SharePoint.Client;

var clientContext = new ClientContext("https://sharpoint.url.com");
clientContext.Credentials = CredentialCache.DefaultCredentials;
string relativePath = "/folder/file.xls";
var file = clientContext.Web.GetFileByServerRelativeUrl(relativePath);
clientContext.Load(file);
clientContext.ExecuteQuery();
What do I have to do now, if I want to open the file directly (no download)?
I assume you're asking how to access the file's stream instead of downloading it to a local folder.
You can use the File.OpenBinaryDirect method to get access to its ETag and stream, e.g.:
using (var fileInfo = File.OpenBinaryDirect(clientContext, "/folder/file.xls"))
using (var reader = new StreamReader(fileInfo.Stream))
{
    // Do whatever you want with the data
}
BTW you shouldn't use the old xls files. The format has been deprecated for over 10 years. The current Excel format, xlsx, is a zipped package of XML files that's better supported by SharePoint itself and doesn't require Excel to generate or read.
For example, if you wanted to read cell values from an xlsx file, you could use the popular EPPlus library to read directly from the stream:
using (var fileInfo = File.OpenBinaryDirect(clientContext, "/folder/file.xlsx"))
using (var package = new ExcelPackage(fileInfo.Stream))
{
    // note: worksheet indexing is 1-based in EPPlus 4 and earlier, 0-based in EPPlus 5+
    var sheet = package.Workbook.Worksheets[0];
    var value = sheet.Cells["A1"].Value;
    //...
}
UPDATE
It seems the question isn't related to programming after all. All that's needed to save or open a SharePoint document is clicking on the document's link. What happens then depends on the Open Documents in Client Applications setting at the site and document library level.
This affects the headers the server sends to the browser when the user clicks on a document link. The browser may still refuse to open the registered application and display the Save dialog.
If that doesn't work, you should check why instead of writing code. It's probably a configuration error or a browser setting. Solving it is easier than creating workarounds, pushing them to all client machines, and then keeping track of all the patches, where they are deployed, and deploying new ones.
Apart from that, the Office applications have known about SharePoint and document libraries since 2003. They can browse them, display SharePoint properties for a document, show collaborators, etc.
As I mentioned in the question comments, a lot of what people think of as "SharePoint development" is nothing more than configuration, administration, and end-user features.
MSDN docs don't help either; they actually cause harm by not covering SP administration or explaining the features and how they are used. You'll find that on TechNet. For years, people created web parts in code to change how grids looked because MSDN didn't explain how, e.g., the DataViewWebPart worked or how you could style a grid from the UI.
In general, the best place for such questions is http://sharepoint.stackexchange.com. For example, check "Open in the client application" vs "Use the server default (Open in the client application)" inside the document library advanced settings.
We can map a network drive to the SharePoint library and open the file from the network location. Check the article below:
http://support.sherweb.com/Faqs/Show/how-to-connect-to-a-sharepoint-site-using-webdav-sharepoint-2013
Or we can download the file from SharePoint and open it via the Excel interop, using code like the following:
Application.Workbooks.Open(@"C:\Test\YourWorkbook.xlsx");
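A hedged sketch tying the two steps together: download with CSOM's OpenBinaryDirect, then open the local copy through the Excel interop. The URL and paths are placeholders.
using System.IO;
using Microsoft.SharePoint.Client;
using Excel = Microsoft.Office.Interop.Excel;

// download the file from SharePoint to a local path ...
using (var context = new ClientContext("https://sharpoint.url.com"))
{
    context.Credentials = System.Net.CredentialCache.DefaultCredentials;
    using (var fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(context, "/folder/file.xlsx"))
    using (var local = System.IO.File.Create(@"C:\Test\YourWorkbook.xlsx"))
    {
        fileInfo.Stream.CopyTo(local);
    }
}

// ... then open the local copy in Excel
var excel = new Excel.Application { Visible = true };
excel.Workbooks.Open(@"C:\Test\YourWorkbook.xlsx");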
Reference: https://msdn.microsoft.com/en-us/library/b3k79a5x.aspx
I am building a web application to check whether files and folders exist in an SVN repository; I need to access the repository structure and search for files in subdirectories. Access is over the network.
NB: there is no local checkout of the SVN repository; it's a web application.
I am using SharpSvn. I found a way to access a local repository, but not one over the network. Any solutions?
SharpSvn has various clients. You seem to be using SvnWorkingCopyClient, which requires a local working copy.
You're looking for just SvnClient:
var client = new SvnClient();
SvnInfoEventArgs info;
client.GetInfo(targetUri, out info);
Now you can read the repository details from info.
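Since the goal is to check files and folders in subdirectories, here's a hedged sketch using SvnClient.GetList to enumerate a remote repository recursively; the repository URL is a placeholder.
using System;
using System.Collections.ObjectModel;
using SharpSvn;

using (var client = new SvnClient())
{
    var target = new SvnUriTarget(new Uri("http://svn.example.com/repo/trunk"));
    var args = new SvnListArgs { Depth = SvnDepth.Infinity }; // recurse into subdirectories

    Collection<SvnListEventArgs> entries;
    if (client.GetList(target, args, out entries))
    {
        foreach (var entry in entries)
        {
            // entry.Path is relative to the target URL (empty for the target itself)
            Console.WriteLine("{0} ({1})", entry.Path, entry.Entry.NodeKind);
        }
    }
}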
I am trying to access Amazon S3 file properties such as "Exists", "Length", "LastWriteTime", etc...with AWS .NET SDK in VS2010.
I am getting a list of files within a directory using:
S3DirectoryInfo directory = new S3DirectoryInfo(client, bucket, key);
S3FileInfo[] fileList = directory.GetFiles();
fileList is now an array of S3FileInfo objects. However, when I inspect one of the objects, all of the file properties have an exception listed instead of a value. The exception is an AmazonS3Exception, Forbidden (403), from the server. I can do the same thing with a directory listing and get the properties for directories, but file properties are always forbidden. Just in case the issue was with the directory.GetFiles() method, I also tried creating a single S3FileInfo using:
S3FileInfo fileInfo = new S3FileInfo(client, bucket, key);
The results are the same. I know the client, bucket, and key are fine, as I use them successfully for all sorts of other operations; it's just S3FileInfo that is having issues.
I am running an MVC 2 web application in Visual Studio 2010, with version 2.3.18.0 of the AWS .NET SDK, on Windows 7 Professional.
I figured out the problem. I went into the AWS IAM Console, under Users, selected the user I wanted, and under Permissions chose "Attach User Policy". I used the Administrator Access policy template, and that fixed my problem. Thanks Vor for sending me down the path of looking at my policies and roles. The strange thing is that I could use the .NET SDK to add AWS users, create S3 buckets and keys, upload/download files, set encryption, etc. The only thing I couldn't do was access file properties.
Very strange; my guess is that it is a bug, or at least not intended. It seems odd to disallow viewing file properties without administrator access while allowing virtually everything else.
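As far as I can tell, S3FileInfo properties such as Exists and Length issue HEAD Object requests under the hood, so the credentials need s3:GetObject (and s3:ListBucket) rights rather than full administrator access. A hedged sketch that reproduces the same call directly with the client, bucket, and key from the question:
using System;
using Amazon.S3.Model;

// Issues the same HEAD Object request that S3FileInfo uses internally; if this
// throws an AmazonS3Exception (403 Forbidden), the policy lacks s3:GetObject.
var metadata = client.GetObjectMetadata(new GetObjectMetadataRequest
{
    BucketName = bucket,
    Key = key
});
Console.WriteLine(metadata.ContentLength);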
I am currently building a local static site generator in C#. It compiles a bunch of templates together into a hierarchy of plain old HTML files. I want to upload the resulting files to my Windows Azure Website and have the changes reflected live, and I want to be able to do this programmatically via my script.
As it stands, I'm having to upload the generated files manually using WebMatrix, as I haven't been able to find an API or SDK that lets me directly upload HTML to a Windows Azure Website.
Surely there must be a way to do this from code, other than just using an SFTP library (which wouldn't use the WebMatrix/IIS protocol, which I think sends zipped diffs, so it would be slow and would mean out-of-sync data during the upload while some files have been updated and others haven't). I'd also rather not commit my generated site to source control if I can avoid it. It seems conceptually wrong to put something into source control merely as an implementation detail of deployment.
Update: WebMatrix internally uses Web Deploy (MSDeploy). Theoretically you should be able to build the deployment package yourself using the API, but 99% of the examples I can find are using the command-line tool or the GUI tools in Visual Studio. I need to build the package and deploy it programmatically from within C#. Any ideas or guidance on how to go about this? The docs on MSDN don't really show any examples for this kind of scenario.
OK, so I worked out what to do with help from a couple of friendly folks at Microsoft. (See David Ebbo's response to my forum question, and this very helpful info from Sayed Hashimi showing how to do exactly what I wanted to do with the msdeploy.exe console app.)
Just grab your PublishSettings file from the Azure web portal and open it in a text editor to get the values to paste into the code below.
// requires a reference to Microsoft.Web.Deployment (installed with Web Deploy 3.0)
var destinationOptions = new DeploymentBaseOptions()
{
// userName from Azure Websites PublishSettings file
UserName = "$msdeploytest",
// pw from PublishSettings file
Password = "ThisIsNotMyPassword",
// publishUrl from PublishSettings file using https: protocol prefix rather than 443 port
// and adding "/msdeploy.axd?site={msdeploySite-variable-from-PublishSettings}"
ComputerName = "https://waws-prod-blu-003.publish.azurewebsites.windows.net/msdeploy.axd?site=msdeploytest",
AuthenticationType = "Basic"
};
// This option says we're giving it a directory to deploy
using (var deploymentObject = DeploymentManager.CreateObject(DeploymentWellKnownProvider.ContentPath,
// path to root directory of source files
#"C:\Users\ryan_000\Downloads\dummysite"))
{
var syncOptions = new DeploymentSyncOptions();
syncOptions.WhatIf = false;
// "msdeploySite" variable from PublishSettings file
var changes = deploymentObject.SyncTo(DeploymentWellKnownProvider.ContentPath, "msdeploytest", destinationOptions, syncOptions);
Console.WriteLine("BytesCopied: " + changes.BytesCopied.ToString());
Console.WriteLine("Added: " + changes.ObjectsAdded.ToString());
Console.WriteLine("Updated: " + changes.ObjectsUpdated.ToString());
Console.WriteLine("Deleted: " + changes.ObjectsDeleted.ToString());
Console.WriteLine("Errors: " + changes.Errors.ToString());
Console.WriteLine("Warnings: " + changes.Warnings.ToString());
Console.WriteLine("ParametersChanged: " + changes.ParameterChanges.ToString());
Console.WriteLine("TotalChanges: " + changes.TotalChanges.ToString());
}
You might also be able to stumble your way through the obscure documentation on MSDN. There is a lot of passing around of oddly-named options classes, but with a bit of squinting and flailing about in the docs it's possible to see how the command-line options (of which it is much easier to find examples online) map to API calls.
The easiest way is probably to set up Git publishing for your website and programmatically do a git commit followed by a git push. You can think of it as a deployment mechanism rather than source control, given that Azure Websites natively supports a backing Git repository that doesn't have to have anything to do with your chosen SCM solution.
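A hedged sketch of that approach using the LibGit2Sharp NuGet package; the local path, the remote name "azure", and the deployment credentials are all placeholders.
using System;
using LibGit2Sharp;

using (var repo = new Repository(@"C:\path\to\generated\site"))
{
    // stage everything the generator produced
    Commands.Stage(repo, "*");

    var author = new Signature("build-bot", "bot@example.com", DateTimeOffset.Now);
    repo.Commit("Automated site deployment", author, author);

    // push to the Azure-provided remote with the site's deployment credentials
    var options = new PushOptions
    {
        CredentialsProvider = (url, user, types) =>
            new UsernamePasswordCredentials { Username = "$sitename", Password = "deployment-password" }
    };
    repo.Network.Push(repo.Network.Remotes["azure"], "refs/heads/master", options);
}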
WebMatrix uses WebDeploy to upload the files to Windows Azure Web Sites.
An alternative is to use the VFS REST API (https://github.com/projectkudu/kudu/wiki/REST-API#wiki-vfs). The diagnostic console uses this to work with the file system today.
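For example, a single file can be pushed with a plain HTTP PUT. A minimal sketch, assuming the site's deployment credentials and the standard {site}.scm.azurewebsites.net Kudu endpoint; all values here are placeholders.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

using (var http = new HttpClient())
{
    // basic auth with the deployment credentials from the PublishSettings file
    var creds = Convert.ToBase64String(Encoding.ASCII.GetBytes("$sitename:deployment-password"));
    http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", creds);
    // If-Match: * lets the VFS API overwrite an existing file regardless of its ETag
    http.DefaultRequestHeaders.IfMatch.Add(EntityTagHeaderValue.Any);

    var url = "https://sitename.scm.azurewebsites.net/api/vfs/site/wwwroot/index.html";
    var content = new ByteArrayContent(System.IO.File.ReadAllBytes("index.html"));
    var response = http.PutAsync(url, content).Result;
    Console.WriteLine(response.StatusCode);
}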
I'm trying to debug a webpart installed on a client's SharePoint instance. I wanted a quick and easy logging feature, so I thought of writing messages to a text file in the temp directory. SharePoint doesn't seem to like it, so what are my options?
If you are writing to the temp directory, you will need to grant rights on the file (if it exists) or on the directory to the identity of the IIS application pool that the SharePoint IIS application runs under.
There are a few ways of doing custom logging in SharePoint -
Use SPDiagnosticsService - you can write to the ULS log via the SPDiagnosticsService class, as in the sketch below.
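A minimal sketch, assuming server-side code with a reference to Microsoft.SharePoint; the category name is a placeholder.
using Microsoft.SharePoint.Administration;

// writes a trace entry to the ULS log under an ad-hoc category
SPDiagnosticsService.Local.WriteTrace(
    0,
    new SPDiagnosticsCategory("MyWebPart", TraceSeverity.Medium, EventSeverity.Information),
    TraceSeverity.Medium,
    "Something happened: {0}",
    "details");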
Utilize the diagnostics.asmx web service -
// SharePointDiagnostics is the proxy class generated for the diagnostics.asmx web service
SharePointDiagnostics SharePointDiagnosticsObject = new SharePointDiagnostics();
SharePointDiagnosticsObject.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
string Response = SharePointDiagnosticsObject.SendClientScriptErrorReport(message, file, line, client, stack, team, originalFile);
For more details on the usage of diagnostics.asmx, refer to the following link -
https://vivekkumar11432.wordpress.com/2016/09/23/how-to-do-logging-in-uls-from-csom-in-c/
For more details on logging, refer to the following link -
http://www.codeproject.com/Articles/620996/Five-suggestions-to-implement-a-better-logging-in
Don't use
Microsoft.Office.Server.Diagnostics.PortalLog.LogString("Message");
According to the Microsoft documentation, LogString is reserved for internal use and is not intended to be used directly from your code.
I would guess that this is a permissions issue that SharePoint is blocking you on (and probably not telling you about). When you try to write to a text file on the server, you need elevated permissions to do it. You can accomplish this using SPSecurity.RunWithElevatedPrivileges. Something like the following, if you want just a simple, small-code solution:
SPSecurity.RunWithElevatedPrivileges(delegate() {
    // the delegate runs as the application pool account, which still needs
    // NTFS write access to the target path
    using (StreamWriter sw = new StreamWriter(@"C:\log.txt"))
    {
        // log information here
    }
});
Try a logging framework like log4net, or write a small logging framework that writes to an external database. You could also log to a SharePoint list if you want to stay inside SharePoint; a minimal sketch follows.
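A hedged sketch of the list option, assuming a custom list named "Log" already exists in the current web; the list and field names are placeholders.
using Microsoft.SharePoint;

SPSecurity.RunWithElevatedPrivileges(delegate()
{
    // re-open the site under the elevated identity
    using (SPSite site = new SPSite(SPContext.Current.Site.ID))
    using (SPWeb web = site.OpenWeb())
    {
        web.AllowUnsafeUpdates = true; // required for writes triggered by a GET request
        SPListItem item = web.Lists["Log"].Items.Add();
        item["Title"] = "Message from web part";
        item.Update();
    }
});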