In SharePoint Online (365) I want to add solutions (.wsp) to the "Web Designer Galleries" (_catalogs/solutions/Forms/). I saw a post that said it is possible, but I haven't seen anything showing how. I would also like to activate these solutions as well, which that same post said is not possible. I am looking for any way to do this: JavaScript, REST, C#, PowerShell, whatever. I've been looking into this for a while and so far no dice. I'm really hoping I'm missing something here.
As the title states: I need to be able to do this programmatically. I need to be able to upload multiple solutions to the gallery at one time, or at least automate the process.
I had a similar question answered on sharepoint.stackexchange.com, see https://sharepoint.stackexchange.com/questions/90809/is-it-possible-to-activate-a-solution-using-client-code-in-sharepoint-online-201
You can use CSOM to upload a .wsp file to the gallery list. There are a few ways of doing this, depending on where your .wsp files are loaded from as well.
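A minimal CSOM sketch of that upload, assuming the SharePoint Client SDK (Microsoft.SharePoint.Client.dll) and SharePoint Online credentials; the site URL, credentials, and file path are placeholders, and the catalog lookup is one of several ways to get at the gallery:

```csharp
// Sketch: upload a .wsp to the solution gallery via CSOM.
// Assumes Microsoft.SharePoint.Client.dll is referenced.
using System;
using System.IO;
using System.Security;
using Microsoft.SharePoint.Client;

class SolutionUploader
{
    // Pure helper: build the server-relative target URL inside the gallery.
    public static string BuildGalleryUrl(string serverRelativeWeb, string fileName)
    {
        return serverRelativeWeb.TrimEnd('/') + "/_catalogs/solutions/" + fileName;
    }

    public static void Upload(string siteUrl, string user, SecureString password, string wspPath)
    {
        using (var ctx = new ClientContext(siteUrl))
        {
            ctx.Credentials = new SharePointOnlineCredentials(user, password);

            var newFile = new FileCreationInformation
            {
                Content = System.IO.File.ReadAllBytes(wspPath),
                Url = Path.GetFileName(wspPath),
                Overwrite = true
            };

            // The solution gallery is just a catalog list (template 121).
            List gallery = ctx.Web.GetCatalog((int)ListTemplateType.SolutionCatalog);
            gallery.RootFolder.Files.Add(newFile);
            ctx.ExecuteQuery();
        }
    }
}
```

To upload multiple solutions, loop over a directory of .wsp files and call `Upload` for each; activation, as the linked answer says, is the part CSOM does not expose.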
Related
I'm trying to come up with a viable (and the simpler the better) solution for a multi-select file upload control. Normally this would be a breeze except for a few things...
The user needs to be able to literally select multiple files in the dialog, NOT one by one.
Can't use open source code. (But JavaScript/jQuery is OK.)
Can't use a third-party library that Microsoft doesn't support.
(Please don't bother with "Why can't you?" comments.)
I don't have a lot of experience making my own controls. (And I'd assume if there was a simple way to do this just by modifying the "Open" control, it would be an easily found tutorial.)
Thanks.
EDIT: To answer some questions...
I haven't tried much of anything outside of researching. Not really sure of where to start with all these limitations.
I can't use HTML5. In fact, I need IE7 compatibility. So no multiple attribute.
How about Telerik's multi-file upload? I believe they are an MS certified partner.
-J
If you want to make a customized multi-file uploader control yourself, you have to build a rich file explorer client-side in JavaScript and then upload the files using Ajax. I think all multi-file upload components use this method. If you can't use open source or third-party components, it seems you have to make it from scratch.
I'm trying to find out in a prerequisite checker tool (written in C#), if Internet Explorer has enabled JavaScript. I don't want to change it ... just read out the information. Is that available somewhere in the registry?
First you need to know what security zone the website(s) that need JavaScript would fall under.
When you know what zone you are looking for, you can find it under SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings\Zones\<zone>.
The values are not human-readable, though, so you'd need to look up some information regarding those (for scripting, the value named 1400 controls "Active scripting").
However, it all feels a bit sketchy doing it this way.
I hope that someone can give you a better answer than mine, or at least a simpler one.
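A sketch of that registry read for the Internet zone (zone 3), with the 1400 value interpretation pulled out into a pure helper; verify the mapping on your target OS, since the approach above is admittedly sketchy:

```csharp
// Sketch: read the "Active scripting" setting (value 1400) for an IE
// security zone from the registry. Zone 3 = Internet zone.
using Microsoft.Win32;

class ScriptSettingReader
{
    // Pure helper: interpret the DWORD stored under value 1400.
    // 0 = enabled, 1 = prompt, 3 = disabled.
    public static string Interpret(int value)
    {
        switch (value)
        {
            case 0: return "Enabled";
            case 1: return "Prompt";
            case 3: return "Disabled";
            default: return "Unknown";
        }
    }

    // Windows-only: reads the per-user setting for the Internet zone.
    public static string ReadInternetZone()
    {
        const string path =
            @"Software\Microsoft\Windows\CurrentVersion\Internet Settings\Zones\3";
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(path))
        {
            object raw = key == null ? null : key.GetValue("1400");
            return raw is int ? Interpret((int)raw) : "Unknown";
        }
    }
}
```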
I think modernizr is what you're looking for, it enables you to read the supported features from the HTML tag of your web page.
See:
http://www.modernizr.com/
I have written some wizards in C# and I would like to make them look more professional by adding some watermarks on the welcome and complete pages.
I am struggling to find any source where I can download some free-to-use, e.g. database-related, watermark images that I can use in a database script generator that I have written.
Any suggestions?
I am pretty useless at making them.
thanks
PS
Not sure where I should post this. I put it under C#, but again, not sure.
You can try to download watermark tools, these links might be helpful:
http://www.softpedia.com/get/Multimedia/Graphic/Graphic-Others/Fast-Watermark-Free.shtml
http://www.mydigitallife.info/2009/11/20/download-free-watermark-image-to-watermark-or-add-logo-to-photos-in-batch/
http://www.freedownloadscenter.com/Best/free-watermark.html
And you can search for more on Google. Don't forget to scan these tools for spyware, etc.
Or you can create your own watermark here: http://www.webwatermarks.com/
Please consider checking Google. When I looked up "Free Watermarks" there were several viable hits.
Also try looking into "Stock Photography" and some combination thereof with "Free Watermarks"
I'm currently creating a website a little bit like Digg.com. There are different categories like "Technology", "Sports", etc. I want to create RSS feeds for my website, and while doing research on this, I have a question that I can't find the answer to.
First, this is what I have:
-I have the .NET code in C# that creates a file with the last 15 news items from a query against my database.
What I need to know:
-Does the RSS feed (the XML file) need to be generated on each load of the page? (I saw that on some tutorial pages, but maybe it was only for educational purposes.) Personally, I'm thinking about regenerating the .xml file each time someone submits something new. Is this a good idea?
-Do I need to create a different file for each category, e.g. feedSports.xml, feedTechnology.xml, etc.? Or is there another way? (I saw something about channels.)
-What does feedburner do with all of this?
Thanks a lot for your help. I know these must be very newbie questions, so that's why I can't find anything answering this clearly on Google.
DarkJaf
Your feeds would be generated just as your HTML pages are generated, after each request. But instead of outputting HTML it would be outputting RSS.
I probably would not make a file for each feed, but it sure is possible. A better approach may be to pass a variable via GET or POST to the page generating the RSS and grab the data that pertains to the variable passed. You most likely can use the same logic you use to generate your HTML news lists if you isolate your code well.
I would also take a look at the article posted by Raj. It looks like C# has a nice namespace (System.ServiceModel.Syndication) that contains some objects that make the job pretty easy.
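A sketch of that per-category, per-request approach using System.ServiceModel.Syndication (built into the .NET Framework; a NuGet package on newer .NET); the site name, URLs, and item shape are placeholders:

```csharp
// Sketch: build an RSS 2.0 feed for one category on each request,
// instead of writing a static file per category.
using System;
using System.Collections.Generic;
using System.IO;
using System.ServiceModel.Syndication;
using System.Xml;

class FeedBuilder
{
    public static string BuildRss(string category,
        IEnumerable<(string Title, string Url)> items)
    {
        var feedItems = new List<SyndicationItem>();
        foreach (var (title, url) in items)
            feedItems.Add(new SyndicationItem(title, "", new Uri(url)));

        var feed = new SyndicationFeed(
            "My Site - " + category,
            "Latest " + category + " news",
            new Uri("https://example.com/feed?cat=" + category),
            feedItems);

        // Serialize to an RSS 2.0 string (one <channel> per category).
        var sb = new StringWriter();
        using (var writer = XmlWriter.Create(sb))
            feed.GetRss20Formatter().WriteTo(writer);
        return sb.ToString();
    }
}
```

Your feed page would read the `cat` parameter, query the last 15 items for that category, and return `BuildRss(cat, items)` with content type `application/rss+xml`.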
Have fun!
Nick
nickgs.com
I have a webcrawler application. It successfully crawled most common and simple sites. Now I've run into some types of websites wherein the HTML documents are dynamically generated through forms or JavaScript. I believe they can be crawled; I just don't know how. These websites do not show the actual HTML page. I mean, if I browse the page in IE or Firefox, the HTML source does not match what's actually rendered in IE or Firefox. These sites contain textboxes, checkboxes, etc., so I believe they are what they call "Web Forms". Actually, I am not much familiar with web development, so correct me if I'm wrong.
My question is, has anyone been in a similar situation and successfully solved these types of "challenges"? Does anyone know of a certain book or article regarding web crawling, particularly ones that pertain to these advanced types of websites?
Thanks.
There are two separate issues here.
Forms
As a rule of thumb, crawlers do not touch forms.
It might be appropriate to write something for a specific website, that submits predetermined (or semi-random) data (particularly when writing automated tests for your own web applications), but generic crawlers should leave them well alone.
The spec describing how to submit form data is available at http://www.w3.org/TR/html4/interact/forms.html#h-17.13, there may be a library for C# that will help.
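A minimal C# sketch of submitting predetermined form data the way a browser would (an `application/x-www-form-urlencoded` POST), using the built-in HttpClient; the target URL and field names are hypothetical:

```csharp
// Sketch: POST form fields to a form's action URL, as a browser would.
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class FormSubmitter
{
    // Pure helper: encode fields exactly as a browser form POST would.
    public static async Task<string> EncodeAsync(Dictionary<string, string> fields)
    {
        var content = new FormUrlEncodedContent(fields);
        return await content.ReadAsStringAsync();
    }

    // Submit the fields to the form's action URL and return the response body.
    public static async Task<string> SubmitAsync(string action,
        Dictionary<string, string> fields)
    {
        using (var client = new HttpClient())
        {
            HttpResponseMessage response =
                await client.PostAsync(action, new FormUrlEncodedContent(fields));
            return await response.Content.ReadAsStringAsync();
        }
    }
}
```

For a real crawler you would first parse the page's `<form>` element to discover the action URL, method, and field names, then fill in the predetermined values.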
JavaScript
JavaScript is a rather complicated beast.
There are three common ways you can deal with it:
Write your crawler so it duplicates the JS functionality of the specific websites that you care about.
Automate a web browser.
Use something like Rhino with env.js.
I found an article which tackles the deep web, and it's very interesting; I think it answers my questions above.
http://www.trycatchfail.com/2008/11/10/creating-a-deep-web-crawler-with-net-background/
Gotta love this.
AbotX handles JavaScript out of the box. It's not free, though.