Common ASPX pages used by multiple applications - C#

I have a number of applications which have similar pages, e.g. a Payment page, a Search page, etc.
I'm looking for the best way to design the applications so they can all use the common pages rather than having three versions of the same file.
It was suggested creating an app containing only the common pages, which the other apps would transfer to when a user accesses one of those pages and back out of when finished. This just seems wrong, or am I incorrect?
Any advice appreciated.
ASP.NET, C#, Framework 4.0

You'd need to override the VirtualPathProvider, which doesn't seem to be a trivial task.
There is also a similar question - Share aspx page between projects (which suggests reusing at the user control level).
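For what it's worth, here is a rough sketch of what a custom VirtualPathProvider could look like. It assumes the shared pages are requested under a ~/Shared/ path and loaded from a central folder; the SharedPageProvider and SharedPageFile names and the \\server\CommonPages path are illustrative only, not a drop-in solution.

using System;
using System.Collections;
using System.IO;
using System.Web;
using System.Web.Caching;
using System.Web.Hosting;

// Serves ~/Shared/*.aspx from a central location instead of the local project folder.
public class SharedPageProvider : VirtualPathProvider
{
    private static bool IsSharedPath(string virtualPath)
    {
        var appRelative = VirtualPathUtility.ToAppRelative(virtualPath);
        return appRelative.StartsWith("~/Shared/", StringComparison.OrdinalIgnoreCase);
    }

    public override bool FileExists(string virtualPath)
    {
        return IsSharedPath(virtualPath) || Previous.FileExists(virtualPath);
    }

    public override VirtualFile GetFile(string virtualPath)
    {
        return IsSharedPath(virtualPath)
            ? new SharedPageFile(virtualPath)
            : Previous.GetFile(virtualPath);
    }

    public override CacheDependency GetCacheDependency(
        string virtualPath, IEnumerable virtualPathDependencies, DateTime utcStart)
    {
        // Shared pages are not files in this application, so no file dependency.
        return IsSharedPath(virtualPath)
            ? null
            : Previous.GetCacheDependency(virtualPath, virtualPathDependencies, utcStart);
    }
}

public class SharedPageFile : VirtualFile
{
    public SharedPageFile(string virtualPath) : base(virtualPath) { }

    public override Stream Open()
    {
        // Load the .aspx markup from the central location (a network share here;
        // it could equally be an embedded resource or a database).
        var fileName = VirtualPathUtility.GetFileName(VirtualPath);
        return File.OpenRead(Path.Combine(@"\\server\CommonPages", fileName));
    }
}

// Registered once in Global.asax Application_Start:
//   HostingEnvironment.RegisterVirtualPathProvider(new SharedPageProvider());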

Custom Server Control in ASP.NET and C#

I've got an .aspx and .aspx.cs file with some components. Now I want to reuse parts of that page in another page. My approach would be to pull the duplicated part out into a WebServerControl.
So before I waste more time yahoogling: is that even the right idea, and if so, is there a way to reuse parts of the .aspx file rather than doing it tediously in RenderContents with the HtmlTextWriter, WriteBeginTag, WriteAttribute and so on? That looks like a mess for a complicated layout and a sizeable number of controls.
What's the standard?
Depends.
The main driving factor is that if you need to reuse your control in multiple web applications, you should go with a Custom Control (.cs in C#).
Otherwise, if you only intend to reuse your control in one web application, choose a User Control (.ascx).
This MSDN article is a good starting point.
UPDATE (since OP asked further details):
To embed JavaScript for a custom control, a common approach is
var initializeScript = string.Format("MyNamespace.initialize('{0}', {1});", ClientID, myScriptString);
Attributes.Add("onmouseover", initializeScript);
I'd suggest writing the JavaScript code in a .js file and not in the .cs, since the latter is a nightmare to maintain and debug. Hope this helps.
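As a rough sketch of that suggestion (the control, namespace and script path below are made up, not from the MSDN article), the same wiring with the JavaScript kept in a .js file might look like this in a custom control:

using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public class HoverPanel : WebControl
{
    protected override void OnPreRender(EventArgs e)
    {
        base.OnPreRender(e);

        // Reference the external .js file once per page instead of
        // embedding JavaScript strings in the .cs code.
        Page.ClientScript.RegisterClientScriptInclude(
            typeof(HoverPanel), "HoverPanelScript", ResolveClientUrl("~/Scripts/HoverPanel.js"));

        // Wire the client-side handler to this control instance.
        var initializeScript = string.Format("MyNamespace.initialize('{0}');", ClientID);
        Attributes.Add("onmouseover", initializeScript);
    }
}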
It sounds like what you want to do is bundle the items into a User Control. This will allow you to design the control by using existing .NET controls rather than rendering everything out from scratch.
All you need to do is create an ASP.NET Web User Control.
Taken from MSDN:
An ASP.NET Web user control is similar to a complete ASP.NET Web page (.aspx file), with both a user interface page and code. You create the user control in much the same way you create an ASP.NET page and then add the markup and child controls that you need. A user control can include code to manipulate its contents like a page can, including performing tasks such as data binding.
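For illustration only (all file, class and control names are made up), a minimal user control and the page that consumes it could look like this:

<%-- SearchBox.ascx : markup plus child controls, as described above --%>
<%@ Control Language="C#" AutoEventWireup="true"
    CodeBehind="SearchBox.ascx.cs" Inherits="MyApp.Controls.SearchBox" %>
<asp:TextBox ID="txtQuery" runat="server" />
<asp:Button ID="btnSearch" runat="server" Text="Search" OnClick="btnSearch_Click" />

<%-- Any page can then register the control and use it like a built-in control --%>
<%@ Register Src="~/Controls/SearchBox.ascx" TagPrefix="uc" TagName="SearchBox" %>
<uc:SearchBox ID="SearchBox1" runat="server" />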

Share webpages between multiple websites (Best Way)

I have created two websites using C# in Visual Studio 2008. Both websites have common maintenance (User and Role) pages. I have also created a common class library (C#) and added a reference to it in both websites, which is working properly.
I don't want to replicate/maintain multiple copies of the same .aspx pages.
Is there any better way, other than user controls, to have a single copy of the common .aspx pages and reference it from both websites, just like the class library?
Thanks.
In your response to Mr.Lister it sounds like you aren't using a master page, and you really should.
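As a sketch of that suggestion (file and class names here are illustrative), a master page holds the shared layout once and each content page supplies only its own body:

<%-- Site.Master --%>
<%@ Master Language="C#" AutoEventWireup="true" CodeBehind="Site.master.cs" Inherits="MyApp.SiteMaster" %>
<html>
<body>
    <form runat="server">
        <div class="header">Shared header, menu and footer live here once.</div>
        <asp:ContentPlaceHolder ID="MainContent" runat="server" />
    </form>
</body>
</html>

<%-- Users.aspx : a content page that only fills the placeholder --%>
<%@ Page Title="Users" Language="C#" MasterPageFile="~/Site.Master" AutoEventWireup="true" CodeBehind="Users.aspx.cs" Inherits="MyApp.Users" %>
<asp:Content ContentPlaceHolderID="MainContent" runat="server">
    <%-- page-specific markup goes here --%>
</asp:Content>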

Content Editor for Client

I've created a website for a client of mine. It is coded in ASP.NET with C# and hosted on GoDaddy. She requires the website to be updated daily, by her. However, this client has very little knowledge of how to edit HTML or text within a site, and I don't want to edit it every time she wants an update on the site.
What would be the best solution to my problem? I have looked up Content Management Systems, but I'm a little confused by what exactly they do in terms of coding and management of the existing site. Do they require me to reformat the whole site to follow the CMS's templates? Would it be better for me to design my own back-end panel for her to edit the content (this would obviously take significant work)?
If you want to stick with a site you're developing from scratch, I'd use the HtmlEditor from the AjaxControlToolkit or a similar control, and store the HTML content in the database.
Then, when outputting the HTML from the database to the client pages, I'd make sure to sanitize it with the Microsoft Anti-Cross Site Scripting (AntiXSS) Library's GetSafeHtmlFragment() function (since this is tagged asp.net). It's not that much work, actually, if you design the database correctly and you've got the skills.
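A minimal sketch of that flow, assuming a Content table with an Html column, a Literal control named litContent on the page, and the AntiXSS 4.x Sanitizer class (the exact class and method names vary between AntiXSS versions):

using System;
using System.Configuration;
using System.Data.SqlClient;
using Microsoft.Security.Application;

public partial class Home : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        string rawHtml;
        using (var conn = new SqlConnection(
            ConfigurationManager.ConnectionStrings["Site"].ConnectionString))
        using (var cmd = new SqlCommand(
            "SELECT Html FROM Content WHERE PageKey = @key", conn))
        {
            cmd.Parameters.AddWithValue("@key", "home");
            conn.Open();
            rawHtml = (string)cmd.ExecuteScalar();
        }

        // Sanitize before writing it out so the editor cannot inject script.
        litContent.Text = Sanitizer.GetSafeHtmlFragment(rawHtml);
    }
}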
CMS systems are (trying not to oversimplify) entire web sites that are already built and allow people to edit the content using built-in content editing functionality. They range in functionality and extensibility from a "You get what you get and there's very little you can change" to "You can customize the heck out of it and buy or build your own modules to extend functionality." There are a lot of good ones out there, some free, and some expensive.

Web Crawling Sites with JavaScript or Web Forms

I have a web crawler application. It successfully crawls most common and simple sites. Now I have run into some websites where the HTML documents are dynamically generated through forms or JavaScript. I believe they can be crawled, I just don't know how. These websites do not show the actual HTML page; I mean that if I browse the page in IE or Firefox, the HTML source does not match what's actually rendered in IE or Firefox. These sites contain textboxes, checkboxes, etc., so I believe they are what's called "Web Forms". Actually, I am not much familiar with web development, so correct me if I'm wrong.
My question is: has anyone been in a similar situation and successfully solved these types of "challenges"? Does anyone know of a book or article regarding web crawling, particularly ones that cover these more advanced types of websites?
Thanks.
There are two separate issues here.
Forms
As a rule of thumb, crawlers do not touch forms.
It might be appropriate to write something for a specific website, that submits predetermined (or semi-random) data (particularly when writing automated tests for your own web applications), but generic crawlers should leave them well alone.
The spec describing how to submit form data is available at http://www.w3.org/TR/html4/interact/forms.html#h-17.13; there may be a library for C# that will help.
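As an illustration of submitting form data from C# (the URL and field names are made up, and this is only appropriate once you have decided a specific form is safe to submit):

using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

class FormSubmitExample
{
    static void Main()
    {
        // Field names would normally be scraped from the <form> element
        // of the page being crawled; these are placeholders.
        var fields = new NameValueCollection
        {
            { "q", "search term" },
            { "category", "books" }
        };

        using (var client = new WebClient())
        {
            // UploadValues posts the fields as application/x-www-form-urlencoded,
            // the default encoding described in the HTML 4 forms spec.
            byte[] response = client.UploadValues("http://example.com/search", "POST", fields);
            Console.WriteLine(Encoding.UTF8.GetString(response));
        }
    }
}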
JavaScript
JavaScript is a rather complicated beast.
There are three common ways you can deal with it:
Write your crawler so it duplicates the JS functionality of specific websites that you care about.
Automate a web browser (see the sketch after this list)
Use something like Rhino with env.js
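As a sketch of the second option, driving a real browser with Selenium WebDriver (separate NuGet packages; the URL below is illustrative) gives the crawler the DOM after the page's JavaScript has run:

using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class JsCrawlerExample
{
    static void Main()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            driver.Navigate().GoToUrl("http://example.com/dynamic-page");

            // PageSource returns the rendered markup, not the raw HTML
            // that a plain HTTP request would have downloaded.
            string renderedHtml = driver.PageSource;
            Console.WriteLine(renderedHtml.Length);
        }
    }
}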
I found an article which tackles the deep web and it's very interesting; I think it answers my questions above.
http://www.trycatchfail.com/2008/11/10/creating-a-deep-web-crawler-with-net-background/
Gotta love this.
AbotX handles JavaScript out of the box. It's not free though.

Home/Landing screen design for a website in ASP.NET

I have a web-based application. The content for the Home page is currently written directly in the HTML markup of the Home page using standard tags. To change the content at any time in the future, the HTML itself has to be changed. :(
Is there a way we can pick up the content from some external place and have it reflected on the website? That way, any change required can be made at the external location without touching the application's code.
Please advise if there is any solution for it.
Thanks.
You can:
Use a database
Include external files using Server Side Includes
Read external files and write out their contents (see the sketch below)
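A minimal sketch of the "read external files" option, assuming a Literal control named litHomeContent on the Home page and an external fragment file (both names are illustrative):

using System;
using System.IO;

public partial class Home : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // The fragment can live outside the project markup (here in App_Data);
        // edits to it show up on the next request without redeploying the site.
        string contentPath = Server.MapPath("~/App_Data/home-content.html");
        litHomeContent.Text = File.ReadAllText(contentPath);
    }
}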
Sounds like you're looking for a Content Management System (CMS), which will allow your content editors access to modify only specific blocks of a page that you specify.
There are a ton out there to do what you want, so you don't have to start from scratch. Just Google 'CMS'.
Although I haven't used it myself, DotNetNuke is a popular one these days and has a free version.
