Serve canned offline web content using the Web Browser control - C#

I'm developing a C# replacement for a legacy VB app for my company. The front end is basically a Web Browser control inside of a Windows form, serving offline content which is sometimes altered to include the user's data. Because there are 100 or more web files in the legacy app, we are going to reuse the web UI from the old application with a new C# wrapper around it, modifying them as needed.
My questions are about how to store and deliver the web content.
Does it make sense to copy the web files to a temporary folder and point the Web Browser control to the file:// address of the temporary folder?
Is there some kind of pre-built offline-friendly server framework that makes more sense than copying the files to a temporary folder?
I have the web source files in my project as resources, but I'm not sure if that is appropriate for my uses. Is it?
The legacy VB implementation alters the web files to inject data using Substring methods; it searches for magic strings and replaces them with the appropriate data. That code smells pretty bad, is there a better, more native data injection strategy I should look at?
Some background:
The data is presented using HTML/CSS/JS and sometimes XSL.
The browser delivers content that is available at compile time.
I'm going to have to handle some events using C# code when users click buttons on the page.
I'm free to choose whatever approach is necessary to implement the application.

Hosting
I would probably avoid using a temporary location for the web content; it just seems a little crude. If there is no internal linking between your HTML pages and all the CSS/JS is embedded in each file, it may be easier to just use the WebBrowser.DocumentText property.
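A minimal sketch of the DocumentText route, assuming a WebBrowser control named webBrowser1 on the form and a self-contained HTML page stored as an embedded resource ("MyApp.Web.Page1.html" is a hypothetical resource name; substitute your project's default namespace + folder + file name):

```csharp
using System.IO;
using System.Reflection;

// Inside the form: load a self-contained HTML page from an embedded
// resource and hand it straight to the WebBrowser control -- no
// temporary files on disk.
private void LoadPage()
{
    var assembly = Assembly.GetExecutingAssembly();
    using (var stream = assembly.GetManifestResourceStream("MyApp.Web.Page1.html"))
    using (var reader = new StreamReader(stream))
    {
        webBrowser1.DocumentText = reader.ReadToEnd();
    }
}
```

Note that a page loaded via DocumentText has no base URL, so relative links to other files won't resolve; that's why this only fits fully self-contained pages.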
Another option I have successfully used as a lightweight embedded web server is logv-http; it has a pretty easy-to-configure syntax. If you want to bind to anything other than localhost it does require administrator privileges, but it sounds like everything will be local.
var server = new Server("localhost", 13337);
server.Get("http://localhost:13337", (req, res) => res.Write("Hello World!"));
server.Start();
Templating
I think the string replaces aren't necessarily bad; it depends how many there are and how complicated they are trying to be, but simple find-and-replace shouldn't be too hard to manage. If there are lots of replaces, wrapping them into a Regex should help performance.
Storing the web content as embedded resources is probably how I would go; that way you can read them out at run-time, do your pre-processing, and then return them either via the web server method or directly into DocumentText.
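A minimal sketch of the single-pass regex approach (the {{name}} placeholder syntax is just an assumption for illustration; use whatever magic strings the legacy pages already contain):

```csharp
using System.Collections.Generic;
using System.Text.RegularExpressions;

static class Templater
{
    // Matches placeholders like {{userName}} and captures the key.
    static readonly Regex Placeholder = new Regex(@"\{\{(\w+)\}\}");

    // Replace every placeholder in one pass with a MatchEvaluator,
    // instead of chaining many individual String.Replace calls.
    public static string Render(string template, IDictionary<string, string> values)
    {
        return Placeholder.Replace(template, match =>
        {
            string key = match.Groups[1].Value;
            // Leave unknown placeholders untouched so a missed key
            // is easy to spot in the rendered page.
            return values.TryGetValue(key, out string value) ? value : match.Value;
        });
    }
}
```

For example, Templater.Render("Hello, {{name}}!", ...) with name mapped to "World" yields "Hello, World!".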

Related

.Net Core Razor Pages - Server Side Include

I'm unable to get server-side includes (*.html files) working in a .NET Core Razor Pages web application. I've made sure to have the appropriate handler in my applicationHost.config, but I'm thinking there's a different issue here. Any help is appreciated.
Why am I doing this? I have multiple web applications sharing the server side include files (for navigation bar, footer, head content, etc..). Each of these different applications may be of different Microsoft web architecture. Our goal is to move everything to .net core, but we have lingering web forms projects to deal with along the way.
I have performed a workaround by taking the SSI file contents and using @Html.Raw to serve up the content. This is probably wrong also.
I went ahead and changed the file extension of the .html files to .cshtml, which allowed me to treat these files as partial views. I'm using a pre-build event to copy these files from a shared solution folder into my project's Pages/Shared/ssi folder. I also copy those partials into wwwroot/ssi for the other applications to use via SSI. Eventually all of the apps will use the partial views instead.
The problem with this solution is that it is not necessarily clear that all edits need to happen in the shared solution folder instead of directly in the project, but the documentation for the project will address this. I tried using linked files, but only one link to a specific file can be made in a project.
Not a perfect solution (to the problem), but this is not a perfect website either.

Using the right search method in windows 8 app

I'm creating a Windows 8 app that can upload content from the local machine into the app (for local storage). I need to search through this content. What is the preferred strategy to use to incorporate this search functionality?
I have been trying to use the SampleData.json and SampleDataSource.cs from the grid template app as a starting point, but to me, it seems like the SampleData.json file will need to be updated each time new content is added to the app, seeing that data is populated from the SampleData.json file.
I have been going through the tutorial from MSDN:
Are there any other tutorials or advice anyone has for me? I need to incorporate this ASAP.
Windows can index files for you, and then you can use the StorageFolder.CreateFileQuery[WithOptions] APIs to search via properties. If you place content inside a folder called "Indexed" in local or roaming app data, then indexing happens automatically and queries execute very quickly. You can also store the content in "appcontent-ms" files if that works better.
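A hedged sketch of that file-query approach in C# (the folder must be named exactly "Indexed" for automatic indexing to kick in; the file extensions and search term here are assumptions for illustration):

```csharp
using System.Linq;
using System.Threading.Tasks;
using Windows.Storage;
using Windows.Storage.Search;

static class ContentSearch
{
    // Query app content placed in the auto-indexed "Indexed" folder
    // of local app data, ranked by search relevance.
    public static async Task<string[]> SearchIndexedAsync(string searchTerm)
    {
        StorageFolder indexed = await ApplicationData.Current.LocalFolder
            .CreateFolderAsync("Indexed", CreationCollisionOption.OpenIfExists);

        var options = new QueryOptions(CommonFileQuery.OrderBySearchRank,
                                       new[] { ".txt", ".html" })
        {
            UserSearchFilter = searchTerm  // lets the indexer rank results
        };

        StorageFileQueryResult query = indexed.CreateFileQueryWithOptions(options);
        var files = await query.GetFilesAsync();
        return files.Select(f => f.Name).ToArray();
    }
}
```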
There's also the Windows.Storage.Search.ContentIndexer API for non-file content or content that can't live in Indexed appdata folders. The ContentIndexer has its own query methods.
For all the details, see the section "Indexing and Searching Content" in Chapter 15 of my free ebook, Programming Windows Store Apps with HTML, CSS, and JavaScript, 2nd Edition. Even though it's using JS as the language, much of the book is just about WinRT so it's entirely useful even if you're working with C#. And it's free, so there's nothing to lose!

ASP.NET Browser Cache causing issues

I have looked for answers to this question, but I am not sure if I am asking it right.
I am looking for what do developers do in this situation:
I am developing an ASP.NET C# application. I have CSS and script files, and I am using jQuery. I install my application on the web servers (or I have my customer install it). If I have made any changes to my script files by adding some new jQuery or something, my customers don't get that effect after I do an update. I assume that the reason is that their browsers cache the files on the local computer and do not download the new files from the server.
In my development environment I clear the cache when I close the browser and on IE I tell it in options to always load from the server. That way when developing I never have cached data.
What is the best practice to make sure that if I do make changes, those files get refreshed on the client computers after I do an update? Is there something in Code I can do?
I really don't want to change the filename and update all my script references.
Thanks,
Cory
The traditional way is to append a query string argument to the reference to the CSS/script file path. For example, if you append a build number as the query string, each version of the software will make its own request for the relevant resource.
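A minimal sketch of that idea (using the assembly version as the cache-busting value is an assumption; any value that changes with each release works):

```csharp
using System;
using System.Reflection;

static class AssetUrl
{
    // Append the assembly version as a cache-busting query string, e.g.
    // "/scripts/site.js" becomes "/scripts/site.js?v=1.2.0.0". The browser
    // treats the new URL as a different resource and re-downloads it.
    public static string Versioned(string path)
    {
        Version v = Assembly.GetExecutingAssembly().GetName().Version;
        return path + "?v=" + v;
    }
}
```

In a WebForms page you could then emit the reference as &lt;script src="&lt;%= AssetUrl.Versioned("/scripts/site.js") %&gt;"&gt;&lt;/script&gt; so every build invalidates the old cached copy.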

URL Rewriting from Winforms or console application

I'm importing classic ASP pages into a new Sitefinity installation. Unfortunately, the existing site makes extensive use of URL rewriting via Helicon ISAPI Rewrite 3.
I'm generating the list of pages that need to be imported by crawling the navigation menus in the old site. These are, unfortunately, not dynamically generated from any sort of central repository, so the best way I've found to build the site hierarchy is to crawl the site.
When creating page nodes in the Sitefinity nav hierarchy to hold the content from the old pages, I need to be able to create the new pages at a location roughly equivalent to their location in the file system of the old site. However, the rewrite rules make this difficult to determine. For instance, I may get a link from parsing the old HTML like:
http://www.mysite.com/product_name
which is rewritten (not redirected) to
http://www.mysite.com/products/product_name/product_root.asp
I need a way to get the second url from the first. The first thing that comes to mind is to somehow use the .htaccess file to parse the URLs, get the result and use that for the rest of the import process.
Is there a way to do this from a Winforms app without having to involve a web server? I realize that I could modify one of the ASP includes, such as the page footer, to emit a comment containing the rewritten URL of each page, but I'd rather not make unnecessary changes to the existing code if it can be avoided.
Update
For example,
http://www.keil.com/arm/
rewrites to
http://www.keil.com/products/arm/mdk.asp

Home/Landing screen design for a website in asp.net

I have a web-based application. The content for the Home page is currently hard-coded in the page's HTML markup. To change the content anytime in the future, it needs to be changed in the HTML code. :(
Is there a way that we can pick up the content from some external place and get it reflected on the website? That way, any change, if required, can be made at the external location without touching the application's code.
Please advise if there is any solution for it.
Thanks.
You can:
Use a database
Include external files using Server Side Includes
Read external files and write out their contents as an alternative method
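A minimal sketch of the file-reading option (the helper name and the idea of keeping the fragment in an externally editable file are assumptions; in ASP.NET, App_Data is a conventional home for such non-served files):

```csharp
using System.IO;

static class ContentLoader
{
    // Read an externally editable HTML fragment at request time.
    // Edits to the file show up on the site without recompiling
    // or redeploying the application.
    public static string LoadFragment(string path)
    {
        return File.Exists(path) ? File.ReadAllText(path) : string.Empty;
    }
}
```

A page could then call ContentLoader.LoadFragment(Server.MapPath("~/App_Data/home-content.html")) and write the result into the Home page.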
Sounds like you're looking for a Content Management System (CMS), which will allow your content editors access to modify only specific blocks of a page that you specify.
There are a ton out there to do what you want, so you don't have to start from scratch. Just Google 'CMS'.
Although I haven't used it myself, DotNetNuke is a popular one these days and has a free version.
