I have a C# WPF application developed in VS 2015, and I want the browser to read some data from it, just a short string. I can save it in a text file or in a variable, but it needs to be visible to the browser (via JavaScript, I suppose). For instance, file:/// doesn't work when the original page is hosted online, as in my case (cross-origin restrictions). This should work in Opera and Firefox, but looking at their extensions it seems you can only develop with front-end technologies, which aren't enough in my case since I use WPF to query the Windows OS and then need to share the result with the browser.
I suspect it's possible, and no, it's not to write a malicious piece of code. For instance, I read the details of the graphics card for diagnostic purposes.
Please help, many thanks.
Browsers run in a security sandbox which is intended to stop them reading or writing files to the file system.
You could write to the user's appdata. There are various JavaScript frameworks which persist data there so they can provide offline or static data.
I don't think that is a good plan though.
I suggest your first candidate would be a cookie.
A quick Google on how to do that turns up:
How to create cookie in c#.net windows application?
From a web page you can use the content of a cookie dynamically, so you could change what the page shows after it's up and running, driven by some process in your WPF app, implement a counter, or whatever.
I've not used this with Windows apps and a browser, but I have with a web app and Silverlight. I'm afraid I don't have that code to hand though.
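One way to do it from a desktop app, though, is the WinINET InternetSetCookie API (which may be what the linked question describes). Here's a minimal sketch; the URL and cookie value are placeholders, and note this only writes to the WinINET cookie store used by IE and the WebBrowser control, so Opera and Firefox keep their own separate stores:

using System;
using System.Runtime.InteropServices;

class CookieWriter
{
    // WinINET API: writes a cookie into the WinINET cookie store (IE / WebBrowser control).
    [DllImport("wininet.dll", SetLastError = true, CharSet = CharSet.Auto)]
    static extern bool InternetSetCookie(string url, string cookieName, string cookieData);

    static void Main()
    {
        // Placeholder URL and value: use the origin your page is actually served from.
        string data = "gpuInfo=" + Uri.EscapeDataString("NVIDIA GeForce GTX 970");
        bool ok = InternetSetCookie("http://example.com", null, data);
        Console.WriteLine(ok ? "Cookie written" : "Failed: " + Marshal.GetLastWin32Error());
    }
}

The page could then read it from document.cookie, but only when loaded in a WinINET-based browser; for Firefox or Opera you would need a different channel.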
Is there any way to do a project using the PWA concept in an ASPX page? I was using HTML with PWA and it was working fine, but I moved to ASP.NET. It doesn't work anymore and the JSON file is not loaded.
You may find this SO post useful.
After testing, I was successful at implementing the functionality by adding the service worker and manifest to an ASP.NET MVC application. Since the view (HTML) gets rendered in the backend, it's only possible to cache a static version of your web application. So preferably you should use Angular etc. to generate your HTML.
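If the JSON file is what's failing to load under ASP.NET, one workaround is to serve the manifest (and the service worker) through controller actions, so IIS static-file and MIME settings can't get in the way. A rough sketch assuming an MVC project; the controller name, routes, and file paths are made up for illustration:

using System.Web.Mvc;

public class PwaController : Controller
{
    // Map this to e.g. /manifest.json via routing; the file path is a placeholder.
    public ActionResult Manifest()
    {
        return File(Server.MapPath("~/Content/manifest.json"), "application/manifest+json");
    }

    // Service workers must be served from the scope they control, e.g. /sw.js at the site root.
    public ActionResult ServiceWorker()
    {
        return File(Server.MapPath("~/Scripts/sw.js"), "application/javascript");
    }
}

Alternatively, adding a staticContent mimeMap entry for .json in web.config achieves the same thing for plain static files.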
A progressive web app works on IIS and Apache web servers.
A progressive web app is a general concept; it has nothing to do with your web server. Please give more details about your code architecture etc.
You can also use the sw-toolbox plugin for handling client-side caching.
Maybe your problem is client-side caching. However, the PWA concepts are as follows; none of them has anything to do with the web server type:
Progressive - Works for every user regardless of their browser
Responsive - The app works on any form factor whether it's desktop, mobile, or tablet.
Connectivity-independent - Allows the user to use the web app even if it's offline.
Native Look-and-feel - Acts and feels like a native application, but is strictly web-based.
Safe - Always served up to the client through HTTPS.
Discoverable - Even though it's an "application," it can be indexed into a search engine.
Re-engageable - Allows re-engagement through features like push notifications.
Zero-Deploy hassle - Allows users to add the web app onto their home screen without the issues with app stores.
Link-friendly - Allows you to reshare using a Url.
Yes, finally I was able to accomplish this. PWA now works not only on ASP.NET Web Forms but on any framework.
https://github.com/cpbenipal/PWA_Aspx
I'm developing software in C# which has to get info from a website the user opens in Chrome; the user inputs some data and the website returns a list of different items.
What I want is a way to access the source code of the page in order to get the info. I can't open the page myself, as it doesn't show anything until the data has been entered, so I need to get it directly from Chrome.
How can I achieve this? A Chrome extension? Or can I access Chrome directly from my software?
Off the top of my head, I don't know any application that gets data directly from an open instance of Chrome. You'd have to write your own Chrome extension.
Alternatively, you can open the web browser from your application initially.
You can look into these libraries for doing so:
WatiN (my personal favourite)
Selenium
Awesomium (you'd have to roll your own UI; it's invisible)
CEF (Chromium Embedded Framework)
Essential Objects Web Browser
EDIT: I didn't think about using QA tools as the actual browser hook, as @TheAnathema mentions. That would probably work for your needs.
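For example, if launching the browser from your own application is acceptable, a minimal Selenium sketch in C# might look like this (the URL and element IDs are placeholders; it needs the Selenium.WebDriver NuGet package and a matching chromedriver on the PATH):

using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class PageReader
{
    static void Main()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            driver.Navigate().GoToUrl("https://example.com/search");      // placeholder URL

            driver.FindElement(By.Id("query")).SendKeys("some input");    // placeholder element ID
            driver.FindElement(By.Id("submit")).Click();                  // placeholder element ID

            // Once the result list is rendered, the full HTML is available to your C# code.
            string html = driver.PageSource;
            Console.WriteLine(html.Length);
        }
    }
}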
You're going to need to create it as a Chrome extension if you must depend on the user actually going to a specific web site (i.e. you can't make the requests yourself with either Selenium or standard web requests).
The reason a Chrome extension would be required: think of how bad it would be if any software could easily read the pages you browse. Banking, medical, email, etc. could all be accessed silently from any process if Google allowed outside processes to tap into the web page.
Even Chrome extensions have to ask for permission to be able to do what they want, but at least it is software the user knowingly installed and agreed to the permissions.
A quick search yielded this example of modifying a page's HTML with a Chrome extension: https://blog.lateral.io/2016/04/create-chrome-extension-modify-websites-html-css/
It sounds like you want to do web scraping. Here's a good tutorial to get you started: HTML Scraping.
And this answer has a good example of how to scrape data from a website where you need to submit a form to get access to the data.
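If you can fetch the page yourself rather than reading it out of the user's Chrome session, a rough C# scraping sketch with HtmlAgilityPack (the URL and XPath are placeholders) would be:

using System;
using HtmlAgilityPack;   // NuGet: HtmlAgilityPack

class ScrapeSketch
{
    static void Main()
    {
        var web = new HtmlWeb();
        HtmlDocument doc = web.Load("https://example.com/results");   // placeholder URL

        // Placeholder XPath: adjust it to the markup of the list you need.
        var rows = doc.DocumentNode.SelectNodes("//table[@id='results']//tr");
        if (rows != null)
        {
            foreach (var row in rows)
                Console.WriteLine(row.InnerText.Trim());
        }
    }
}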
I want to interact via code with a Silverlight (version 4) website.
I need to scrape data from the Silverlight object as well as click on buttons.
What would be the simplest way to do this from C#?
What would be the simplest way to do this from C++?
There is no such thing as a "Silverlight Website". Silverlight is a client-side technology.
Perhaps you could use something like Fiddler to examine the client-to-server conversation as the Silverlight app is used. You might then be able to emulate it in a C++ or C# application.
Otherwise you will need some scriptable UI testing tool perhaps.
I doubt that you can scrape any data from the Silverlight control directly. If you "view source" on the page, that's all you will be able to get by scraping the page the control runs in.
UPDATE:
Anthony makes a good point that you might be able to observe the client/server communication. Fiddler is a good tool to see what's happening in that communication. If you find that the data you need is accessible in that communication, you could modify an http proxy to intercept the traffic and pull out the data you are interested in. You would tell your web browser to go to your http proxy, and the http proxy would then connect to the internet (or your existing proxy if you use one).
There are numerous http proxies available with source code. Here's a very simple one: http://code.cheesydesign.com/?p=393
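If the Fiddler capture shows the Silverlight app calling an ordinary HTTP endpoint, it's often simpler to replay that call directly than to build a proxy. A rough sketch; the endpoint and form fields are placeholders for whatever the capture shows:

using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

class ReplayCapturedCall
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Copy the real URL, headers and body from the Fiddler capture.
            var form = new NameValueCollection
            {
                { "reportId", "123" }   // placeholder field
            };

            byte[] response = client.UploadValues("https://example.com/service/getData", form);
            Console.WriteLine(Encoding.UTF8.GetString(response));
        }
    }
}

If the traffic turns out to be a binary WCF/RIA protocol rather than plain HTTP, replaying it by hand is much harder, and the injection approach described below becomes more realistic.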
From what I gather from your very brief description of the problem, I'm going to jump to the conclusion that you want to do basically what Silverlight Spy does. Check out this blog post describing someone trying to emulate a little of what Spy does:
http://blog.aschommer.de/page/Injecting-code-into-Silverlight-applications.aspx
He's using Fiddler to modify the binaries in the XAP as they are downloaded, but before they're loaded by the SL plug-in. Pretty complicated.
Alternatively, I wonder if something could be done with a hosted browser in a C++/C# application, dynamic injection of JavaScript into the hosted page, and the JavaScript API that Silverlight exposes.
I wish to build a Windows application that will generally run in the background but have a configurable front-end Windows Forms GUI. I would also like this program to publish a small web page which can be accessed from other machines/devices and can interact with or call functions of the server application.
I'd rather not deploy a full-fledged ASP.NET web site with IIS, etc. I just need something simple.
So how would I go about doing this?
Take a look at Kayak. It's a relatively small and lightweight HTTP server that you can embed into your application and should provide all the functionality you're looking for.
FWIW, I am in no way associated with this project.
Maybe it's just because I've been doing ASP development for years, but I really think you should go the IIS/ASP.NET route, as it's very simple and built into Windows. I can't imagine a more straightforward way of serving a web page that has C# behind it to programmatically affect the host system.
Thanks to Kev in the comments on my question, who pointed me to this question, in which I found a link to a lightweight C# HTTP server component I could just drop into my application: http://webserver.codeplex.com/
Works well for little stuff like I was doing.
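If you want to avoid an external component entirely, the framework's built-in HttpListener can serve a small page from inside the application. A minimal sketch; the port and response are placeholders, and the "+" prefix needs a urlacl reservation or admin rights:

using System;
using System.Net;
using System.Text;

class EmbeddedServer
{
    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://+:8080/");   // placeholder port
        listener.Start();
        Console.WriteLine("Listening on http://localhost:8080/ ...");

        while (true)
        {
            HttpListenerContext ctx = listener.GetContext();   // blocks until a request arrives

            // Call back into the host application here, then return a small page.
            byte[] body = Encoding.UTF8.GetBytes("<html><body>Status: OK</body></html>");
            ctx.Response.ContentType = "text/html";
            ctx.Response.ContentLength64 = body.Length;
            ctx.Response.OutputStream.Write(body, 0, body.Length);
            ctx.Response.Close();
        }
    }
}

In the real application you would run this loop on a background thread so the WinForms GUI stays responsive.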
I'm trying to automate the download of a file from a website. Normally, to download the file, I log in with a username and password, navigate to a particular screen, then click a button.
I've been trying to watch the sequence of POSTs using Chrome's developer tools and then replicate all the steps using the .NET WebClient class, but without success. I've derived from the WebClient class and added cookie handling, which seems to be working. I go to the login page and post using WebClient.UploadValues; about half the time it seems to work. The next step appears to make another POST to a reporting URL. Once again I use WebClient.UploadValues, but the response from the server is a page showing an internal error.
I have a couple of questions.
1) Are there better tools than hand coding C# code to replicate a bunch of web browser interactions? I really only care about being able to download the file at a particular time each day onto a Windows box.
2) WebClient does not seem to be the best class to use for this. Perhaps it's a bit too simplistic. I tried using HttpWebRequest, but it has no facilities for encoding POST requests. Any other recommendations?
3) Although Chrome's developer tools appear to show all interaction, I find them a bit cumbersome to use. I'd be interested in seeing all of the raw communication (unencrypted, even though the site is only accessed via HTTPS), so I can see if I'm really replicating all of the steps.
I can even post the exact code I'm using. The site I'm pulling data from is, specifically, the Standard and Poor's website. They let you create custom reports for downloading historical data, which I need for reporting, not republishing.
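For reference, the cookie-handling WebClient described above is usually something like this sketch (the class name is mine; the key point is sharing one CookieContainer across all requests):

using System;
using System.Net;

// A WebClient that keeps cookies between requests, so the session
// established by the login POST is reused for the later report POST.
class CookieAwareWebClient : WebClient
{
    private readonly CookieContainer cookies = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        HttpWebRequest httpRequest = request as HttpWebRequest;
        if (httpRequest != null)
        {
            httpRequest.CookieContainer = cookies;
        }
        return request;
    }
}

One thing worth checking when the second POST fails intermittently: many sites also require hidden form fields from the previous page (__VIEWSTATE, anti-forgery tokens and the like) to be echoed back, which the browser does automatically but WebClient.UploadValues does not.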
Using IE to download the file would be much easier than writing C#/Perl/Java code to replicate the HTTP requests.
The reason is that even a slight change in the JavaScript code can break the flow.
With IE, you can automate it using COM. The following VBA example opens IE and performs a Google search:
Sub Search_Google()
    Dim IE As Object
    Set IE = CreateObject("InternetExplorer.Application")
    IE.Navigate "http://www.google.com"   'load web page google.com

    While IE.Busy
        DoEvents                          'wait until IE is done loading the page
    Wend

    IE.Document.all("q").Value = "what you want to put in text box"
    IE.Document.all("btnG").Click         'clicks the button named "btnG", Google's "Google Search" button

    While IE.Busy
        DoEvents                          'wait until IE is done loading the page
    Wend
End Sub
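The same approach works from C# via COM interop. A rough sketch; it assumes COM references to "Microsoft Internet Controls" (SHDocVw) and the "Microsoft HTML Object Library" (mshtml), and the element names q and btnG are simply the same old Google form fields used in the VBA above, so treat them as placeholders:

using System;
using System.Threading;
using SHDocVw;   // COM reference: Microsoft Internet Controls
using mshtml;    // COM reference: Microsoft HTML Object Library

class IeAutomation
{
    [STAThread]
    static void Main()
    {
        var ie = new InternetExplorer();
        ie.Visible = true;
        ie.Navigate("http://www.google.com");

        // Wait until IE has finished loading the page.
        while (ie.Busy || ie.ReadyState != tagREADYSTATE.READYSTATE_COMPLETE)
            Thread.Sleep(100);

        var doc = (HTMLDocument)ie.Document;
        ((IHTMLInputElement)doc.all.item("q", 0)).value = "what you want to put in text box";
        ((IHTMLElement)doc.all.item("btnG", 0)).click();
    }
}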
3) Although Chrome's developer tools appear to show all interaction, I find them a bit cumbersome to use. I'd be interested in seeing all of the raw communication (unencrypted, even though the site is only accessed via HTTPS), so I can see if I'm really replicating all of the steps.
For this you can use Fiddler to view all the interaction going on and the raw data going back and forth. To make it work with HTTPS, you will need to install Fiddler's root certificate to enable decryption of the traffic.