Automatically check if a web page's source code has changed - C#

I am new to C#/WPF programming and I am trying to automatically update a local copy of a page's source code whenever the source code for that page has changed. Is there a way to check the source code, say, every other day without me having to go in and manually do a diff?
To get the source code for the website I have:
private bool getSourceCode(string UserInputSub)
{
    // insert error catching..
    using (WebClient webClient = new WebClient()) // get source code of the page; the user enters the URL
    {
        string s = webClient.DownloadString(UserInputSub);
        string fixedString = s.Replace("\n", "\r\n");
        string desktopPath = Environment.GetFolderPath(Environment.SpecialFolder.Desktop);
        string filePath = desktopPath + "\\SourceCode.txt";
        using (var wr = new System.IO.StreamWriter(filePath))
        {
            wr.Write(fixedString); // writes the source to the file
        }
    }
    return true;
}
This only runs while the program is being run. I would like the text file it produces to be updated without the user having to run the program.

Add a timer to the project, set the interval to 86,400,000 ms (24 hours), and then call your function in the tick event. It's not the best solution; adding it as a cron-style scheduled task would be better, but if it's a dedicated machine it will certainly work.
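A minimal sketch of that idea, with the check logic passed in as a callback so the class is self-contained (the class and member names below are illustrative, not part of the original code):

```csharp
using System;
using System.Timers;

// Sketch: while the app is open, invoke a check callback every 24 hours.
// In the question's code, the callback would wrap getSourceCode.
public class DailyChecker
{
    private readonly Timer _timer;

    public DailyChecker(string url, Action<string> check)
    {
        // 24 h * 60 min * 60 s * 1000 ms = 86,400,000 ms
        _timer = new Timer(24 * 60 * 60 * 1000);
        _timer.AutoReset = true; // keep firing every interval, not just once
        _timer.Elapsed += (sender, e) => check(url);
    }

    public double IntervalMs { get { return _timer.Interval; } }

    public void Start() { _timer.Start(); }
    public void Stop() { _timer.Stop(); }
}
```

Note the timer only fires while the application is running, which is exactly the limitation discussed below.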

The simple solution would be to write a Windows service that fires once a day and does this for you.
It'll add some complexity, but do what you want.
edit:
If you want this as part of the Windows application, you can set a timer or poll every x amount of time, but then your application needs to be open all the time for this to happen.
If you want to collect the data independently of the Windows program that uses it, you'll have to have a separate service running in the background. Of course, you could have a simple console window in the background do it for you, but that's so hackish it should be illegal.
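Whichever scheduling approach is used, the manual diff itself can be avoided by comparing a hash of the fresh download against the saved copy. A sketch, assuming both versions are available as strings (the class and method names are made up for illustration):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Sketch: detect a change by comparing SHA-256 hashes of the page source.
public static class SourceChangeDetector
{
    // Returns a lowercase hex SHA-256 hash of the page source.
    public static string Hash(string source)
    {
        using (var sha = SHA256.Create())
        {
            byte[] bytes = sha.ComputeHash(Encoding.UTF8.GetBytes(source));
            var sb = new StringBuilder();
            foreach (byte b in bytes) sb.Append(b.ToString("x2"));
            return sb.ToString();
        }
    }

    // True when the newly downloaded source differs from the saved copy.
    public static bool HasChanged(string newSource, string savedSource)
    {
        return Hash(newSource) != Hash(savedSource);
    }
}
```

Storing only the previous hash (rather than the whole file) is enough if you just need a changed/unchanged answer before overwriting SourceCode.txt.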


Is it possible to make a HIDDEN and unseen Call to a web browser for a simple single, invisible .PHP script/page with C#?
Guys, I've got one simple thing that I'm doing from my main C# app, which is a program that does text-message alerts. What I'd like to do:
-After every text-message send, call a simple and easy .PHP script with the results of the text-message send: STATUS=GOOD||BAD||UNKNOWN, number, carrier.
This quick and easy .PHP script and call is just there to keep a database of known working (and non-working) numbers, a simple and modest task. I just don't want the ugly web browser (and its multiple tabs) being shown at all. I want it to be invisible, or else I can't use this .PHP method of tracking the good and bad text numbers. It's not going to work if I can't just hide the browser window somehow, or alternately call the .PHP script some other way (which has no output: all it does is write what the status, number, and carrier are to a flat file and exit, quick and easy like I said...).
But guys, tell me if this is just not possible the way I want it with my C#/.NET app here...?
Thanks in advance!
I don't know why you are doing this with PHP instead of doing everything you need in your C# app, but you can do it by running a hidden command-line process like this:
var startInfo = new System.Diagnostics.ProcessStartInfo
{
    WindowStyle = System.Diagnostics.ProcessWindowStyle.Hidden,
    FileName = "cmd.exe",
    Arguments = "/C php <yourscript>",
    // Both of these are required so StandardOutput can be read below:
    UseShellExecute = false,
    RedirectStandardOutput = true
};
var process = new System.Diagnostics.Process
{
    StartInfo = startInfo
};
process.Start();
and then reading the output:
var output = process.StandardOutput.ReadToEnd();
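As an alternative sketch: if the .PHP script is reachable over HTTP, no browser or shell is needed at all, because a plain web request from C# shows no UI. The URL and parameter names below are invented for illustration and would need to match your actual script:

```csharp
using System;
using System.Net;

// Sketch: report a send result to a hypothetical PHP script with a plain GET.
public static class StatusReporter
{
    // Builds the request URL; everything here is an illustrative assumption.
    public static string BuildUrl(string status, string number, string carrier)
    {
        return "http://example.com/report.php" +
               "?status=" + Uri.EscapeDataString(status) +
               "&number=" + Uri.EscapeDataString(number) +
               "&carrier=" + Uri.EscapeDataString(carrier);
    }

    public static void Report(string status, string number, string carrier)
    {
        using (var client = new WebClient())
        {
            // The script produces no output, so the returned string is ignored.
            client.DownloadString(BuildUrl(status, number, carrier));
        }
    }
}
```

Since the script writes to a flat file and exits, a fire-and-forget GET like this keeps the whole round trip invisible.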

C# and ExtendScript for batch processing of .incx files

I have a ton of .incx text documents clustered into their own individual subfolders that I need to iterate through and convert to plaintext as part of a C# winform app I've created. I have the latest version of InCopy and the ExtendScript Toolkit, and a .jsx script that works great to quietly and quickly create my plaintext files.
My problem/question is that there isn't much guidance on how to best launch this from within a C# class in a running 3rd party app, sending in relevant info. When I run my .jsx script, I need to send it a target folder from my app where it can find the .incx files.
The target folder(s) will be dynamic depending on other previous actions in my app.
I've found a few vague hints to solutions on Adobe's forums involving additional .vbs files and/or external temp files to hold arguments, but they're all pretty dated, so I thought I'd ask and see if anyone knew of a contemporary method. If anything is unclear, I'll respond right away to clarify.
Through a lot more Googling and my own trial and error, I have found my answer.
The best way I can find is to do all of my InCopy scripting in VBS and then use a Process instance to send in my arg(s) with cscript.
Example C#:
Process myScriptProc = new Process();
myScriptProc.StartInfo.FileName = "cscript";
myScriptProc.StartInfo.WorkingDirectory = rootDir + "\\"; // rootDir being the path where my vbs lives
myScriptProc.StartInfo.Arguments = "MyScript.vbs " + filesPath; // filesPath is the arg sent to the script
myScriptProc.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
myScriptProc.Start();
myScriptProc.WaitForExit();
myScriptProc.Close();
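One detail to watch: if filesPath contains spaces, cscript will see it as several arguments. A small sketch of quoting the values before passing them (the helper name is made up):

```csharp
using System;

public static class ArgumentHelper
{
    // Wraps the script name and path in quotes so cscript receives the
    // path as a single argument even when it contains spaces.
    public static string BuildArguments(string scriptName, string path)
    {
        return "\"" + scriptName + "\" \"" + path + "\"";
    }
}
```

Usage would be `myScriptProc.StartInfo.Arguments = ArgumentHelper.BuildArguments("MyScript.vbs", filesPath);`.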
MyScript.vbs
main
Function main()
    Set myInCopy = CreateObject("InCopy.Application.CC.2015")
    Set obj = CreateObject("Scripting.FileSystemObject")
    myInCopy.ScriptPreferences.UserInteractionLevel = 1699640946
    myFormat = 1952412773
    myExtension = ".txt"
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    objStartFolder = WScript.Arguments(0)
    Set objFolder = objFSO.GetFolder(objStartFolder)
    Set colFiles = objFolder.Files
    For Each x In colFiles
        If LCase(objFSO.GetExtensionName(x.name)) = "incx" Then
            thisDoc = x
            Set myDoc = myInCopy.open(thisDoc)
            Set myStory = myInCopy.ActiveDocument.Stories.Item(1)
            parts = split(x.Name, ".")
            myFilePath = objStartFolder & "/" & parts(0) & myExtension
            myStory.Export myFormat, myFilePath
            myDoc.close()
            obj.DeleteFile(thisDoc)
        End If
    Next
    myInCopy.ScriptPreferences.UserInteractionLevel = 1699311169
End Function
I rewrote my JavaScript file in VBScript because judging from the tumbleweeds blowing through the Adobe forums, I was never going to get any answers as to why their documentation examples for calling DoJavaScriptFile produce object missing method errors.
The biggest hurdle I ran into after redoing my script in VB was that you have to use the super-secret enumerated decimal values for Adobe-specific things if you run the scripts externally. If you look at MyScript.vbs you'll see a few instances of what look like random 10 digit values. Those come from here:
http://jongware.mit.edu/idcs5js_html_3.0.3i/idcs5js/index_Enum%20Suite.html
Bless the guy who created that resource, because I couldn't find that information in any of Adobe's documentation to save my life.
TL;DR: If you're trying to automate using processes and scripts that run outside an Adobe app, do everything in VBScript, and beware the mystery decimal enumerations.
useless footnote:
MyScript.vbs here reads all *.incx files from the passed in directory, exports as plain .txt (with the same filename, into the same dir), and deletes the original.

.Net Application logging

Hello all. I have a scenario with one WinForms app as a server and an unlimited number of WinForms apps as clients.
Basically, each client connects to the server and sends it a string. The server then does some calculations and returns a string to the client. To do the calculation, the server has to connect to a second server, and it stores the response from that second server in a string variable. After some specific time interval it shows that string variable in a textbox. But this string gets bigger and bigger after each calculation, and so my server sometimes consumes 1 GB of memory in Task Manager and 40% of my CPU. When I removed the string variable, my server ran at 45 MB of memory and 0-4% CPU usage. I am appending to the string variable like this:
Serverlog += datafetched + "cl";
I have also tried a StringBuilder object, but the result is the same. Can anyone help me sort this out (how can I save logs without consuming too much memory)? One thing more: the logs will not be maintained in any file; they are only for showing in the textbox.
The best solution is to store your logging somewhere: database / file / Windows event log / other.
What kind of app are you running on the clients? Be aware that you should use the AppendText method of the textbox. So don't use:
Textbox.Text += "additional info";
but use:
Textbox.AppendText(teTonenTekst + Environment.NewLine);
While logging to a file is best, you mentioned you do not want that.
For UI-based logging, I usually avoid a TextBox and instead use a ListView or DataGridView with hidden gridlines. That way it is easy to truncate the number of entries to a limit, keeping only recent data in the control.
It is also easier to color code different types of logging data.
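The truncation idea can be sketched independently of any UI control: keep a bounded buffer of recent entries and bind the ListView or DataGridView to it. The class below is illustrative; the limit is whatever suits your app:

```csharp
using System.Collections.Generic;

// Bounded log buffer: keeps only the most recent entries so memory stays flat
// no matter how long the server runs.
public class BoundedLog
{
    private readonly int _limit;
    private readonly LinkedList<string> _entries = new LinkedList<string>();

    public BoundedLog(int limit) { _limit = limit; }

    public void Add(string entry)
    {
        _entries.AddLast(entry);
        if (_entries.Count > _limit)
            _entries.RemoveFirst(); // drop the oldest entry
    }

    public int Count { get { return _entries.Count; } }
    public string Oldest { get { return _entries.First.Value; } }
}
```

After each append, the UI control is refreshed from the buffer, so the displayed log can never grow past the limit.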
You can write the text to a file, MSMQ, or Telnet and clear the variable. When displaying, read the contents back from one of the above-mentioned sources.
You should use a text file for logging application actions.
I would suggest storing the logs in a file instead of keeping everything in a variable. I personally always do that with a logging function that creates an individual log file for each user each day. That way you have better control over your logs and don't have to worry about your performance problem. Have a look at this example:
internal static void WriteLog(string str, string clientNumber)
{
    StreamWriter logWriter = null;
    string todayDateString = DateTime.Now.Day.ToString() + "-" + DateTime.Now.Month.ToString() + "-" + DateTime.Now.Year.ToString();
    string fullLogFileName = todayDateString + "_" + clientNumber + "_log.txt";
    string LogPath = @"\\server\folder\Logs\";
    string fullLogFilePathWithName = LogPath + fullLogFileName;
    if (!File.Exists(fullLogFilePathWithName))
    {
        logWriter = new StreamWriter(fullLogFilePathWithName, true);
        logWriter.WriteLine(DateTime.Now.ToString("d.MM.yyyy h:mm") + " - " + str);
        logWriter.Flush();
    }
    else
    {
        logWriter = File.AppendText(fullLogFilePathWithName);
        logWriter.WriteLine(DateTime.Now.ToString("d.MM.yyyy h:mm") + " - " + str);
        logWriter.Flush();
    }
    logWriter.Close();
}
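For what it's worth, the same idea can be written more compactly, since File.AppendAllText creates the file on its first use, making the exists/not-exists branch unnecessary. The directory path and date format below mirror the example above but are otherwise illustrative:

```csharp
using System;
using System.IO;

internal static class Logger
{
    // Builds the per-client, per-day log file name used in the example above.
    internal static string LogFileName(DateTime date, string clientNumber)
    {
        // "d-M-yyyy" matches Day + "-" + Month + "-" + Year (no zero padding)
        return date.ToString("d-M-yyyy") + "_" + clientNumber + "_log.txt";
    }

    internal static void WriteLog(string logDir, string str, string clientNumber)
    {
        string path = Path.Combine(logDir, LogFileName(DateTime.Now, clientNumber));
        // AppendAllText creates the file on first use and appends afterwards,
        // opening and closing the file in one call.
        File.AppendAllText(path, DateTime.Now.ToString("d.MM.yyyy h:mm") + " - " + str + Environment.NewLine);
    }
}
```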
.NET already has built-in support for tracing/logging:
http://msdn.microsoft.com/en-us/library/zs6s4h68.aspx
However, we use log4net, and I'm quite happy with it:
http://logging.apache.org/log4net/
From your question it is not quite clear whether the log is displayed by the server or the client. However, log4net has support for logging over the network, e.g. a UDP appender.

FileSystemWatcher and Monitoring Config File Changes

I have about 5-6 Server Manager programs that write their own configuration file to a particular folder, such as C:\ACME. The config files all end with "*ServerConfig.cfg", where * is the name of the program that created it.
I have a Windows service that has a FileSystemWatcher setup that I want to FTP the configuration files each time the program updates. I've gotten everything to work, but I'm noticing that the different Server Manager programs are behaving differently.
When saving a configuration file, the FileSystemWatcher is picking up two "change" events. This is causing my program to FTP the configuration file twice where I only need it once.
In other instances I'm seeing where it may create 4, 5, or 6 "change" events when saving a configuration file.
What is the best way to handle processing/FTPing these files exactly once, when they are really done saving?
I really don't want to set something up to poll the directory for file changes every so often... and I like the idea that each time a configuration is saved, I get a duplicate copy with a date/timestamp appended to the filename, copied elsewhere.
I have seen lots of suggestions Googling around and even here on Stackoverflow, but nothing that seems to be all-in-one for me.
I suppose I could put the filename in a queue when a "change" event occurs, if it isn't already in the queue. Not sure if this is the best approach.
Here is my sample code:
Startup-code:
private DateTime _lastTimeFileWatcherEventRaised = DateTime.Now;
_watcherCFGFiles = new FileSystemWatcher();
_watcherCFGFiles.Path = @"C:\ACME";
_watcherCFGFiles.IncludeSubdirectories = true;
_watcherCFGFiles.Filter = "*ServerConfig.cfg";
_watcherCFGFiles.NotifyFilter = NotifyFilters.Size;
//_watcherCFGFiles.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.FileName;
_watcherCFGFiles.Changed += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Created += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Deleted += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Renamed += new RenamedEventHandler(LogFileSystemRenaming);
_watcherCFGFiles.Error += new ErrorEventHandler(LogBufferError);
_watcherCFGFiles.EnableRaisingEvents = true;
Here is the actual handler for the "change" event. I'm skipping the first "change" event if the second arrives within 700 ms, but this doesn't account for the files that raise 3-4 change events...
void LogFileSystemChanges(object sender, FileSystemEventArgs e)
{
    string log = string.Format("{0} | {1}", e.FullPath, e.ChangeType);
    if (e.ChangeType == WatcherChangeTypes.Changed)
    {
        if (DateTime.Now.Subtract(_lastTimeFileWatcherEventRaised).TotalMilliseconds < 700)
        {
            return;
        }
        _lastTimeFileWatcherEventRaised = DateTime.Now;
        LogEvent(log);
        // Process file
        FTPConfigFileUpdate(e.FullPath);
    }
}
I had the exact same issue. I used a HashMap that mapped filenames to times of writes, I then used this as a lookup table for files to check and see if the changed event had been applied very quickly. I defined some epsilon (for me it was about 2 seconds to make sure events were flushed). If the time found in the map was older than that I would put it on a queue to be processed. Essentially all I had to do was keep the HashMap up to date with events and changes and this worked out (although you may want to change your epsilon value depending on your application).
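In C# terms, that lookup-table idea might look like the sketch below: a Dictionary keyed by file path, with the two-second epsilon the answer suggests. The class name is made up; time is passed in explicitly so the logic is easy to verify:

```csharp
using System;
using System.Collections.Generic;

// Debounces duplicate FileSystemWatcher events on a per-path basis.
public class ChangeDebouncer
{
    private readonly TimeSpan _epsilon;
    private readonly Dictionary<string, DateTime> _lastSeen = new Dictionary<string, DateTime>();

    public ChangeDebouncer(TimeSpan epsilon) { _epsilon = epsilon; }

    // Returns true only when enough time has passed since the last
    // event seen for this path; duplicates inside a burst return false.
    public bool ShouldProcess(string path, DateTime now)
    {
        DateTime last;
        if (_lastSeen.TryGetValue(path, out last) && now - last < _epsilon)
        {
            _lastSeen[path] = now; // still inside the burst: refresh and skip
            return false;
        }
        _lastSeen[path] = now;
        return true;
    }
}
```

In the handler above, the FTP upload would only run when ShouldProcess(e.FullPath, DateTime.Now) returns true, which replaces the single shared 700 ms timestamp with a per-file check.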
This behavior is normal, because antivirus software or other programs perform extra writes when a file's content changes. I usually create a (global) Hashtable and check if the filename exists in it; if it doesn't, I put the filename in and start an asynchronous operation to remove the filename after 3-5 seconds.
This is expected behavior - so you need to figure out how to handle it in your particular case.
The file system does not have a concept of "program done working with this file". I.e., one can write an editor that updates (open/write/close) the file on every keystroke. The file system will report a lot of updates, but from the user's point of view there is only one update, when the editor is closed.

Delete uploaded file if it is not moved to another folder

I am developing a commenting system with asp.net. A user can attach an image with "Attach" button and post the comment with "Post" button. Uploading the image starts when the user attaches it. An ASHX handler saves the uploaded file to "temp" folder. If the user clicks "Post" button, I move the image into a safe place. If he doesn't click "Post", closes the browser and goes away, the file remains in the "temp" folder. How can I delete a file from this "temp" folder one hour later after it is uploaded?
Details:
I thought of using System.Timers.Timer in the ASHX file used for uploading:
System.Timers.Timer timer;
string fileName;

public void Cleaner()
{
    timer = new System.Timers.Timer(3000); // 3 seconds
    timer.Elapsed += new System.Timers.ElapsedEventHandler(timer_Elapsed);
    timer.Start();
}

protected void timer_Elapsed(object sender, System.Timers.ElapsedEventArgs a)
{
    timer.Stop();
    timer.Close();
    string path = "temp";
    string mapPath = HttpContext.Current.Server.MapPath("../" + path);
    FileInfo theFile = new FileInfo(mapPath + "\\" + fileName);
    if (theFile.Exists) File.Delete(mapPath + "\\" + fileName);
}

public void ProcessRequest(HttpContext context)
{
    // Saving uploaded file
    Cleaner();
}
but I feel that I am not doing this right.
The timer ticks after 3 seconds, but HttpContext.Current in the timer_Elapsed() function is null. Besides, the file name is also null by the time the timer ticks; I couldn't find a way to pass the file name as a parameter when binding the event. Simply put, it is problematic. I am looking for a more elegant way to delete the uploaded file one hour after it is uploaded.
I would avoid timers, as you will create one timer per file, which will not scale well at all.
How about this: run a cleanup process on another thread in the web app, started on app start, that deletes temp files each time a session expires. That way you need no timers, because the process is prompted each time a session expires. You will need a class that stores a reference (by unique name, I guess) to the files which are still live (by that I mean the session to which they belong is still alive), which the cleanup process can check.
LMK if you want some code pointers.
HttpContext.Current should be null, as the context died as soon as the response was sent.
If you were using Unix, I would suggest writing a script and running it with cron. But it seems you are using Windows.
So, write a program (.exe) that deletes files (even better, only image files) from the temp directory based on creation date. Google it and you will find lots of tutorials on how to do it. Deleting a file is one line of code. If you are using the system temp dir, that is another line of code. If you are using a custom temp dir, you already know the path.
If you want to check the creation-time property (or the last-modified-time property), you need to write a few more lines.
Now schedule the exe as per your requirement using the Windows Task Scheduler, or use one of the 3rd-party task schedulers available for Windows.
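A sketch of such a cleanup program, with the age check split into its own method so it can be verified separately (the one-hour cutoff comes from the question; the directory handling is illustrative):

```csharp
using System;
using System.IO;

// Sketch: scheduled cleanup of stale files in a temp upload folder.
public static class TempCleaner
{
    // True when a file created at 'created' is older than 'maxAge' as of 'now'.
    public static bool IsExpired(DateTime created, DateTime now, TimeSpan maxAge)
    {
        return now - created > maxAge;
    }

    // Deletes files in 'tempDir' whose creation time is more than one hour old.
    public static void Clean(string tempDir)
    {
        foreach (string file in Directory.GetFiles(tempDir))
        {
            if (IsExpired(File.GetCreationTime(file), DateTime.Now, TimeSpan.FromHours(1)))
                File.Delete(file);
        }
    }
}
```

Scheduling this exe to run, say, every 15 minutes keeps the temp folder bounded without any timers inside the web app.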
