Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking for code must demonstrate a minimal understanding of the problem being solved. Include attempted solutions, why they didn't work, and the expected results. See also: Stack Overflow question checklist
Closed 9 years ago.
I currently have a web application with certain filters. When a filter is applied, a method from a class library is called and the corresponding activities are performed, such as processing records and placing files at certain locations on different servers.
I want to automate this process. How can I achieve this, and what is the best possible way? I have read about various options (Windows services, cache item callbacks, workflows) but have not been able to evaluate them. Please help me.
You can create a console application for the process and then schedule it through the Windows Task Scheduler. Or you may look at Quartz.NET, an enterprise job scheduler for the .NET platform.
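The console-application approach can be sketched as follows. This is a Python illustration of the shape only, not the .NET implementation: in practice the scheduled program would be a .NET console exe triggered by Task Scheduler or a Quartz.NET trigger, and `process_records` is a hypothetical stand-in for the class-library method the web filters currently call.

```python
import sched
import time

runs = []

def process_records():
    """Hypothetical stand-in for the class-library method the web filters call."""
    runs.append("processed")

# The scheduler fires the job and the job re-schedules itself, mimicking
# what Windows Task Scheduler or a Quartz.NET trigger would do for you.
scheduler = sched.scheduler(time.monotonic, time.sleep)

def run_job(interval_seconds, remaining):
    if remaining <= 0:
        return
    process_records()
    scheduler.enter(interval_seconds, 1, run_job, (interval_seconds, remaining - 1))

scheduler.enter(0, 1, run_job, (0.01, 3))  # tiny interval so the demo finishes fast
scheduler.run()
print(len(runs))  # 3
```

The advantage of an external scheduler over in-process timers (or cache-item callback tricks) is that the job survives application-pool recycles and runs even when the web app is idle.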
I need to create a C++ application and a C# application that share a region of memory between them, so that both can read from and write to this region. If you have any documents or solutions, can you share them with me? Thanks in advance.
Use a named pipe so the two processes can communicate (e.g. a C# server and a C++ client). You can also see "Sharing Files and Memory" on MSDN.
You can use memory-mapped files to communicate between applications, along with mutexes and events to signal between them.
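A minimal sketch of the memory-mapped-file idea, in Python for brevity (the C# side would use `MemoryMappedFile.CreateOrOpen` and the C++ side `CreateFileMapping`/`MapViewOfFile`; the temp file here stands in for a named mapping). Two independently opened views of the same backing file see each other's writes:

```python
import mmap
import os
import tempfile

# Create a small backing file to share; a named kernel mapping works the same way.
fd, path = tempfile.mkstemp()
os.ftruncate(fd, 64)

with open(path, "r+b") as f1, open(path, "r+b") as f2:
    writer = mmap.mmap(f1.fileno(), 64)   # plays the role of the C# process
    reader = mmap.mmap(f2.fileno(), 64)   # plays the role of the C++ process

    writer[0:5] = b"hello"                # write through one view
    message = bytes(reader[0:5])          # read through the other
    print(message)                        # b'hello'

    writer.close()
    reader.close()

os.close(fd)
os.unlink(path)
```

In a real cross-process setup you would also need the mutex/event signalling mentioned above, because nothing here tells the reader *when* the writer has finished a message.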
I have a library with logging functionality and an ASP.NET web application which uses this library. I need to write logs to the web application's folder. When writing logs from the web app itself I use HttpContext.Current.Request.ApplicationPath, but how can I get the path of the current web application from the external library? Thank you.
You can set the path from the web application, for example in your Global.asax's Application_Start method:
protected void Application_Start()
{
    // Tell the library where to write logs, resolved against the web app's root.
    YourLogLibrary.StaticLogConfiguration.LogDirectory = Server.MapPath("~/Logs");
}
Long explanation made short: I need to supply a proper user agent when posting data to my server. Why? I don't know entirely. In any case, I've read that making a web request beforehand and grabbing the user agent header from it would suffice, but I'd like to know if there is a cleaner method.
You can indeed just grab one from your favourite browser, or pick one from a published list of common user agent strings.
The user agent string is just that: a string containing various information about the browser. So it's just a matter of passing it along with your request. If your program will live for a while, I'd try to pick one that's as generic as possible.
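Passing the string along with the request looks like this in Python's urllib (conceptually the same as setting `HttpWebRequest.UserAgent` in .NET). The endpoint URL is hypothetical, and the user agent below is just a sample copied from a desktop browser:

```python
from urllib.request import Request

# A generic desktop-browser user agent; copy a current one from your own browser.
USER_AGENT = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"

req = Request(
    "https://example.com/post-endpoint",   # hypothetical endpoint
    data=b"field=value",                   # presence of data makes this a POST
    headers={"User-Agent": USER_AGENT},
)
print(req.get_header("User-agent"))
```

No prior "warm-up" request is needed; the header is set directly on the outgoing request.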
Given a website URL, is there a way to traverse all the links on the website and keep track of all the pages in a text file or something similar? I want to use Selenium for this.
However, some of the links open pop-up dialogs and appear in the header and footer of every page, so I obviously need to keep track of visited links and not go back to them again.
Thanks.
Try Scrapy: http://scrapy.org/
Scrapy is a fast, high-level screen scraping and web crawling framework, used to crawl websites and extract structured data from their pages. It can be used for a wide range of purposes, from data mining to monitoring and automated testing.
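Whichever tool you use (Selenium, Scrapy, or plain HTTP requests), the core of "don't revisit links" is a breadth-first walk with a visited set. A minimal sketch over a stand-in link graph (a dict here, instead of live page fetches; in real use you would load each URL and extract the `<a href>` values):

```python
from collections import deque

# Stand-in for the site: page URL -> links found on that page.
# Note "/" appears in every page's links, like a header/footer link would.
SITE = {
    "/": ["/about", "/contact", "/"],
    "/about": ["/", "/team"],
    "/contact": ["/"],
    "/team": ["/about"],
}

def crawl(start):
    visited = set()
    queue = deque([start])
    order = []
    while queue:
        url = queue.popleft()
        if url in visited:
            continue          # already seen: skips repeated header/footer links
        visited.add(url)
        order.append(url)
        queue.extend(SITE.get(url, []))
    return order

pages = crawl("/")
print(pages)   # ['/', '/about', '/contact', '/team']
```

The `order` list is exactly what you would dump to your text file; Scrapy implements this deduplication for you via its built-in request fingerprinting.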
I have a task that I am unable to do. The task is as follows: I need to develop an ASP.NET web site which has the following two sections:
A text box where the user can type a message and hit a broadcast button.
An area where all the messages broadcast by any user can be seen (a real-time message feed; new messages should appear without reloading the page or hitting a button).
If WCF is not mandatory, look into SignalR.
From signalr.net:
ASP.NET SignalR is a new library for ASP.NET developers that makes it incredibly simple to add real-time web functionality to your applications. What is "real-time web" functionality? It's the ability to have your server-side code push content to the connected clients as it happens, in real-time.
It's easy to implement, well documented, and it can be used for more than just browser/HTML-based clients. They even have some sample chat applications that might point you in the right direction.
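Conceptually, a SignalR hub is a server-side publish/subscribe point: every connected client registers a callback, and a broadcast invokes all of them. A Python sketch of that flow (the real SignalR API differs; the class and method names here are illustrative only):

```python
class ChatHub:
    """Toy broadcast hub: clients subscribe, every message fans out to all."""

    def __init__(self):
        self.clients = []

    def connect(self, callback):
        # In SignalR terms: a client opens a connection to the hub.
        self.clients.append(callback)

    def broadcast(self, user, message):
        # In SignalR terms: Clients.All receives the pushed message.
        for push in self.clients:
            push(user, message)

hub = ChatHub()
inbox_a, inbox_b = [], []
hub.connect(lambda u, m: inbox_a.append((u, m)))
hub.connect(lambda u, m: inbox_b.append((u, m)))

hub.broadcast("alice", "hello everyone")
print(inbox_b)   # [('alice', 'hello everyone')]
```

The part SignalR adds on top of this shape is the transport: it keeps a live channel to each browser (WebSockets, with fallbacks) so the server-side push actually reaches the page without a reload.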