I have a C# program which queries a database (server3) to determine the files a user is after, then copies those files from (server1) to (server2).
To simplify that further:
C# application is executed on a desktop computer
Original files are on server1
Files are to be copied to server2
Server3 contains the database
When I run this program on my desktop, everything works fine except for server1, which seems to almost grind to a halt after about 5 minutes, even though the copying process keeps working fine beyond that point. Any other application or user that tries to connect to that server can't.
They just get a spinning cursor, which only stops if I stop running the program on my desktop. For the first 5 minutes of the copying process, everything is fine for everyone. Beyond the 5 minute mark, the files continue to copy, but that's when others start experiencing connection problems to server1.
I have even tried using sleep, as I assumed the slowdown was caused by too much network activity and/or too much disk I/O on server1. Sleep did nothing to help; the same problem continues. So I'm guessing the problem is happening for some other reason.
I am using code similar to this to copy the files:
while (reader1.Read())
{
    // System.Threading.Thread.Sleep(2000);
    System.IO.File.Copy(source, destination);
}
Why is this happening?
According to this article, the main cause of the slowdown is the use of buffering by the file copy.
On Windows Vista or later, it's possible to avoid using buffering by specifying COPY_FILE_NO_BUFFERING to the CopyFileEx() Windows API function.
You can specify the P/Invoke as follows:
enum CopyProgressResult : uint
{
    PROGRESS_CONTINUE = 0,
    PROGRESS_CANCEL = 1,
    PROGRESS_STOP = 2,
    PROGRESS_QUIET = 3
}

enum CopyProgressCallbackReason : uint
{
    CALLBACK_CHUNK_FINISHED = 0x00000000,
    CALLBACK_STREAM_SWITCH = 0x00000001
}

delegate CopyProgressResult CopyProgressRoutine(
    long TotalFileSize,
    long TotalBytesTransferred,
    long StreamSize,
    long StreamBytesTransferred,
    uint dwStreamNumber,
    CopyProgressCallbackReason dwCallbackReason,
    IntPtr hSourceFile,
    IntPtr hDestinationFile,
    IntPtr lpData);

[Flags]
enum CopyFileFlags : uint
{
    COPY_FILE_FAIL_IF_EXISTS = 0x00000001,
    COPY_FILE_RESTARTABLE = 0x00000002,
    COPY_FILE_OPEN_SOURCE_FOR_WRITE = 0x00000004,
    COPY_FILE_ALLOW_DECRYPTED_DESTINATION = 0x00000008,
    COPY_FILE_COPY_SYMLINK = 0x00000800, // NT 6.0+
    COPY_FILE_NO_BUFFERING = 0x00001000
}

[DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
[return: MarshalAs(UnmanagedType.Bool)]
static extern bool CopyFileEx(
    string lpExistingFileName,
    string lpNewFileName,
    CopyProgressRoutine lpProgressRoutine,
    IntPtr lpData,
    ref Int32 pbCancel,
    CopyFileFlags dwCopyFlags);
Then call it like this (substituting your own file names):
int cancel = 0;
CopyFileEx(@"C:\tmp\test.bin", @"F:\test.bin", null, IntPtr.Zero, ref cancel, CopyFileFlags.COPY_FILE_NO_BUFFERING);
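To tie that back to your original loop, here is a rough sketch of what the copy code might look like with CopyFileEx in place of File.Copy. It assumes reader1, source and destination come from your existing database code (not shown here):
int cancel = 0;
while (reader1.Read())
{
    // source and destination are built from the current row, as in the original code.
    bool ok = CopyFileEx(source, destination, null, IntPtr.Zero,
                         ref cancel, CopyFileFlags.COPY_FILE_NO_BUFFERING);
    if (!ok)
    {
        // Marshal.GetLastWin32Error() gives the underlying Win32 error code.
        throw new System.ComponentModel.Win32Exception(
            System.Runtime.InteropServices.Marshal.GetLastWin32Error());
    }
}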
It might be worth trying this and seeing if it helps.
Related
I am currently trying to write a console application in C# with two screen buffers, which should be swapped back and forth (much like VSync on a modern GPU). Since the System.Console class does not provide a way to switch buffers, I had to P/Invoke several methods from kernel32.dll.
This is my current code, grossly simplified:
static void Main(string[] args)
{
    IntPtr oldBuffer = GetStdHandle(-11); // Gets the handle for the default console buffer
    IntPtr newBuffer = CreateConsoleScreenBuffer(0, 0x00000001, IntPtr.Zero, 1, 0); // Creates a new console buffer

    /* Write data to newBuffer */

    SetConsoleActiveScreenBuffer(newBuffer);
}
The following things occurred:
The screen remains empty, even though it should be displaying newBuffer
When writing to oldBuffer instead of newBuffer, the data appears immediately. Thus, my way of writing into the buffer should be correct.
Upon calling SetConsoleActiveScreenBuffer(newBuffer), the error code is now 6, which means invalid handle. This is strange, as the handle is not -1, which the documentation describes as invalid.
I should note that I very rarely worked with the Win32 API directly and have very little understanding of common Win32-related problems. I would appreciate any sort of help.
As IInspectable points out in the comments, you're setting dwDesiredAccess to zero. That gives you a handle with no access permissions. There are some edge cases where such a handle is useful, but this isn't one of them.
The only slight oddity is that you're getting "invalid handle" rather than "access denied". I'm guessing you're running Windows 7, so the handle is a user-mode object (a "pseudohandle") rather than a kernel handle.
At any rate, you need to set dwDesiredAccess to GENERIC_READ | GENERIC_WRITE as shown in the sample code.
Also, as Hans pointed out in the comments, the declaration on pinvoke.net was incorrect, specifying the last argument as a four-byte integer rather than a pointer-sized integer. I believe the correct declaration is
[DllImport("kernel32.dll", SetLastError = true)]
static extern IntPtr CreateConsoleScreenBuffer(
uint dwDesiredAccess,
uint dwShareMode,
IntPtr lpSecurityAttributes,
uint dwFlags,
IntPtr lpScreenBufferData
);
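With that declaration in place, a corrected call would look something like this (the constant values are the standard ones from the Windows SDK headers):
const uint GENERIC_READ = 0x80000000;
const uint GENERIC_WRITE = 0x40000000;
const uint FILE_SHARE_READ = 0x00000001;
const uint FILE_SHARE_WRITE = 0x00000002;
const uint CONSOLE_TEXTMODE_BUFFER = 1;

// Request read/write access so the returned handle can actually be written to
// and made the active screen buffer.
IntPtr newBuffer = CreateConsoleScreenBuffer(
    GENERIC_READ | GENERIC_WRITE,
    FILE_SHARE_READ | FILE_SHARE_WRITE,
    IntPtr.Zero,
    CONSOLE_TEXTMODE_BUFFER,
    IntPtr.Zero);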
I'm working on a project which requires copying a lot of files and directories while preserving their original timestamps. So I need to make many calls to the target's SetCreationTime(), SetLastWriteTime() and SetLastAccessTime() methods in order to copy the original values from source to target. As the screenshot below shows, these simple operations take up to 42% of the total computation time.
Since this is limiting my whole application's performance tremendously, I'd like to speed things up. I assume that each of these calls has to open and close a new stream to the file/directory. If that's the reason, I'd like to leave this stream open until I've finished writing all attributes. How do I accomplish this? I guess this would require the use of some P/Invoke.
Update:
I followed Lukas' advice to use the WinAPI method CreateFile(..) with the FILE_WRITE_ATTRIBUTES flag. In order to P/Invoke the mentioned method I created the following wrapper:
public class Win32ApiWrapper
{
    [DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Auto)]
    private static extern SafeFileHandle CreateFile(string lpFileName,
        [MarshalAs(UnmanagedType.U4)] FileAccess dwDesiredAccess,
        [MarshalAs(UnmanagedType.U4)] FileShare dwShareMode,
        IntPtr lpSecurityAttributes,
        [MarshalAs(UnmanagedType.U4)] FileMode dwCreationDisposition,
        [MarshalAs(UnmanagedType.U4)] FileAttributes dwFlagsAndAttributes,
        IntPtr hTemplateFile);

    public static SafeFileHandle CreateFileGetHandle(string path, int fileAttributes)
    {
        return CreateFile(path,
            (FileAccess)(EFileAccess.FILE_WRITE_ATTRIBUTES | EFileAccess.FILE_WRITE_DATA),
            0,
            IntPtr.Zero,
            FileMode.Create,
            (FileAttributes)fileAttributes,
            IntPtr.Zero);
    }
}
The enums I used can be found here. This allowed me to do everything with only opening the file once: create the file, apply all attributes, set the timestamps and copy the actual content from the original file.
FileInfo targetFile;
int fileAttributes;
IDictionary<string, long> timeStamps;

using (var hFile = Win32ApiWrapper.CreateFileGetHandle(targetFile.FullName, fileAttributes))
using (var targetStream = new FileStream(hFile, FileAccess.Write))
{
    // copy file
    Win32ApiWrapper.SetFileTime(hFile, timeStamps);
}
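The SetFileTime wrapper used above isn't shown in the question. A minimal sketch of what it could look like inside Win32ApiWrapper, assuming the dictionary holds FILETIME ticks (e.g. from DateTime.ToFileTime()) under the hypothetical keys "created", "lastaccess" and "lastwrite":
[DllImport("kernel32.dll", SetLastError = true)]
[return: MarshalAs(UnmanagedType.Bool)]
private static extern bool SetFileTime(SafeFileHandle hFile,
    ref long lpCreationTime,
    ref long lpLastAccessTime,
    ref long lpLastWriteTime);

public static void SetFileTime(SafeFileHandle hFile, IDictionary<string, long> timeStamps)
{
    // Hypothetical key names; the question doesn't show how the dictionary is populated.
    long created = timeStamps["created"];
    long lastAccess = timeStamps["lastaccess"];
    long lastWrite = timeStamps["lastwrite"];

    if (!SetFileTime(hFile, ref created, ref lastAccess, ref lastWrite))
        throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
}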
Was it worth the effort? YES. It reduced computation time by ~40% from 86s to 51s.
Profiler results before and after optimization (screenshots omitted).
I'm not a C# programmer and I don't know how those System.IO.FileSystemInfo methods are implemented. But I've run a few tests with the Win32 API function SetFileTime(..), which will be called by C# at some point.
Here is the code snippet of my benchmark loop:
#define NO_OF_ITERATIONS 100000

int iteration;
DWORD tStart;
SYSTEMTIME tSys;
FILETIME tFile;
HANDLE hFile;
DWORD tElapsed;

iteration = NO_OF_ITERATIONS;
GetLocalTime(&tSys);
tStart = GetTickCount();
while (iteration)
{
    tSys.wYear++;
    if (tSys.wYear > 2020)
    {
        tSys.wYear = 2000;
    }
    SystemTimeToFileTime(&tSys, &tFile);

    hFile = CreateFile("test.dat",
                       GENERIC_WRITE,   // FILE_WRITE_ATTRIBUTES
                       0,
                       NULL,
                       OPEN_EXISTING,
                       FILE_ATTRIBUTE_NORMAL,
                       NULL);
    if (hFile == INVALID_HANDLE_VALUE)
    {
        printf("CreateFile(..) failed (error: %d)\n", GetLastError());
        break;
    }
    SetFileTime(hFile, &tFile, &tFile, &tFile);
    CloseHandle(hFile);

    iteration--;
}
tElapsed = GetTickCount() - tStart;
I've seen that the expensive part of setting the file times is the opening/closing of the file. About 60% of the time is used to open the file and about 40% to close it (which needs to flush the modifications to disk). The above loop took about 9s for 10000 iterations.
A little research showed that calling CreateFile(..) with FILE_WRITE_ATTRIBUTES (instead of GENERIC_WRITE) is sufficient to change the time attributes of a file.
This modification speeds things up significantly! Now the same loop finishes within 2s for 10000 iterations. Since the number of iterations is quite small, I made a second run with 100000 iterations to get a more reliable time measurement:
FILE_WRITE_ATTRIBUTES: 5 runs with 100000 iterations: 12.7-13.2s
GENERIC_WRITE: 5 runs with 100000 iterations: 63.2-72.5s
Based on the above numbers, my guess is that the C# methods use the wrong access mode when opening the file to change the file times. Or some other C# behavior slows things down...
So maybe a solution to your speed issue is to implement a DLL which exports a C function that changes the file times using SetFileTime(..)? Or maybe you can even import the functions CreateFile(..), SetFileTime(..) and CloseHandle(..) directly to avoid calling the C# methods?
Good luck!
I'm coding an anti-cheat module for a little project of my own. It's coded in C#, but it wouldn't matter if the solution is in C++ (as long as it works).
The problem is that my program runs in the background, so you can't see it. For that I used this code:
[DllImport("kernel32.dll")]
static extern IntPtr GetConsoleWindow();
[DllImport("user32.dll")]
static extern bool ShowWindow(IntPtr hWnd, int nCmdShow);
const int SW_HIDE = 0;
const int SW_SHOW = 5;
var handle = GetConsoleWindow();
ShowWindow(handle, SW_HIDE);
But what if the user detects this program and simply closes it? All I need is an event that fires when a user closes it via Task Manager (or something similar), so that the game can be closed and no cheats can be applied to the game.
Do you have any ideas? Or just something that matches my requirements, so that the anti-cheat works?
As I said, if it's in C++, it doesn't matter.
I need to know when the user switches to the logon screen (as triggered by ctrl-alt-del) in order to circumvent a pesky bug in WPF. I want to work around this bug by reinitializing my GUI after returning from the logon screen. Currently it works, but I have to trigger it manually.
I have found SystemEvents.SessionSwitch, but unfortunately this is only triggered when logging off.
How can I detect when the logon screen is displayed by pressing Ctrl+Alt+Del?
The tricky thing is that this is not a session change, but rather just a desktop change. In particular, Ctrl+Alt+Del switches to a secured desktop associated with Winlogon.
I don't think you're really supposed to detect this kind of thing (that is, after all, the whole point of having a "secure desktop"), but you could probably do it using an Active Accessibility hook. Call the SetWinEventHook function to install an event hook for the EVENT_SYSTEM_DESKTOPSWITCH event and see what notifications you receive.
To get it going, you'll need to do the following:
Ensure that you're pumping a message loop on your client thread in order to receive event notifications. This shouldn't be a problem for a standard WPF application.
Make sure that you specify the WINEVENT_OUTOFCONTEXT flag, considering that you're working from managed code. You don't want the system to attempt to inject your DLL that contains the callback function into every process. Instead, this will cause the callback function to be called asynchronously from a queue; much safer from the land of managed code.
A little bit of P/Invoke magic. To get you started…
const uint WINEVENT_OUTOFCONTEXT = 0x0;
const uint EVENT_SYSTEM_DESKTOPSWITCH = 0x0020;

[DllImport("user32.dll")]
static extern IntPtr SetWinEventHook(uint eventMin,
    uint eventMax,
    IntPtr hmodWinEventProc,
    WinEventDelegate lpfnWinEventProc,
    uint idProcess,
    uint idThread,
    uint dwFlags);

delegate void WinEventDelegate(IntPtr hWinEventHook,
    uint eventType,
    IntPtr hwnd,
    int idObject,
    int idChild,
    uint dwEventThread,
    uint dwmsEventTime);

[DllImport("user32.dll")]
static extern bool UnhookWinEvent(IntPtr hWinEventHook);
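A rough usage sketch on top of those declarations; the InstallDesktopSwitchHook/OnDesktopSwitch names are mine, and note that the delegate must be stored in a field so the garbage collector doesn't collect it while the hook is installed:
// Keep the delegate in a field so the GC cannot collect it while the hook is installed.
static WinEventDelegate _desktopSwitchCallback;
static IntPtr _hook;

static void InstallDesktopSwitchHook()
{
    _desktopSwitchCallback = OnDesktopSwitch;
    _hook = SetWinEventHook(EVENT_SYSTEM_DESKTOPSWITCH,
                            EVENT_SYSTEM_DESKTOPSWITCH,
                            IntPtr.Zero,              // no DLL to inject (out-of-context)
                            _desktopSwitchCallback,
                            0,                        // all processes
                            0,                        // all threads
                            WINEVENT_OUTOFCONTEXT);
}

static void OnDesktopSwitch(IntPtr hWinEventHook, uint eventType, IntPtr hwnd,
                            int idObject, int idChild, uint dwEventThread, uint dwmsEventTime)
{
    // Fires on desktop switches, both to the secure desktop and back again;
    // reinitialize the GUI here once the switch back is detected.
}

static void RemoveDesktopSwitchHook()
{
    UnhookWinEvent(_hook);
}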
The process which gets started to show the logon screen seems to be called LogonUI.exe.
Using the Windows Management Instrumentation (WMI) infrastructure you can listen for processes which start and shut down. You will need to reference the System.Management assembly.
var interval = new TimeSpan( 0, 0, 1 );
const string isWin32Process = "TargetInstance isa \"Win32_Process\"";
// Listen for started processes.
WqlEventQuery startQuery
= new WqlEventQuery( "__InstanceCreationEvent", interval, isWin32Process );
_startWatcher = new ManagementEventWatcher( startQuery );
_startWatcher.Start();
_startWatcher.EventArrived += OnStartEventArrived;
// Listen for closed processes.
WqlEventQuery stopQuery
= new WqlEventQuery( "__InstanceDeletionEvent", interval, isWin32Process );
_stopWatcher = new ManagementEventWatcher( stopQuery );
_stopWatcher.Start();
_stopWatcher.EventArrived += OnStopEventArrived;
Handling these events, you can get information about the started or closed process. This way you can verify when LogonUI.exe was shut down, and subsequently trigger the required actions.
void OnStopEventArrived( object sender, EventArrivedEventArgs e )
{
var o = (ManagementBaseObject)e.NewEvent[ "TargetInstance" ];
string name = (string)o[ "Name" ];
...
}
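For example, to react only when LogonUI.exe goes away (i.e. the secure desktop has been dismissed), the stop handler could filter on the process name; Name is a standard property of Win32_Process:
void OnStopEventArrived( object sender, EventArrivedEventArgs e )
{
    var o = (ManagementBaseObject)e.NewEvent[ "TargetInstance" ];
    string name = (string)o[ "Name" ];

    // LogonUI.exe hosts the logon/secure desktop UI; when it exits,
    // the user has returned to their own desktop.
    if ( name.Equals( "LogonUI.exe", StringComparison.OrdinalIgnoreCase ) )
    {
        // Trigger the GUI reinitialization here.
    }
}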
First off I would like to say that I am not trying to hack a game. I am actually employed by the company whose process I am trying to inject. :)
I would like to know how to call a function from an already injected DLL.
So, I have successfully injected and loaded my DLL in the target using CreateRemoteThread(). Below you can see a snippet of the injection:
private static bool Inject(Process pToBeInjected, string sDllPath, out string sError, out IntPtr hwnd, out IntPtr hLibModule)
{
    sError = string.Empty; // out parameter must be assigned
    IntPtr zeroPtr = (IntPtr)0;
    hLibModule = zeroPtr;

    IntPtr hProcess = NativeUtils.OpenProcess(
        (0x2 | 0x8 | 0x10 | 0x20 | 0x400), // create thread, query info, operation, write, and read
        1,
        (uint)pToBeInjected.Id);
    hwnd = hProcess;

    IntPtr loadLibH = NativeUtils.GetProcAddress(NativeUtils.GetModuleHandle("kernel32.dll"), "LoadLibraryA");

    IntPtr dllAddress = NativeUtils.VirtualAllocEx(
        hProcess,
        (IntPtr)null,
        (IntPtr)sDllPath.Length, // 520 bytes should be enough
        (uint)NativeUtils.AllocationType.Commit |
        (uint)NativeUtils.AllocationType.Reserve,
        (uint)NativeUtils.MemoryProtection.ExecuteReadWrite);

    byte[] bytes = CalcBytes(sDllPath);

    IntPtr ipTmp = IntPtr.Zero;
    NativeUtils.WriteProcessMemory(
        hProcess,
        dllAddress,
        bytes,
        (uint)bytes.Length,
        out ipTmp);

    IntPtr hThread = NativeUtils.CreateRemoteThread(
        hProcess,
        (IntPtr)null,
        (IntPtr)0,
        loadLibH,   // handle to LoadLibrary function
        dllAddress, // address of the DLL path in the remote process
        0,
        (IntPtr)null);

    uint retV = NativeUtils.WaitForSingleObject(hThread, NativeUtils.INFINITE_WAIT);
    bool exitR = NativeUtils.GetExitCodeThread(hThread, out hLibModule);

    return true;
}
Note: Error checking and freeing resources were removed for brevity, but rest assured I check all the pointers and free my resources.
After the function above exits, I have a non-zero module handle to my DLL returned by LoadLibrary through hLibModule, meaning that the DLL was loaded correctly.
My DLL is a C# class library meant to show a message box (for testing). I have tried testing the function and the message box pops up. It looks like this:
public class Class1
{
    public static void ThreadFunc(IntPtr param)
    {
        IntPtr libPtr = LoadLibrary("user32.dll");
        MessageBox(IntPtr.Zero, "I'm ALIVE!!!!", "InjectedDll", 0);
    }

    [DllImport("kernel32", SetLastError = true)]
    public static extern IntPtr LoadLibrary(string lpFileName);

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern int MessageBox(IntPtr hWnd, String text, String caption, int options);
}
I compile it from Visual Studio and the DLL appears in the Debug folder. I then pass the full path of my DLL to the injector.
After injection into the target process, I don't know how to call my ThreadFunc from the injected DLL, so it never executes.
I cannot use GetProcAddress(hLibModule, "ThreadFunc") since I am out of process, so the answer must lie in calling CreateRemoteThread() somehow. Also, I have read that DllMain is no longer allowed for .NET DLLs, so I cannot get any free execution that way either.
Does anyone have any idea how to call a function from an injected DLL?
Thank you in advance.
Well, you already got a thread running inside that process. You make it do something boring: it only loads a DLL. This works completely by accident; LoadLibrary just happens to have the correct function signature.
It can do much more. That, however, had better be unmanaged code. Just as with LoadLibrary(), you cannot count on any managed code running properly. That takes a heck of a lot more work: you have to load and initialize the CLR and tell it to load and execute the assembly you want to run. And no, you cannot load the CLR in DllMain().
Keywords to look for are CorBindToRuntimeEx() and ICLRRuntimeHost::ExecuteInAppDomain(). This is gritty stuff to get going, but I've seen it done. COM and C++ skills and generous helpings of luck required.
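For what it's worth, if you end up going through ICLRRuntimeHost::ExecuteInDefaultAppDomain (a close relative of the ExecuteInAppDomain call mentioned above), the managed side must expose a method with one very specific shape; the class and method names below are only placeholders:
public class Bootstrap
{
    // ExecuteInDefaultAppDomain can only invoke a public static method
    // that takes a single string argument and returns an int.
    public static int EntryPoint(string argument)
    {
        // Kick off the real work of the injected assembly here.
        return 0;
    }
}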