Hooking the 3rd X-Mouse button? - C#

I wrote a low-level mouse hook in C# which should capture XBUTTON events. For the 1st and 2nd X-button it works just fine, but there is no message at all for the 3rd X-button on my mouse. Is there really no way to capture events for that button?
I have a gaming mouse, and between the first two X-buttons there is a third one. When I click it, nothing happens, so I wanted to write a custom C# mouse-hook app to program a custom behaviour for that button...

That's correct. The third X-button is handled by your mouse drivers, not by Windows itself. Windows doesn't have built-in knowledge of or support for more than two X-buttons. Those additional buttons wouldn't do anything at all without special drivers installed.
You need to find out how to communicate with your mouse driver software. That's the only way to get notifications when those buttons are clicked.
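For illustration, here is roughly what such a hook looks like and where the two X-buttons Windows does know about show up. This is a minimal sketch: the constants come from the Win32 headers, error handling and unhooking are omitted, and the installing thread needs a running message loop (e.g. Application.Run).

```csharp
// Minimal WH_MOUSE_LL hook that reports the two X-buttons Windows supports.
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

static class XButtonHook
{
    const int WH_MOUSE_LL = 14;
    const int WM_XBUTTONDOWN = 0x020B;
    const int XBUTTON1 = 1;              // HIWORD of mouseData; XBUTTON2 is 2

    [StructLayout(LayoutKind.Sequential)]
    struct MSLLHOOKSTRUCT
    {
        public int x, y;
        public uint mouseData;           // HIWORD holds XBUTTON1 or XBUTTON2
        public uint flags, time;
        public IntPtr dwExtraInfo;
    }

    delegate IntPtr LowLevelMouseProc(int nCode, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll", SetLastError = true)]
    static extern IntPtr SetWindowsHookEx(int idHook, LowLevelMouseProc lpfn, IntPtr hMod, uint dwThreadId);
    [DllImport("user32.dll")]
    static extern IntPtr CallNextHookEx(IntPtr hhk, int nCode, IntPtr wParam, IntPtr lParam);
    [DllImport("kernel32.dll")]
    static extern IntPtr GetModuleHandle(string lpModuleName);

    static readonly LowLevelMouseProc _proc = HookCallback;  // keep delegate alive

    public static void Install() =>
        SetWindowsHookEx(WH_MOUSE_LL, _proc, GetModuleHandle(null), 0);

    static IntPtr HookCallback(int nCode, IntPtr wParam, IntPtr lParam)
    {
        if (nCode >= 0 && (int)wParam == WM_XBUTTONDOWN)
        {
            var info = Marshal.PtrToStructure<MSLLHOOKSTRUCT>(lParam);
            int button = (int)(info.mouseData >> 16);
            Debug.WriteLine(button == XBUTTON1 ? "XBUTTON1" : "XBUTTON2");
            // A third physical button never arrives here: the vendor driver
            // consumes it before Windows ever generates a mouse message.
        }
        return CallNextHookEx(IntPtr.Zero, nCode, wParam, lParam);
    }
}
```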

Related

Create a WPF Window that captures AND passes through Mouse Events

I'm working on a screen capture utility that captures active windows. I'm using transparent overlays to capture the full screen and then overlay the active windows based on mouse move events passed through to the underlying desktop/windows.
Both of the overlay windows currently use the WS_EX_TRANSPARENT style to allow mouse events to pass through to the underlying windows so I can detect where the mouse cursor is located. I grab the window handle and rect size to outline the window and then use Global Mouse and Keyboard Hooks to accept or reject a capture.
It's pretty ugly and spread out code (which is why I'm not posting here for now) but it all works very well and I can highlight the windows in mousemove and capture clicks with the global mouse and key handlers.
It all works except for this problem:
The global Windows hooks do not fire over an admin window, so when I want to capture a PowerShell, Command Prompt or Visual Studio (in admin mode) window, no hook events are forwarded.
Apparently there's no way to work around this security issue using Windows hooks (or GetAsyncKeyState() for that matter).
I've tried a couple of different approaches to work around this issue:
Instead of using Hooks I tried using the highlight window to capture mouse/key events
This sort of works, but it's clumsy - it fails if no window is selected at all (no way to get out) and doesn't allow selecting contained windows once the parent is selected (i.e. no drill-down)
I also tried Win32 GetAsyncKeyState(), which captures the last mouse or keyboard input. That would work, but it too fails to report mouse or key interactions from admin windows.
So I have two imperfect solutions at the moment: use hooks or GetAsyncKeyState to get the proper window browsing selection behavior for everything but admin windows, or capture all windows but lose the ability to drill into child windows after a parent window is selected.
I'm at the end of my rope and the real question is this:
Is there some way to create a semi-transparent or transparent window that can intercept mouse clicks and pass them on to the window area below?
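One partial technique worth sketching (it addresses only the intercept-then-pass-through part, not the admin/UIPI problem): toggle WS_EX_TRANSPARENT on the overlay at runtime, so the window can take a click itself and become click-through again afterwards. A minimal sketch, assuming a WPF window that has already been shown:

```csharp
// Toggle click-through on a WPF window at runtime via the extended style bits.
using System;
using System.Runtime.InteropServices;
using System.Windows;
using System.Windows.Interop;

static class ClickThrough
{
    const int GWL_EXSTYLE = -20;
    const int WS_EX_TRANSPARENT = 0x00000020;

    [DllImport("user32.dll")]
    static extern int GetWindowLong(IntPtr hWnd, int nIndex);
    [DllImport("user32.dll")]
    static extern int SetWindowLong(IntPtr hWnd, int nIndex, int dwNewLong);

    // enabled == true: mouse input falls through to whatever is underneath.
    // enabled == false: the overlay receives the input itself.
    public static void SetClickThrough(Window window, bool enabled)
    {
        IntPtr hwnd = new WindowInteropHelper(window).Handle;
        int style = GetWindowLong(hwnd, GWL_EXSTYLE);
        SetWindowLong(hwnd, GWL_EXSTYLE,
            enabled ? style | WS_EX_TRANSPARENT : style & ~WS_EX_TRANSPARENT);
    }
}
```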

UI Automation in C# without using mouse possible

So, I am trying to use UI Automation (specifically the White library) to simulate multiple automations at the same time. I would not like my mouse to be taken over. Is there any way to do this? Basically, I want an instance of my UI Automation to use a virtual mouse specifically for that program, instead of my main mouse.
Not really clear on what you mean. The White library uses UI Automation elements to interact with desktop applications. Once you capture the element, you can send any command to it you like, such as a click, and it won't actually use your existing mouse to click on it. It sounds like you are automating your mouse instead of automating the application.
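To make that concrete: with the raw UI Automation API (which White wraps), an InvokePattern call raises the control's default action directly, without synthesizing any mouse input, so the physical cursor never moves. A minimal sketch - the window title and AutomationId below are placeholders for whatever application you are targeting:

```csharp
// Click a button via UI Automation's InvokePattern; the real cursor stays put.
// References: UIAutomationClient and UIAutomationTypes.
using System.Windows.Automation;

class InvokeWithoutMouse
{
    static void Main()
    {
        // Find a top-level window by title (placeholder: "Calculator").
        AutomationElement window = AutomationElement.RootElement.FindFirst(
            TreeScope.Children,
            new PropertyCondition(AutomationElement.NameProperty, "Calculator"));

        // Find a button inside it by AutomationId (placeholder: "num7Button").
        AutomationElement button = window.FindFirst(
            TreeScope.Descendants,
            new PropertyCondition(AutomationElement.AutomationIdProperty, "num7Button"));

        // Invoke() fires the control's default action without any mouse input.
        var invoke = (InvokePattern)button.GetCurrentPattern(InvokePattern.Pattern);
        invoke.Invoke();
    }
}
```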

Metro-style Appbar in fullscreen WPF program

I am currently working on a desktop C# WPF application where the goal is to make it look and feel like a "real" Windows Store App.
I want to add an appbar that should be shown when the user swipes up from the bottom. To do this in a normal app you just position your finger outside the screen area, and swipe up.
But if I do that in a fullscreen WPF program I don't receive any TouchDown or TouchMove events - probably because the finger is already down when entering the actual screen area.
I have tried the Manipulation framework as well, but with the same result. Even when I hook directly into the message queue using WndProc or other hooks, I get no events at all.
The funny thing is that I can see the "touch cursor" move around the screen, so at least something in the underlying framework is notified.
Does anyone have an idea how to do this?
p.s. It is not an option for me just to use a windows store app instead, because of hardware connectivity issues ;-)
You will need to keep track of the cursor coordinates and see when the cursor (the swipe) starts at the edge of the screen and moves in. When that triggers (with whatever trigger you want - distance covered, most likely) you can fire up your Appbar.
There was a similar question asked on MSDN:
https://social.msdn.microsoft.com/Forums/vstudio/en-US/d85dcde7-839a-44d3-9f2a-8b47b947576c/swipe-gesture-and-page-change?forum=wpf
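A minimal sketch of that polling idea - the edge margin, swipe distance and poll interval are arbitrary starting values, and a real implementation would also time out stale candidates:

```csharp
// Poll the cursor position (which does update even when no touch events are
// delivered to the window) and fire when a contact that appeared at the
// bottom edge of the primary screen travels far enough upward.
using System;
using System.Runtime.InteropServices;
using System.Windows;
using System.Windows.Threading;

class EdgeSwipeWatcher
{
    [StructLayout(LayoutKind.Sequential)]
    struct POINT { public int X, Y; }

    [DllImport("user32.dll")]
    static extern bool GetCursorPos(out POINT pt);

    int? _startY;                          // set while a candidate swipe is live
    public event Action SwipeUpFromBottom;

    public void Start()
    {
        // Note: SystemParameters reports DIPs while GetCursorPos reports
        // physical pixels; on high-DPI displays you may need to scale.
        double screenBottom = SystemParameters.PrimaryScreenHeight;
        var timer = new DispatcherTimer { Interval = TimeSpan.FromMilliseconds(16) };
        timer.Tick += (s, e) =>
        {
            GetCursorPos(out POINT pt);
            if (_startY == null)
            {
                if (pt.Y >= screenBottom - 5)
                    _startY = pt.Y;        // contact appeared at the bottom edge
            }
            else if (_startY.Value - pt.Y > 40)
            {
                _startY = null;
                SwipeUpFromBottom?.Invoke();   // moved 40+ px upward: swipe
            }
            else if (pt.Y > _startY.Value + 5)
            {
                _startY = null;            // moved back down: abandon candidate
            }
        };
        timer.Start();
    }
}
```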

C# Suppressing mouse click for multiple mice

I have written an application that currently handles clicks from multiple mouse devices.
I used this project and modified it to handle mouse clicks as opposed to keyboard input.
This is working fine, however now I need to know if there is a way to suppress a click event if it has been handled by my app. The app is a quiz game so the idea is that the quiz master will have (and still be able to use) 1 mouse, and the other contestants will have their own mouse (as buzzers). So when they buzz in, I don't want the mouse click events to fire in the operating system (or at least this application).
The concept is the familiar override of void WndProc(ref Message message), and so I have tried not calling base.WndProc(ref Message) when I don't want the click events to fire, but this has not worked.
Can anybody point me in the right direction here?
Should I be going down the Windows hook route? I have looked at this, but I can't seem to work out how I could hook into each mouse device individually.
Any help would be greatly appreciated.
Edit:
This is a Windows Form UI project, and not WPF. So the MultiPoint SDK from Microsoft won't work.
The solution lies not in WndProc but in PreFilterMessage(). By intercepting messages before they even reach the form, you can remove them from the message pump so that they never reach the control that was clicked. This also works for child controls within the form.
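A minimal sketch of such a filter is below. Note that PreFilterMessage alone cannot tell which mouse a click came from; the SuppressNextClick flag here is a hypothetical stand-in for whatever your Raw Input (WM_INPUT) handling decides:

```csharp
// Swallow left-button messages before any form or child control sees them.
// Installed once at startup: Application.AddMessageFilter(new ClickSuppressor());
using System.Windows.Forms;

class ClickSuppressor : IMessageFilter
{
    const int WM_LBUTTONDOWN = 0x0201;
    const int WM_LBUTTONUP   = 0x0202;

    // Hypothetical flag: set it from your Raw Input handler when the click
    // originated from a contestant's (buzzer) mouse rather than the host's.
    public bool SuppressNextClick { get; set; }

    public bool PreFilterMessage(ref Message m)
    {
        if (SuppressNextClick &&
            (m.Msg == WM_LBUTTONDOWN || m.Msg == WM_LBUTTONUP))
        {
            return true;    // removed from the pump: never reaches the control
        }
        return false;       // everything else flows through normally
    }
}
```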
I answered this and posted the full source in the following question:
C# Get Mouse handle (GetRawInputDeviceInfo)

Does NotifyIcon have a MouseDown equivalent?

I have a NotifyIcon in the system tray. How can I detect when the user has left-clicked down on it? I assumed the MouseDown event would be what I want to use but it only handles right click and middle-button click. For left-click, it only fires after the user has let go (as in they've just performed a normal click). Is there a way to get the MouseDown event only?
This is by design, the shell synthesizes the MouseDown message from the up event. You'll see why it works this way when you click and hold the button down and then start dragging. Note how the notification area overflow window pops up and lets you drag the icon into it to remove it from the visible area. It can't work both ways.
Technically you could hook the window owned by Explorer.exe to get a crack at the messages before Explorer does with SetWindowsHookEx(). That however requires a kind of DLL that you cannot write in C#, it needs to be injected into Explorer. Very destabilizing and hard to beat the competition that is trying to do the same thing. Also the kind of code that causes sleepless nights for the Microsoft appcompat team.
It appears that the underlying Win32 API Shell_NotifyIcon sends a WM_LBUTTONDOWN message when the user clicks the icon. According to MSDN anyway.
Examining the Windows Forms source code for NotifyIcon reveals standard mouse down event handling, so if the Win32 message was being sent at the "correct" time it would work as you want/expect.
I have to agree with a previous comment that NotifyIcon will be swallowing WM_LBUTTONDOWN since it needs to do mouse capture to allow the user to drag the icons about.
It's possible that this article about creating a tray icon for WPF will be useful since it shows how to use SetWindowsHookEx etc from C#.
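If you want to observe the timing for yourself, a small probe like the following makes the asymmetry visible (SystemIcons.Application is just a stand-in icon): right and middle clicks log MouseDown on the actual press, while the left button only logs it after release.

```csharp
// Log the timing of NotifyIcon mouse events to see when MouseDown arrives.
using System;
using System.Drawing;
using System.Windows.Forms;

static class TrayProbe
{
    [STAThread]
    static void Main()
    {
        var icon = new NotifyIcon
        {
            Icon = SystemIcons.Application,   // stand-in icon
            Visible = true,
            Text = "MouseDown timing probe"
        };
        icon.MouseDown += (s, e) =>
            Console.WriteLine($"{DateTime.Now:HH:mm:ss.fff} MouseDown {e.Button}");
        icon.MouseUp += (s, e) =>
            Console.WriteLine($"{DateTime.Now:HH:mm:ss.fff} MouseUp   {e.Button}");

        Application.Run();   // message loop so the icon receives its callbacks
    }
}
```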
