I remember that in C# (WPF) there is a method called DragMove() that allows you to drag a window by clicking on a selected area.
Example (from this answer):
private void Window_MouseDown(object sender, MouseButtonEventArgs e)
{
    if (e.ChangedButton == MouseButton.Left)
        this.DragMove();
}
Is there any function like this in the Qt world?
I know that a solution would be to listen for mousePressEvent and mouseMoveEvent, but I just want to know if there is a native method that would let me drag the window around the screen just by calling it, like this.DragMove() in C#.
Also, it needs to be cross-platform...
No, there is no function like what you refer to in Qt. Listening for mouse events is simple, and would be considered idiomatic in Qt.
If you want the widget to be draggable by pressing anywhere between controls, just implement those event handlers on the base widget. Any area not covered by controls will be "draggable".
I would like to simulate/route a right click on a WPF "control".
To make a long story short: I have an Adorner which should react to a left click (so IsHitTestVisible must be true), but at the same time I would like it to be "transparent" to right clicks. (In other words, I would like the control under it to receive that click; by the way, a right click makes the Adorner disappear.)
I tried to raise the MouseRightButtonUp event on the control directly under the mouse (after the Adorner disappears), but it doesn't seem to work. I would like to avoid calling system functions (like mouse_event through P/Invoke). Can it even be done in WPF?
As far as I remember, I had trouble with routing events and changing the Adorner's IsHitTestVisible property. The main problem was, if I recall correctly, that the adorner and the controls are on different branches of the visual tree, so routed events raised on the adorner won't make it to your controls.
I can't say much without you providing the code, but the simplest thing that should work is to find the control under the mouse position and do something like this:
private void myAdorner_MouseRightButtonDown(object sender, MouseButtonEventArgs e)
{
    // re-create the mouse event with the same device, timestamp and button
    MouseButtonEventArgs revent = new MouseButtonEventArgs(e.MouseDevice, e.Timestamp, MouseButton.Right);
    revent.RoutedEvent = e.RoutedEvent;

    // find your control under the mouse and raise the event on it directly
    control.RaiseEvent(revent);
}
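If you need a way to locate that control, here is a rough sketch using WPF hit testing (my addition, not code from the original answer; the rootVisual parameter and the choice to walk up to the nearest Control are assumptions you may need to adapt):

private Control FindControlUnderMouse(UIElement rootVisual)
{
    // hit test the visual tree at the current mouse position
    Point position = Mouse.GetPosition(rootVisual);
    HitTestResult result = VisualTreeHelper.HitTest(rootVisual, position);
    DependencyObject current = result == null ? null : result.VisualHit;

    // the hit usually lands on a low-level visual (Border, TextBlock, ...),
    // so walk up the visual tree until we reach an actual Control
    while (current != null && !(current is Control))
        current = VisualTreeHelper.GetParent(current);

    return current as Control;
}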
I am making an application for personal use where a stylus can be used to draw on the current screen, while normal use (with the mouse) is not interrupted.
Currently I am trying to use WS_EX_TRANSPARENT to make the window let mouse events pass through, but it seems that stylus events also get passed through without being captured.
Is there any other method I can use to pass through mouse/keyboard events while still capturing stylus events?
Disable the RealTimeStylus for WPF Applications on MSDN states:
[...] (WPF) has built in support for processing Windows 7 touch input [...] Windows 7 also provides multi-touch input as Win32 WM_TOUCH window messages. These two APIs are mutually exclusive on the same HWND.
This seems to imply that, for a given window, you can receive stylus events or touch events but not both. As you do want to handle the stylus events this means you don't need to bother filtering the touch events. That just leaves the mouse and keyboard.
At first I thought you might be able to use a custom window procedure (WndProc) and filter out the mouse and keyboard messages. However, the WndProc (when used in WPF) is really just a notification mechanism and you can't block the received messages.
I found a Windows API called BlockInput that supposedly "Blocks keyboard and mouse input events from reaching applications". However from the docs this appears to be system-wide not app-specific so may not be any use to you.
The only other way I can think of is to use a low-level keyboard or mouse hook. This requires some P/Invoke but it's not too difficult. These hooks allow you to register callback functions that get called when keyboard and mouse events are raised. The advantage is that you can prevent those events from propagating and effectively "swallow" them, which sounds like what you need.
I don't really like posting an answer that basically says "do a search for ..." but the amount of code involved is non-trivial and has been posted in numerous places both on Stack Overflow and elsewhere, so: try doing a search for low level keyboard hook c# wpf and you should find some code that might help!
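To give a feel for what is involved, here is a minimal sketch of a low-level mouse hook (my own illustration, not code from those search results; MouseBlocker is a hypothetical class name, and in practice you would add your own condition rather than swallowing every mouse message system-wide):

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

public static class MouseBlocker
{
    private const int WH_MOUSE_LL = 14;

    private delegate IntPtr LowLevelMouseProc(int nCode, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll", SetLastError = true)]
    private static extern IntPtr SetWindowsHookEx(int idHook, LowLevelMouseProc lpfn, IntPtr hMod, uint dwThreadId);

    [DllImport("user32.dll", SetLastError = true)]
    [return: MarshalAs(UnmanagedType.Bool)]
    private static extern bool UnhookWindowsHookEx(IntPtr hhk);

    [DllImport("user32.dll")]
    private static extern IntPtr CallNextHookEx(IntPtr hhk, int nCode, IntPtr wParam, IntPtr lParam);

    [DllImport("kernel32.dll")]
    private static extern IntPtr GetModuleHandle(string lpModuleName);

    // keep a reference to the delegate so it is not garbage collected while the hook is installed
    private static readonly LowLevelMouseProc _proc = HookCallback;
    private static IntPtr _hookId = IntPtr.Zero;

    public static void Install()
    {
        using (var module = Process.GetCurrentProcess().MainModule)
        {
            _hookId = SetWindowsHookEx(WH_MOUSE_LL, _proc, GetModuleHandle(module.ModuleName), 0);
        }
    }

    public static void Uninstall()
    {
        if (_hookId != IntPtr.Zero)
        {
            UnhookWindowsHookEx(_hookId);
            _hookId = IntPtr.Zero;
        }
    }

    private static IntPtr HookCallback(int nCode, IntPtr wParam, IntPtr lParam)
    {
        // returning a non-zero value "swallows" the mouse message,
        // so add a condition here (e.g. only while your overlay is active)
        if (nCode >= 0 /* && your condition */)
            return (IntPtr)1;

        return CallNextHookEx(_hookId, nCode, wParam, lParam);
    }
}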
One thing you may have trouble with even if you go down this route is focus. As soon as your "invisible" topmost window gets a stylus message that it responds to, I'm presuming focus will switch to your WPF application, thus "stealing" focus from whatever application was being used prior. You might be able to use P/Invoke again to set the window style flags of your main window to prevent this (as per the accepted answer to this SO question).
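As a sketch of that focus trick, one common approach is the Win32 WS_EX_NOACTIVATE extended style, applied once the window handle exists (my own example, not the code from the linked answer; adapt it to your own window class):

// requires System.Runtime.InteropServices and System.Windows.Interop
private const int GWL_EXSTYLE = -20;
private const int WS_EX_NOACTIVATE = 0x08000000;

[DllImport("user32.dll")]
private static extern int GetWindowLong(IntPtr hWnd, int nIndex);

[DllImport("user32.dll")]
private static extern int SetWindowLong(IntPtr hWnd, int nIndex, int dwNewLong);

protected override void OnSourceInitialized(EventArgs e)
{
    base.OnSourceInitialized(e);

    // mark the overlay window as non-activating so tapping it with the stylus
    // does not steal focus from whatever application is underneath
    IntPtr hwnd = new WindowInteropHelper(this).Handle;
    SetWindowLong(hwnd, GWL_EXSTYLE, GetWindowLong(hwnd, GWL_EXSTYLE) | WS_EX_NOACTIVATE);
}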
ORIGINAL ANSWER
You can override the appropriate keyboard, mouse and touch Preview... event handler methods of the Window and mark them as handled. This has the effect of stopping child controls from receiving those events.
public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
    }

    protected override void OnPreviewKeyDown(KeyEventArgs e)
    {
        e.Handled = true;
        base.OnPreviewKeyDown(e);
    }

    protected override void OnPreviewMouseDown(MouseButtonEventArgs e)
    {
        e.Handled = true;
        base.OnPreviewMouseDown(e);
    }

    protected override void OnPreviewMouseMove(MouseEventArgs e)
    {
        e.Handled = true;
        base.OnPreviewMouseMove(e);
    }

    protected override void OnPreviewMouseWheel(MouseWheelEventArgs e)
    {
        e.Handled = true;
        base.OnPreviewMouseWheel(e);
    }

    protected override void OnPreviewTouchDown(TouchEventArgs e)
    {
        e.Handled = true;
        base.OnPreviewTouchDown(e);
    }

    protected override void OnPreviewTouchMove(TouchEventArgs e)
    {
        e.Handled = true;
        base.OnPreviewTouchMove(e);
    }
}
I've done the basic keyboard, mouse and touch events here. In a simple app test it seemed to do the trick and I assume it would still let stylus events through (I don't have a stylus I can test with).
You may have to experiment with which events need to be handled like this. I only did KeyDown for example, not KeyUp as I presume the latter is irrelevant without the former. I may also have implemented some that didn't need to be handled, and I'm not sure the calls to the base methods are needed either. As I say, experiment until you get something that works for you.
I am writing a metro app where it makes sense for focus to jump to a single text box anytime the user starts typing. But I can't figure out how to implement this functionality locally, without modifying other user controls to detect keydown and forward focus to the text box.
In WinForms I would have used the form's "KeyPreview" property, which caused any key presses within the form's controls to fire form KeyDown/KeyPress/KeyUp events. I can't find any equivalent in metro.
I tried the naive solution of just forcing focus to the text box anytime it left, but that has obvious problems (e.g. buttons flicker instead of staying highlighted when you click and hold on them).
How can I ensure any keyboard typing goes to a specific text box?
The handler needs to be attached to the current core window, which is the root that all of the controls are nested under.
Windows.UI.Xaml.Window.Current.CoreWindow.KeyDown += (sender, arg) =>
{
    // invoked anytime a key is pressed down, independent of focus
};
Here you go ...
Responding to keyboard input (Metro style apps using C#/VB/C++ and XAML)
&&
Implementing keyboard accessibility (Metro style apps using C#/VB/C++ and XAML)
Pay special attention to routed events. There are examples there too.
Hope this helps.
In your XAML, bind these two events on the page:
Loaded="pageRoot_Loaded_1"
Unloaded="pageRoot_Unloaded_1"
and inside these methods you have to bind and unbind your handlers for KeyDown or KeyPress:
private void pageRoot_Loaded_1(object sender, RoutedEventArgs e)
{
    Window.Current.CoreWindow.KeyDown += CoreWindow_KeyDown;
}

private void pageRoot_Unloaded_1(object sender, RoutedEventArgs e)
{
    Window.Current.CoreWindow.KeyDown -= CoreWindow_KeyDown;
}
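The handler itself can then simply push focus into the text box. A minimal sketch (myTextBox is a placeholder name for your control):

private void CoreWindow_KeyDown(Windows.UI.Core.CoreWindow sender, Windows.UI.Core.KeyEventArgs args)
{
    // forward any typing to the text box, no matter which control had focus
    if (myTextBox.FocusState == Windows.UI.Xaml.FocusState.Unfocused)
    {
        myTextBox.Focus(Windows.UI.Xaml.FocusState.Programmatic);
    }
}

Note that the key press that triggers the focus change may not end up as text in the box itself; whether you need to handle that first character separately depends on your scenario.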
The thing is, I have 2 user controls, let's call them A and B. They both have MouseRightButtonDown and MouseRightButtonUp events, and user control A partially overlaps B.
Now when I right-click on A, the mouse event on B does not fire. When I disable the mouse events on user control A, the mouse events on user control B fire.
But how can I get them both to fire simultaneously?
(hope I've explained it clearly)
A bit hacky, but this would work: bind the event handler only to the first control and call the other handler from it, like this:
private void textBlock1_MouseRightButtonDown(object sender, MouseButtonEventArgs e)
{
    Debug.WriteLine("textBlock1_MouseRightButtonDown");
    textBlock2_MouseRightButtonDown(sender, e);
}

private void textBlock2_MouseRightButtonDown(object sender, MouseButtonEventArgs e)
{
    Debug.WriteLine("textBlock2_MouseRightButtonDown");
}
Personally I would not do this; I would do my best to re-architect the logic so that one control does not have to call the other's handler, but without knowing more about what you are doing it is impossible to tell...
As we are talking about Silverlight here, I strongly suggest looking at how routed events work. This mechanism will actually help you avoid re-raising events, since these events traverse the visual tree of your element from top to bottom.
Definitely better than re-raising events.
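To illustrate the idea (my own sketch, not code from the answer above): you can listen on a common parent with AddHandler and handledEventsToo set to true, so the parent sees the right click even when an overlapping child has already marked it handled. LayoutRoot and the handler name are placeholders, and you should check that UIElement.MouseRightButtonDownEvent is exposed in the Silverlight/WPF version you target:

public MainPage()
{
    InitializeComponent();

    // handledEventsToo = true: this handler fires even if a child control
    // has already set e.Handled = true on the same event
    LayoutRoot.AddHandler(
        UIElement.MouseRightButtonDownEvent,
        new MouseButtonEventHandler(LayoutRoot_MouseRightButtonDown),
        true);
}

private void LayoutRoot_MouseRightButtonDown(object sender, MouseButtonEventArgs e)
{
    // shared right-click logic for both overlapping controls goes here
}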
I have an app with a custom window (transparency and no borders). I made a header with DragMove behavior on left mouse button down. This allows me to drag the window to the top so it maximizes. Now I want to write the code so that when I click the header and drag it, it restores the WindowState to Normal...
Is there a click & drag event handler, or another way?
EDIT: Platform C#, in WPF
You need to use the Window.StateChanged event.
The best way to handle maximizing and minimizing is to manipulate the WindowState property; the previous size is kept in the Window.RestoreBounds property. If you need a more sophisticated solution,
here is an example.
P.S. This is similar to a Windows 7 feature. Maybe there is no need to do it yourself? :)
Edit: UIElement has a MouseMove event:
private void Window_MouseMove(object sender, MouseEventArgs e)
{
    if (e.LeftButton == MouseButtonState.Pressed)
    {
        MainWindow1.WindowState = WindowState.Normal;
    }
}
This is a bit messy, since the event is going to fire every time you move the mouse.
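A slightly more complete sketch (my own, assuming the handler is attached to the header element and that DragMove can take over once the window is restored): it only reacts while the window is maximized, repositions the restored window so the header stays under the cursor, and then hands off to DragMove:

private void Header_MouseMove(object sender, MouseEventArgs e)
{
    if (e.LeftButton != MouseButtonState.Pressed || WindowState != WindowState.Maximized)
        return;

    // cursor position relative to the maximized window, and on the screen
    Point cursorInWindow = e.GetPosition(this);
    Point cursorOnScreen = PointToScreen(cursorInWindow);
    double horizontalRatio = cursorInWindow.X / ActualWidth;

    // restore, then place the window so the cursor is still over the header
    WindowState = WindowState.Normal;
    Left = cursorOnScreen.X - RestoreBounds.Width * horizontalRatio;
    Top = cursorOnScreen.Y - cursorInWindow.Y;

    // let the built-in behavior continue the drag
    DragMove();
}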