Something I've been banging my head against: is there a way to detect when inertial scrolling begins on a ScrollViewer in WinRT's XAML model in a touch input scenario, no matter how roundabout it is?
I've tried listening to the PointerPressed, PointerReleased, and PointerMoved events, both on UI elements and on the CoreWindow object itself, and none of them fire after the ScrollViewer starts moving - so it seems the ScrollViewer is eating the pointer events before the application window even receives them (what fun DirectInput is). And of course overlaying something on top of the ScrollViewer that catches events and then reports them as unhandled simply doesn't work (I'm guessing a side effect of DirectInput again). So is there a way to do this without writing my own panel?
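For what it's worth, this is roughly the hookup I tried on the CoreWindow (a minimal sketch; the handlers just log the pointer position):

using System.Diagnostics;
using Windows.UI.Core;

// Somewhere in the app's initialization:
CoreWindow window = CoreWindow.GetForCurrentThread();
window.PointerPressed += (s, e) => Debug.WriteLine("Pressed: " + e.CurrentPoint.Position);
window.PointerMoved += (s, e) => Debug.WriteLine("Moved: " + e.CurrentPoint.Position);
window.PointerReleased += (s, e) => Debug.WriteLine("Released: " + e.CurrentPoint.Position);
// PointerMoved stops firing the moment the ScrollViewer takes over the pan.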
(I've been tasked with writing a FlipView-like control where the items scale / zoom up as they become the active item, and it looks horrible to only trigger that on the SelectionChanged event.)
Related
I am working with a WPF application that will be used on Windows tablets. The issue I am having is that I cannot scroll through a large multi-line TextBox on a tablet by pressing and dragging the content. However, it still scrolls on a desktop with a mouse wheel.
This question (Enable swipe scrolling on Textbox control in WPF Scrollviewer) seems to address the same problem I am having, but I need to do it programmatically. This is what I am doing to set the panning mode of the TextBox:
txtLongText.SetValue(ScrollViewer.PanningModeProperty, PanningMode.None);
Which I can tell is working, because click & drag text selection is now disabled - but the content still does not scroll. I am also setting the panning mode of the outer ScrollViewer like so:
popupScrollView.PanningMode = PanningMode.Both;
The popupScrollView object is then being set as the content inside a Popup.
The only thing I can think of is that there is somewhere higher up where I need to be setting the panning mode. Any help would be appreciated. Thanks.
I had the same problem with touch devices, and I have a workaround for this kind of issue: you have to handle the touch events manually.

When the UIElement_OnTouchDown(object sender, TouchEventArgs e) event occurs, you can record the touched position via e.GetTouchPoint(this).Position.Y. After that, you can determine whether a scroll happened by watching how that position changes.

Here is my sample gist; I use this approach for the same issue with touch devices.
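A minimal sketch of the idea (the ScrollViewer name myScrollViewer and the 3-pixel threshold are my own assumptions; wire the handlers up via the element's TouchDown and TouchMove events):

using System;
using System.Windows.Input;

private double _touchStartY;

private void UIElement_OnTouchDown(object sender, TouchEventArgs e)
{
    // Remember where the finger went down.
    _touchStartY = e.GetTouchPoint(this).Position.Y;
}

private void UIElement_OnTouchMove(object sender, TouchEventArgs e)
{
    double currentY = e.GetTouchPoint(this).Position.Y;
    double delta = _touchStartY - currentY;

    // If the finger has moved more than a few pixels, treat it as a scroll
    // and pan the ScrollViewer manually.
    if (Math.Abs(delta) > 3)
    {
        myScrollViewer.ScrollToVerticalOffset(myScrollViewer.VerticalOffset + delta);
        _touchStartY = currentY;
    }
}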
I think you need to use three properties to achieve this:
ScrollViewer.PanningMode
ScrollViewer.PanningDeceleration
ScrollViewer.PanningRatio
By default, PanningMode is set to None, but setting it to another value will enable touch scrolling.
Another thing you can try is setting the ScrollViewer's CanContentScroll property to true.
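For example, on the ScrollViewer from the question (the values here are just illustrative; both panning properties have sensible defaults):

// Enable touch panning in both directions.
popupScrollView.PanningMode = PanningMode.Both;
// How quickly inertial panning decelerates once the finger lifts.
popupScrollView.PanningDeceleration = 0.001;
// How far the content moves relative to the finger (1.0 = one-to-one).
popupScrollView.PanningRatio = 1.0;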
While I'm not sure there is a viable way to solve that using WPF only, I recommend trying to implement an HTML UI inside your WPF application using DotNetBrowser.
Then you can use the textarea control in HTML, which by default lets you scroll on mobile.
Hope this answer helped you.
For C# WPF, there is an event handler, MouseWheel, for scrolling with the mouse wheel. Is there any similar event handler for scrolling using touch/panning?
If you are using a ScrollViewer and you have touch hardware (which causes manipulation events to occur), then you can use the PanningMode property.
https://msdn.microsoft.com/en-us/library/system.windows.controls.scrollviewer.panningmode(v=vs.110).aspx
If you don't have touch hardware, and you want the mouse to be used for panning, then you can add some additional behaviour to a ScrollViewer to support that.
Can I scroll an ItemsControl by dragging row items instead of the scrollbar
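A bare-bones sketch of such a behaviour (vertical-only; the handler names are mine - attach them to the ScrollViewer's PreviewMouseLeftButtonDown, PreviewMouseMove and PreviewMouseLeftButtonUp events):

using System.Windows;
using System.Windows.Controls;
using System.Windows.Input;

private Point _dragStart;
private double _startOffset;

private void Scroller_PreviewMouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    var scroller = (ScrollViewer)sender;
    _dragStart = e.GetPosition(scroller);
    _startOffset = scroller.VerticalOffset;
    scroller.CaptureMouse();
}

private void Scroller_PreviewMouseMove(object sender, MouseEventArgs e)
{
    var scroller = (ScrollViewer)sender;
    if (!scroller.IsMouseCaptured) return;

    // Dragging the content down should scroll the view up, hence the inversion.
    double delta = _dragStart.Y - e.GetPosition(scroller).Y;
    scroller.ScrollToVerticalOffset(_startOffset + delta);
}

private void Scroller_PreviewMouseLeftButtonUp(object sender, MouseButtonEventArgs e)
{
    ((ScrollViewer)sender).ReleaseMouseCapture();
}

Note this simplified version also captures clicks meant for the scrollbar itself; a real implementation would filter on the event's original source.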
If you want a normal mouse to be able to cause "manipulation" events to occur (i.e. simulate a touch device), then you can use the MultiTouch SDK to provide a new device that will map mouse events to touch ones - something you don't normally get.
https://blogs.msdn.microsoft.com/ansont/2010/01/30/custom-touch-devices/
How do I Emulate touch events without using touch screen
WPF: Is there a possibility to "route" ordinary mouse events to touch events in Windows 7
If you want to handle manipulation events (via real touch hardware, or simulated ones via the MultiTouch SDK) and have PanningMode work at the same time on the ScrollViewer, use this:
Manipulation event and panning mode
I have a WPF program that has a grid with two columns. The first column has buttons and the second has a WindowsFormsHost element that embeds an ActiveX component. One button hides the WindowsFormsHost element and shows a SurfaceListBox at the same location on screen in the second column. If I have touched the WindowsFormsHost element just before pressing this button, it takes approximately 8 seconds from the last touch until the SurfaceListBox becomes responsive to touch gestures.
The thread is probably not blocked, because I can use the buttons in the other column and use the ListBox with the mouse.
The ListBox remains unresponsive to touch events forever if I touch it within the 8-second waiting time. So it seems that somehow the ListBox does not get the touch events.
If I programmatically create another ListBox, it does not work either for 8 seconds if it is placed in the same area on screen where the WindowsFormsHost was.
I noticed there is a CaptureTouch() method on UIElement, but I cannot get hold of a TouchDevice that I could pass to it as a parameter. I have set IsManipulationEnabled="true" on every UIElement, and no touch event is ever fired.
I have also desperately tried UpdateLayout() etc., with no luck.
So I think the touch gestures are somehow routed wrong, and the routing is implicitly fixed after the waiting time. Is there a way I could make the touch gestures work in the ListBox immediately?
The problem disappeared when I removed "focus tracking for launching on-screen keyboard" from my program.
So in case somebody else struggles with the same problem:
http://www.infragistics.com/community/blogs/blagunas/archive/2013/12/17/showing-the-windows-8-touch-keyboard-in-wpf.aspx and SurfaceListBoxes aren't meant for each other.
I have only had brief contact with C# and WPF, although I could find most of what I needed on the Internet. However, I cannot find anything (or don't know what to ask Google to find it) about blocking events from being sent to the parent.
I've got an Image inside a ScrollViewer. My goal is to add a zoom option for the image using Ctrl + mouse wheel, but obviously the scrollbars of the ScrollViewer move while I turn the mouse wheel (the mouse wheel handler is defined on the Image). Is there any way to block the event from reaching the parent when Ctrl is down?
In your handler for the Image, you should set the event's Handled property to true when Ctrl is pressed. This will prevent the ScrollViewer from handling the mouse wheel event.
See http://msdn.microsoft.com/en-us/library/ms742806.aspx for more information, especially the section entitled "The Concept of Handled."
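Something along these lines (a sketch; the zoom logic itself is left out):

using System.Windows.Input;

private void Image_MouseWheel(object sender, MouseWheelEventArgs e)
{
    if ((Keyboard.Modifiers & ModifierKeys.Control) == ModifierKeys.Control)
    {
        // ...apply the zoom based on e.Delta here...

        // Mark the event as handled so it stops bubbling and the
        // ScrollViewer never gets a chance to scroll.
        e.Handled = true;
    }
}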
I have some straight WPF 3.5 controls handling left mouse clicks that I need to use within a Surface app (SDK 1.0). The problem I am facing is that they do not work by default. I am thinking of wrapping each control in a SurfaceContentControl and translating ContactTouchDown or ContactTapGesture events to corresponding MouseDown events.
The problem boils down to: how do I "inject" or simulate arbitrary routed mouse events? I have tried InputManager.Current.ProcessInput() but didn't get very far. Any help is appreciated.
Try using the AutomationPeer classes. For example, ButtonAutomationPeer is for Button. The code below initiates a click.
// Wrap the target button in its automation peer.
ButtonAutomationPeer peer = new ButtonAutomationPeer(button);
// Ask the peer for its Invoke pattern and trigger the click.
IInvokeProvider provider = (IInvokeProvider)peer.GetPattern(PatternInterface.Invoke);
provider.Invoke();
evpo's idea is an interesting one (though if you're working with custom controls, they rarely come with AutomationPeer classes).
You can't simply 'inject' mouse input by sending WM_MOUSE* messages to your app... WPF would indeed process the message, but when it goes to figure out the position of the mouse for that event, it will query the actual mouse API instead of trusting what you stuck in the message.
So really all you can do is tell Windows to move the actual mouse cursor and act as though the button is being clicked/released. Some code you can use for that is at http://www.codeproject.com/KB/system/globalmousekeyboardlib.aspx
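The core mechanics are just a couple of user32 calls; a minimal sketch (my own, not from that library; screen coordinates in pixels, left button only):

using System;
using System.Runtime.InteropServices;

static class MouseSimulator
{
    [DllImport("user32.dll")]
    static extern bool SetCursorPos(int x, int y);

    [DllImport("user32.dll")]
    static extern void mouse_event(uint dwFlags, uint dx, uint dy, uint dwData, UIntPtr dwExtraInfo);

    const uint MOUSEEVENTF_LEFTDOWN = 0x0002;
    const uint MOUSEEVENTF_LEFTUP = 0x0004;

    // Move the real cursor to a screen position and click there.
    public static void ClickAt(int screenX, int screenY)
    {
        SetCursorPos(screenX, screenY);
        mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, UIntPtr.Zero);
        mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, UIntPtr.Zero);
    }
}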
That said, while you can technically do this, it sucks... you've got an expensive multitouch device but you are 1) showing a mouse cursor on it, 2) limiting arbitrary parts of it to being used 'single touch' (and only one of those arbitrary parts at a time), and 3) coming up with an arbitrary method of determining which finger you will treat as the mouse-controlling one.