How to convert or implement touch drag behavior - C#

I implemented MouseDragElementBehavior in my application, but I am now running the application on a touch panel.
Before, I obviously dragged with the mouse; now that I am using a touch panel, the MouseDragElementBehavior no longer works.
Is there a way to convert this? The only change is that I am now using a touch panel; the application itself has not changed at all.
Everything else a mouse can do also works by touch, but dragging is not supported.
Please help. Thanks

There currently isn't any official drag support for touch. You can, however, build your own touch drag by responding to and combining the touch events (a minimal sketch follows the list):
PreviewTouchDown - start your own drag logic for the element and attach the TouchMove handler here
TouchMove - move the visual of the dragged object
PreviewTouchUp - stop dragging and detach the TouchMove handler here
TouchEnter - check whether the element you entered accepts drops
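Here is the minimal sketch referred to above: a plain touch drag in WPF code-behind. The element is assumed to sit on a Canvas whose x:Name is canvas; that name and the handler names are placeholders, not from the original post.

using System.Windows;
using System.Windows.Controls;
using System.Windows.Input;

public partial class MainWindow : Window
{
    private Point _touchStart;
    private Point _elementStart;

    private void Element_PreviewTouchDown(object sender, TouchEventArgs e)
    {
        var element = (UIElement)sender;
        _touchStart = e.GetTouchPoint(canvas).Position;
        _elementStart = new Point(Canvas.GetLeft(element), Canvas.GetTop(element));
        element.CaptureTouch(e.TouchDevice); // keep receiving TouchMove for this finger
        e.Handled = true;
    }

    private void Element_TouchMove(object sender, TouchEventArgs e)
    {
        var element = (UIElement)sender;
        Point current = e.GetTouchPoint(canvas).Position;
        Canvas.SetLeft(element, _elementStart.X + (current.X - _touchStart.X));
        Canvas.SetTop(element, _elementStart.Y + (current.Y - _touchStart.Y));
        e.Handled = true;
    }

    private void Element_PreviewTouchUp(object sender, TouchEventArgs e)
    {
        ((UIElement)sender).ReleaseTouchCapture(e.TouchDevice);
        // Hit-test the element under the finger here (TouchEnter or
        // VisualTreeHelper.HitTest) to decide whether the drop is accepted.
        e.Handled = true;
    }
}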
Or you can, of course, search for libraries that already implement this kind of behavior.
I googled a bit and found a good walkthrough for Windows applications.
And with the Touch class you can get all the touch points in the application (multiple fingers) and implement your own behavior.

Related

Mouse/Touch events in Xamarin Forms

Is there a way to get mouse and/or touch events on UI elements using Xamarin Forms? So far I have only found the TapGestureRecognizer class, but I want the user to be able to move UI elements (PanGestureRecognizer?) and I haven't been able to find anything that helps me achieve that.
I am looking for a cross-platform solution if possible (hence the use of Xamarin Forms), but I am OK with creating a platform-specific component that integrates with Xamarin Forms (e.g., I already created a custom base page class to add an iOS-specific background gradient page).
I suggest you give MR Gestures a look. It's a Xamarin.Forms component that adds very robust support for gestures and works on all Xamarin.Forms platforms. From the website:
MR.Gestures adds Down, Up, Tapping, Tapped, DoubleTapped, LongPressing, LongPressed, Panning, Panned, Swiped, Pinching, Pinched, Rotating and Rotated events to each and every layout, cell and view and to the ContentPage. These events are raised when the user performs the corresponding touch gesture on the element.
It is not free, but at €10 it's a bargain. The documentation is great and the library works exactly as advertised.
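If a paid component is not an option, the built-in PanGestureRecognizer that the question already mentions is enough to let the user move an element around. A minimal sketch, assuming a BoxView inside an AbsoluteLayout (the names are illustrative, not from the question):

using Xamarin.Forms;

public class DraggablePage : ContentPage
{
    double _startX, _startY;

    public DraggablePage()
    {
        var box = new BoxView { Color = Color.Blue, WidthRequest = 100, HeightRequest = 100 };

        var pan = new PanGestureRecognizer();
        pan.PanUpdated += (s, e) =>
        {
            switch (e.StatusType)
            {
                case GestureStatus.Started:
                    // remember where the element was when the pan began
                    _startX = box.TranslationX;
                    _startY = box.TranslationY;
                    break;
                case GestureStatus.Running:
                    // e.TotalX/TotalY are the offsets since the pan started
                    box.TranslationX = _startX + e.TotalX;
                    box.TranslationY = _startY + e.TotalY;
                    break;
            }
        };
        box.GestureRecognizers.Add(pan);

        Content = new AbsoluteLayout { Children = { box } };
    }
}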

Faking A Keyboard or Mouse Event

I've been trying to figure out how to fake, not simulate, keyboard and mouse input. By this I mean the system goes through the process as if the actual event occurred, such as a mouse click, but does not actually perform the event.
Or, say, you wanted to make the system think your mouse had moved even though it did not; sort of a "virtual" move that doesn't actually happen or affect the mouse.
Is it possible to override the simulated mouse clicks and events so that they don't actually click, while the system still thinks they have?
Here is a nice project that wraps the keyboard and mouse. Here is the mouse input simulator file for reference. To see the lower level work, navigate to the WindowsInput.Native namespace in that project.
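Assuming the linked project is the WindowsInput / InputSimulator wrapper, a minimal sketch of injecting input through it looks roughly like this (a sketch of that library's usage, not the exact code behind the links above):

using WindowsInput;
using WindowsInput.Native;

class Program
{
    static void Main()
    {
        var sim = new InputSimulator();

        // Nudge the pointer by a few pixels (relative move).
        sim.Mouse.MoveMouseBy(5, 0);

        // Inject a left click and a key press as if the user performed them.
        sim.Mouse.LeftButtonClick();
        sim.Keyboard.KeyPress(VirtualKeyCode.RETURN);
    }
}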
Thanks guys for all of your help. I was finally able to achieve what I wanted via lrb's answer.
I used that library to fake input. In the grand scheme of things I was trying to make a mouse jiggler that didn't actually affect the user's mouse, in case the application was running while the user was using the mouse, which is why I wanted to "fake" the mouse rather than move the actual mouse. Thanks again for everything, this was amazing.
Icemanind, I'm still curious about your idea of subscribing to an event rather than having an event handler. That would allow me to induce something like a mouse click without actually clicking, correct?

WPF - Making DragDrop and ManipulationEnabled co-exist

I have a somewhat unusual design problem in a WPF touch application.
I have a UserControl that holds an image, which I allow the user to freely move, resize, and rotate around the touch surface using multitouch by setting:
IsManipulationEnabled = true
and then hooking up events for ManipulationStarting, ManipulationDelta, and ManipulationCompleted.
This is all good and works perfectly, but now I would like to add the ability for a user to drag this control into the WrapPanel on another control, which has a list of image files, and add this image to the list.
I tried using DragDrop events by calling DragDrop.DoDragDrop() on the ManipulationDelta event, but it locks the UI and the control until a drop occurs, which is not what I want.
Is there any way to properly do this without writing my own hit-testing code? I'm using WPF 4.0 and .NET 4.5 on VS 2013, and I'm not sure if the Surface SDK would help me in this case (nor could I properly install/load it to VS2013)
Found my solution: use VisualTreeHelper.HitTest, calling it in the ManipulationDelta event handler and applying your own logic to handle the drag-over operation, then run the hit test again in the ManipulationCompleted event handler to complete the drop.
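A rough sketch of that approach, assuming the drop target is a WrapPanel referenced by a hypothetical field named dropPanel and the handlers live in the window's code-behind:

private bool _isOverDropPanel;

private void Image_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    // ...apply e.DeltaManipulation to the control's render transform as before...

    // e.ManipulationOrigin is relative to e.ManipulationContainer; convert it to
    // window coordinates and hit-test to see what is currently underneath.
    var container = (UIElement)e.ManipulationContainer;
    Point p = container.TranslatePoint(e.ManipulationOrigin, this);
    HitTestResult hit = VisualTreeHelper.HitTest(this, p);
    // Note: the dragged control itself can be the first hit; temporarily set its
    // IsHitTestVisible to false during the drag, or filter it out here.
    _isOverDropPanel = hit != null && IsInside(hit.VisualHit, dropPanel);
}

private void Image_ManipulationCompleted(object sender, ManipulationCompletedEventArgs e)
{
    if (_isOverDropPanel)
    {
        // add the dragged image to the WrapPanel's list here
    }
}

// Walk up the visual tree to check whether "child" is inside "ancestor".
private static bool IsInside(DependencyObject child, DependencyObject ancestor)
{
    while (child != null)
    {
        if (child == ancestor) return true;
        child = VisualTreeHelper.GetParent(child);
    }
    return false;
}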

Can I use WPF mouse events in place of touch events?

My WPF project requires a touch screen, but I don't have a touch screen at the moment, so I have been using mouse events (MouseDown, MouseMove, MouseUp, etc.) throughout the project. I'm worried that when the project is finished and running on a touch screen, the mouse events won't work. So my questions are:
1. Do mouse events cover touch input? Do I need to change all the mouse events in my project to touch events? (Actually I would very much like to be able to use both at the same time, i.e. mouse + finger.)
2. If that isn't supported, is there an easy way to handle the problem?
My project uses .NET 4.0. Thank you.
WPF mouse events already handle touch, so you don't need to write two sets of handlers.
When you touch a button it is recognized as a mouse click, so don't worry about that :)
You will only have to implement the "slide" touch gestures yourself, not the click ones.
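In other words, taps come through your existing mouse/click handlers automatically, while a slide or drag needs its own touch handling. A small sketch of that split (handler names are illustrative):

// Clicks need nothing special: a finger tap is promoted to a mouse click,
// so the same handler serves mouse and touch.
private void MyButton_Click(object sender, RoutedEventArgs e)
{
    // runs for both a mouse click and a touch tap
}

// A slide/drag is better handled with touch events directly.
private void Element_TouchMove(object sender, TouchEventArgs e)
{
    Point p = e.GetTouchPoint((UIElement)sender).Position;
    // move or scroll the element based on p
    e.Handled = true; // handled touch events are not promoted to mouse events
}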

Metro-style Appbar in fullscreen WPF program

I am currently working on a desktop C# WPF application where the goal is to make it look and feel like a "real" Windows Store App.
I want to add an appbar that should be shown when the user swipes up from the bottom. To do this in a normal app you just position your finger outside the screen area, and swipe up.
But if I do that in a fullscreen WPF program I don't receive any TouchDown or TouchMove events - probably because the finger is already down when entering the actual screen area.
I have tried with the Manipulation framework also, but same result here. Even when I hook directly into the message queue using WndProc or other hooks I get no events at all.
The funny thing is that I can see the "touch cursor" move around the screen, so at least something in the underlying framework is notified.
Does anyone have an idea how to do this?
p.s. It is not an option for me just to use a windows store app instead, because of hardware connectivity issues ;-)
You will need to keep track of the cursor location coordinates and detect when the cursor (the swipe) starts at the edge of the screen and moves inwards. When that condition triggers (with whatever trigger you want, most likely distance covered) you can show your appbar.
There was a similar question asked on MSDN:
https://social.msdn.microsoft.com/Forums/vstudio/en-US/d85dcde7-839a-44d3-9f2a-8b47b947576c/swipe-gesture-and-page-change?forum=wpf
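A rough sketch of that idea, polling the pointer position with GetCursorPos on a DispatcherTimer; the thresholds, timings, and the ShowAppBar method are placeholders, not from the original answer.

using System;
using System.Runtime.InteropServices;
using System.Windows;
using System.Windows.Threading;

public partial class MainWindow : Window
{
    [StructLayout(LayoutKind.Sequential)]
    private struct POINT { public int X; public int Y; }

    [DllImport("user32.dll")]
    private static extern bool GetCursorPos(out POINT pt);

    private bool _swipeArmed;
    private int _startY;

    public MainWindow()
    {
        InitializeComponent();

        var timer = new DispatcherTimer { Interval = TimeSpan.FromMilliseconds(30) };
        timer.Tick += (s, e) =>
        {
            POINT pt;
            if (!GetCursorPos(out pt)) return;

            // Note: GetCursorPos returns physical pixels; on high-DPI screens
            // you will need to account for the display scale factor.
            int screenBottom = (int)SystemParameters.PrimaryScreenHeight;

            // Arm when the pointer shows up within a few pixels of the bottom edge.
            if (!_swipeArmed && pt.Y >= screenBottom - 5)
            {
                _swipeArmed = true;
                _startY = pt.Y;
            }
            // Fire once it has travelled far enough upwards.
            else if (_swipeArmed && _startY - pt.Y > 80)
            {
                _swipeArmed = false;
                ShowAppBar();
            }
        };
        timer.Start();
    }

    private void ShowAppBar()
    {
        // animate/show the appbar panel here
    }
}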
