In a research project I need to obtain the touch events while the user is typing on a soft keyboard. Basically, I am implementing the soft keyboard. So I was wondering: is there a way to set a touch listener on the view of the existing soft keyboard and get both the touch events and the text?
The soft keyboard is a completely separate app. You cannot attach a touch handler to it or get any touch input from it. The only way to do what you want is to write the keyboard yourself (or take an open-source one and instrument it).
I want to create a WPF window that will be displayed on a touch-sensitive screen, and the window contains 80 buttons. I want my WPF app to recognize taps on specific buttons. Do I need a specific API for that, or how will taps on the touch-sensitive screen be transmitted to my WPF app?
Any suggestions?
Thanks
Windows 7 and later can receive input from multiple touch-sensitive devices, and WPF applications can handle touch input just like other input (such as the mouse or keyboard) by raising events when a touch occurs.

WPF exposes two types of events when a touch occurs: touch events and manipulation events. Touch events provide raw data about each finger on a touchscreen and its movement. Manipulation events interpret the input as certain actions, such as translation, scaling, or rotation.
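As a rough illustration of the difference (the element name and handler names here are invented for the example), a TouchDown handler sees each individual finger contact, while enabling manipulation and handling ManipulationDelta gives you already-interpreted translate/scale/rotate data:

    <!-- XAML: a rectangle that reports both raw touches and manipulations -->
    <Rectangle x:Name="box" Width="200" Height="200" Fill="SteelBlue"
               IsManipulationEnabled="True"
               TouchDown="Box_TouchDown"
               ManipulationDelta="Box_ManipulationDelta" />

    // Code-behind (needs using System.Diagnostics;)
    private void Box_TouchDown(object sender, TouchEventArgs e)
    {
        // Raw data: one event per finger, with its position
        TouchPoint p = e.GetTouchPoint(this);
        Debug.WriteLine("Finger " + e.TouchDevice.Id + " down at " + p.Position);
    }

    private void Box_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
    {
        // Interpreted data: translation, scale and rotation since the last event
        Debug.WriteLine("Moved by " + e.DeltaManipulation.Translation +
                        ", scaled by " + e.DeltaManipulation.Scale);
    }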
WPF enables applications to respond to touch. For example, you can interact with an application by using one or more fingers on a touch-sensitive device such as a touchscreen. The MSDN walkthrough linked below creates an application that enables the user to move, resize, or rotate a single object by using touch.
Source (MSDN): https://msdn.microsoft.com/en-us/library/ee649090.aspx

Also read this CodeProject article: http://www.codeproject.com/Articles/692286/WPF-and-multi-touch
WPF applications work on a touch screen without any modification. Of course you can add support for gestures such as pinch-to-zoom, but tapping on buttons works out of the box. To a WPF application it doesn't matter whether the user taps with a finger or clicks with the mouse.
Every Windows hook is set for a particular window or is global, and if I'm not wrong, even a textbox is a window. So, is it possible to set a low-level keyboard hook on a specific textbox?

My goal is to capture the KeyDown event on my textbox, but I found that with the basic approach I'm not able to capture the PrintScreen key, so I'm trying to do it another way.
Thanks
PrintScreen is a key that triggers a system function, e.g. copying the screen to the clipboard. The key needs to work no matter which UI control has keyboard focus and is receiving the rest of the keystrokes, e.g. your text box. The way to capture this key is with a low-level keyboard hook; see this answer. I believe the code works in both WinForms and WPF.
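For reference, here is a minimal sketch of such a hook in C#; treat it as an outline rather than the linked answer's exact code. The PrintScreenPressed event and the PrintScreen-only filter are my own additions, and since WH_KEYBOARD_LL is global you still have to route the result to your textbox yourself:

    using System;
    using System.Diagnostics;
    using System.Runtime.InteropServices;

    public class PrintScreenHook : IDisposable
    {
        private const int WH_KEYBOARD_LL = 13;
        private const int WM_KEYDOWN = 0x0100;
        private const int VK_SNAPSHOT = 0x2C; // the PrintScreen key

        private delegate IntPtr LowLevelKeyboardProc(int nCode, IntPtr wParam, IntPtr lParam);

        [DllImport("user32.dll", SetLastError = true)]
        private static extern IntPtr SetWindowsHookEx(int idHook, LowLevelKeyboardProc lpfn,
                                                      IntPtr hMod, uint dwThreadId);

        [DllImport("user32.dll", SetLastError = true)]
        private static extern bool UnhookWindowsHookEx(IntPtr hhk);

        [DllImport("user32.dll")]
        private static extern IntPtr CallNextHookEx(IntPtr hhk, int nCode, IntPtr wParam, IntPtr lParam);

        [DllImport("kernel32.dll")]
        private static extern IntPtr GetModuleHandle(string lpModuleName);

        private readonly LowLevelKeyboardProc _proc; // kept in a field so the delegate is not GC'd
        private readonly IntPtr _hookId;

        public event Action PrintScreenPressed;

        public PrintScreenHook()
        {
            _proc = HookCallback;
            using (Process process = Process.GetCurrentProcess())
            using (ProcessModule module = process.MainModule)
            {
                _hookId = SetWindowsHookEx(WH_KEYBOARD_LL, _proc,
                                           GetModuleHandle(module.ModuleName), 0);
            }
        }

        private IntPtr HookCallback(int nCode, IntPtr wParam, IntPtr lParam)
        {
            if (nCode >= 0 && wParam == (IntPtr)WM_KEYDOWN)
            {
                // The first field of KBDLLHOOKSTRUCT is the virtual-key code
                int vkCode = Marshal.ReadInt32(lParam);
                if (vkCode == VK_SNAPSHOT && PrintScreenPressed != null)
                    PrintScreenPressed();
            }
            return CallNextHookEx(_hookId, nCode, wParam, lParam);
        }

        public void Dispose()
        {
            UnhookWindowsHookEx(_hookId);
        }
    }

Construct it once (e.g. in your window's constructor), subscribe to PrintScreenPressed, and call Dispose when the window closes.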
My WPF project requires a touch screen, but I don't have one at the moment, so I have used mouse events (MouseDown, MouseMove, MouseUp, etc.) throughout the project. I'm worried that when the project is finished and run on a touch screen, the mouse events won't work. So my questions are:

1. Do touch screens raise mouse events? Do I need to change all the mouse events in my project to touch events? (Actually, I hope I can use both at the same time, i.e. mouse + finger.)

2. If they don't, is there an easy way to handle the problem?

My project uses .NET 4.0. Thank you.
WPF mouse events already work with touch, so you don't need to write two sets of handlers.

When you tap a button, the tap is recognized as a mouse click, so don't worry about that :)

You will only have to implement the "slide" touch events yourself, not the click ones; see the sketch below.
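If it helps, here is a minimal sketch of that split on .NET 4.0 (myCanvas and the handler bodies are placeholders). Keep your existing mouse handlers for clicks and mouse drags, and add touch handlers only for the slide gesture; marking a touch event as handled stops WPF from also promoting it to a mouse event:

    // Existing mouse handlers keep working: a tap is promoted to
    // MouseDown / MouseUp / Click automatically by WPF.
    myCanvas.MouseMove += (s, e) =>
    {
        Point mousePos = e.GetPosition(myCanvas);
        // ... existing drag logic ...
    };

    // Add touch handlers only where you need the slide gesture itself.
    myCanvas.TouchMove += (s, e) =>
    {
        Point touchPos = e.GetTouchPoint(myCanvas).Position;
        // ... slide logic ...
        e.Handled = true; // prevents this touch from also being promoted to mouse
    };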
When the user selects text anywhere, in any application, I want to capture the selected text. I want to capture it automatically, without using Ctrl + C. Can I do that?
You may be able to use a global Windows hook such as WH_MOUSE_LL to capture the mouse events.

A possible solution would be to capture the mouse-up event, WM_LBUTTONUP, through the global hook and then trigger a copy to the clipboard (such as programmatically sending Ctrl+C).
This link gives an example of hooking into global Windows events. This specific one is for keyboard events, but it should be similar for mouse events:
Using global keyboard hook (WH_KEYBOARD_LL) in WPF / C#
This link contains suggestions for triggering OS-level copies to the clipboard:
Trigger OS to copy (ctrl+c or Ctrl-x) programmatically
This is neither an elegant nor a complete solution, since it will try to copy after every mouse click regardless of whether any text is highlighted, but hopefully it can serve as a starting point; a sketch of the copy step follows below.
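To make the copy step concrete, here is a rough sketch using keybd_event and then reading the clipboard (the class and method names are invented for the example, and the Sleep is a crude way of waiting for the target application to populate the clipboard):

    using System;
    using System.Runtime.InteropServices;
    using System.Windows; // Clipboard

    public static class CopyTrigger
    {
        private const byte VK_CONTROL = 0x11;
        private const byte VK_C = 0x43;
        private const uint KEYEVENTF_KEYUP = 0x0002;

        [DllImport("user32.dll")]
        private static extern void keybd_event(byte bVk, byte bScan, uint dwFlags, UIntPtr dwExtraInfo);

        // Call this from the WM_LBUTTONUP branch of the mouse-hook callback.
        public static string CopySelection()
        {
            keybd_event(VK_CONTROL, 0, 0, UIntPtr.Zero);               // Ctrl down
            keybd_event(VK_C, 0, 0, UIntPtr.Zero);                     // C down
            keybd_event(VK_C, 0, KEYEVENTF_KEYUP, UIntPtr.Zero);       // C up
            keybd_event(VK_CONTROL, 0, KEYEVENTF_KEYUP, UIntPtr.Zero); // Ctrl up

            System.Threading.Thread.Sleep(100); // let the target app fill the clipboard
            return Clipboard.ContainsText() ? Clipboard.GetText() : string.Empty;
        }
    }

Note that Clipboard requires an STA thread; the WPF UI thread already is one.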
Is it possible to display/hide the On-Screen keyboard manually from code?
No. It was a design decision (documented here) to give the end user control of the keyboard being invoked. Therefore, the end user has to touch a text box (or the like) to invoke the virtual on-screen keyboard.
From that link:
"The invocation model of the touch keyboard is designed to put the
user in control of the keyboard. Users indicate to the system that
they want to input text by tapping on an input control instead of
having an application make that decision on their behalf. This reduces
to zero the scenarios where the keyboard is invoked unexpectedly,
which can be a painful source of UI churn because the keyboard can
consume up to 50% of the screen and mar the application's user
experience. To enable user-driven invocation, we track the coordinates
of the last touch event and compare them to the location of the
bounding rectangle of the element that currently has focus. If the
point is contained within the bounding rectangle, the touch keyboard
is invoked.
This means that applications cannot programmatically invoke the touch
keyboard via manipulation of focus."