I'm experiencing a strange bug (?) with my UWP app.
I have a page with multiple textboxes for user input, each with its InputScope set to Number, which opens the keyboard in tablet mode as expected. However, if you tap on the next box, the keyboard closes and a second tap is needed to reopen it. This also happens if the user hits Tab to switch boxes.
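For context, the input scope on each of those boxes is set along these lines (a minimal sketch in code-behind; it is usually set in XAML instead, and AmountBox is just a placeholder name):
// using Windows.UI.Xaml.Input;
var numberScope = new InputScope();
numberScope.Names.Add(new InputScopeName(InputScopeNameValue.Number));
AmountBox.InputScope = numberScope;   // touch keyboard opens with the numeric layout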
I presume this has something to do with Focus() firing before the previous textbox has raised its loss-of-focus event, but I'm unsure how to override the behavior.
How can I prevent the on-screen keyboard from closing while still making sure the correct InputScope is maintained?
Edit: Upon further investigation, the issue seems to be almost random. Sometimes you can move between boxes and the keyboard stays open; other times it closes every time.
This issue, along with some others, was resolved in Windows 10 version 1803, released on 30 April 2018.
You can try playing with InputPane, which is the container for the on-screen keyboard. For example, add a small delay when the pane is about to hide, then detect an attempt to show the pane again (and cancel the pending hide).
InputPane myPane = InputPane.GetForCurrentView();
myPane.Showing += MyShowingHandler;   // subscribe the handler; don't invoke it with ()
myPane.Hiding += MyHidingHandler;
At the very least, you can read the size of the screen region the pane occludes when it is showing, so you can put a placeholder (a margin or a dummy grid) in your UI to stop it jumping up and down when the keyboard shows and hides.
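A minimal sketch of that placeholder idea, assuming a hypothetical element named KeyboardSpacer in the page layout:
// using Windows.UI.ViewManagement;
private void HookInputPane()
{
    var pane = InputPane.GetForCurrentView();
    pane.Showing += (s, args) =>
    {
        // Reserve space equal to the keyboard so the layout doesn't jump.
        KeyboardSpacer.Height = args.OccludedRect.Height;
        args.EnsuredFocusedElementInView = true;   // we handle scrolling ourselves
    };
    pane.Hiding += (s, args) =>
    {
        KeyboardSpacer.Height = 0;
    };
}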
I tried your case on my Windows Phone but I cannot reproduce your experience:
When I tap another TextBox, the keyboard stays open and I can continue typing.
The keyboard hides as expected when I tap somewhere in the whitespace.
Do you have a hardware keyboard connected to your device?
I'm looking for a way to force a WinForms control which already has focus to be (re)announced by the screen reader.
The situation is that I have a manually painted area which the user can navigate around using the arrow keys, but since it's all one control, the focus never changes, so the screen reader only announces the control on the initial focus.
I have tried Focus()ing (and Select()ing) a second control and then immediately Focus()ing back, but this doesn't trigger the screen reader. The only way I've managed to get it working is to have the second control wait 100 ms and then refocus back from a background thread, but then the second control is announced to the user, and there's a minefield of edge cases and race conditions.
It looks like UIA LiveRegions is the solution for this for .NET 4.7+, but I'm stuck on 4.6.
Any ideas how I can force the screen reader to re-announce the control under focus?
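For illustration, the delayed-refocus workaround described above looks roughly like this (a sketch only, with hypothetical control names; as noted, it announces the wrong control and is race-prone):
// canvasControl is the manually painted area, dummyControl any other focusable control.
// (uses System.Threading.Tasks and System.Windows.Forms)
dummyControl.Focus();
Task.Run(async () =>
{
    await Task.Delay(100);
    // Marshal back to the UI thread before touching the control.
    canvasControl.Invoke((MethodInvoker)(() => canvasControl.Focus()));
});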
I'm developing a UWP app and I came across a problem.
On my input page, when I tap a textbox, the keyboard just "pushes" the UI upwards and this happens.
Normal screen:
Look at the top of the page and you can see the problem:
Does anyone know a fix for this? Thanks!
I think this is the default behavior of the keyboard and can't be changed.
If the keyboard did not push the content upwards, it would overlap the actual TextBox you want to fill with text.
If your problem is the status bar being overlapped by the page content, you could try changing the color of the status bar; it seems to be transparent at the moment.
See this.
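For example, something along these lines should make the status bar opaque on phone (a hedged sketch; StatusBar only exists when the Windows Mobile Extensions are referenced, so the type check guards the call):
// using Windows.Foundation.Metadata;
// using Windows.UI;
// using Windows.UI.ViewManagement;
if (ApiInformation.IsTypePresent("Windows.UI.ViewManagement.StatusBar"))
{
    var statusBar = StatusBar.GetForCurrentView();
    statusBar.BackgroundColor = Colors.Black;   // pick a color matching your page header
    statusBar.BackgroundOpacity = 1;            // 0 would keep it transparent
    statusBar.ForegroundColor = Colors.White;
}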
I am working on a screen-time feature, and I would like to block the screen after the user's allowed time is over. My screen should stay on top once the allowed time expires.
In Microsoft Family Safety's screen time, even if the user presses the Windows key, the Start menu appears behind Microsoft's blocking screen. I don't know how to do the same. They haven't blocked any inputs; whatever the user does happens behind the blocking screen.
In my app the user cannot switch to another screen. I set Window.Topmost = true in both the OnLostFocus and Deactivated event handlers.
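A minimal sketch of that wiring in the blocking window (BlockWindow is a hypothetical name; the Activate() call reflects my assumption about pulling focus back):
// using System.Windows;
public partial class BlockWindow : Window
{
    public BlockWindow()
    {
        InitializeComponent();
        Topmost = true;
        // Re-assert topmost and grab focus back whenever another window activates.
        Deactivated += (s, e) => { Topmost = true; Activate(); };
        LostFocus   += (s, e) => Topmost = true;
    }
}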
Everything works fine except for the Start menu case.
What actually happens when the Start menu appears, and which event is raised? How can I override it?
I have a WPF program that has a grid with two columns. The first one has buttons and the second one has a WindowsFormsHost element that embeds an ActiveX component. One button hides the WindowsFormsHost element and shows a SurfaceListBox at the same location on screen in the second column. If I have touched the WindowsFormsHost element just before pressing this button, it takes approximately 8 seconds from the last touch until the SurfaceListBox becomes responsive to touch gestures.
The thread is probably not blocked, because I can use the buttons in the other column, and use the ListBox with the mouse.
The ListBox remains unresponsive to touch events forever if I touch it within that 8-second waiting time. So it seems that somehow the ListBox does not get the touch events.
If I programmatically create another ListBox, it does not work either for 8 seconds, if it is placed in the same area on screen where the WindowsFormsHost was.
I noticed there is a CaptureTouch() method on UIElement, but I cannot get hold of the TouchDevice that I would pass as its parameter. I have set IsManipulationEnabled="true" on every UIElement and no touch event is fired.
I have also, out of desperation, tried UpdateLayout() etc., with no luck.
So I think the touch gestures are somehow routed wrongly and after the waiting time the routing is implicitly fixed, but is there a way I could make the touch gestures work in the ListBox immediately?
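For reference, the usual way to get hold of a TouchDevice is from the event args of a touch event, for example in a TouchDown handler (a sketch only; as described above, no touch events fire in this scenario, so this alone does not help here):
// using System.Windows;
// using System.Windows.Input;
private void ListBox_TouchDown(object sender, TouchEventArgs e)
{
    // The event args carry the device that CaptureTouch() expects.
    ((UIElement)sender).CaptureTouch(e.TouchDevice);
}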
The problem disappeared when I removed the "focus tracking for launching the on-screen keyboard" from my program.
So, in case anybody else struggles with the same problem:
the approach from http://www.infragistics.com/community/blogs/blagunas/archive/2013/12/17/showing-the-windows-8-touch-keyboard-in-wpf.aspx and SurfaceListBoxes aren't meant for each other.
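For context, the removed "focus tracking" is the pattern from that post, roughly along these lines (a hedged sketch; the TabTip.exe path shown is the standard install location, not copied from my code):
// Launch the Windows touch keyboard whenever a text box gains keyboard focus.
private void TextBox_GotKeyboardFocus(object sender, System.Windows.Input.KeyboardFocusChangedEventArgs e)
{
    System.Diagnostics.Process.Start(
        @"C:\Program Files\Common Files\Microsoft Shared\ink\TabTip.exe");
}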
I'm maintaining an application that shows a ToolTip under certain conditions on a UserControl. (When the mouse is over a certain area a timer starts; when it elapses, and the mouse is still there, the ToolTip displays a text by calling Show(..).)
This works fine.
A different application hosts this app as an MdiClient. The ToolTip now only shows when the application is not active: if the user opens a different application on the computer, Word for example, and then returns to mine without clicking on it, holding the mouse over the right region, then the text is displayed. Otherwise, although Show is called, the Popup event is never raised.
Does anyone have an idea about how to solve this?
Thanks, Tali.
If I've understood the problem correctly, I think what you need to do is track when the window (which will show the tooltip) loses and gains focus: when it loses focus, disable showing tooltips, then re-enable them when it regains focus.
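Something along these lines in the form that shows the tip, assuming hypothetical names (toolTip is the existing ToolTip instance and ShowHoverTip is where the timer currently calls Show):
// using System.Drawing;
// using System.Windows.Forms;
public partial class ChildForm : Form
{
    private bool allowToolTips = true;

    public ChildForm()
    {
        InitializeComponent();
        Activated  += (s, e) => allowToolTips = true;    // form regained focus
        Deactivate += (s, e) => allowToolTips = false;   // form lost focus
    }

    private void ShowHoverTip(string text, Point location)
    {
        if (allowToolTips)
            toolTip.Show(text, this, location);
    }
}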