4-point multitouch gestures in WP7 (Silverlight) - C#

I want to use 4-point multitouch gestures in my app. The app is in Silverlight (not XNA), but the gestures won't apply to any controls; they will just check whether the user drags 4 fingers to the left or to the right of the screen.
Are there any libraries that I can use? Or what is the easiest way to implement it on my own? Can I use XNA multitouch libraries?
Cheers

As you are probably aware, the WP7 Silverlight API assumes two contact points for multitouch, i.e. PinchStarted, PinchDelta, and PinchCompleted.
Check out the TouchPanel class in the Microsoft.Xna.Framework.Input.Touch namespace.
// Determine the maximum number of touch points supported (four on WP7):
TouchPanelCapabilities tc = TouchPanel.GetCapabilities();
if (tc.IsConnected)
{
    return tc.MaximumTouchCount;
}

// To read multitouch data from the touch input device:
TouchCollection touchColl = TouchPanel.GetState();
foreach (TouchLocation t in touchColl)
{
    if ((t.State == TouchLocationState.Pressed)
        || (t.State == TouchLocationState.Moved))
    {
        // Check the coordinates of each point (and the previous
        // coordinate via TryGetPreviousLocation()).
        float xCoordinate = t.Position.X;
        float yCoordinate = t.Position.Y;

        // The State property tells you whether the touch point was
        // moved, pressed, or released.
        TouchLocationState st = t.State;
    }
}
More details can be found here: http://msdn.microsoft.com/en-us/library/ff827744.aspx
I have not seen any libraries that specifically target 4-point touch; however, if you are looking for libraries that help with multi-touch debugging, I would strongly recommend http://multitouch.codeplex.com/.
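Since neither API exposes a four-finger gesture directly, one option is to track the horizontal movement of every active touch point yourself and report a swipe once all four fingers have moved far enough in the same direction. Here is a minimal sketch of that bookkeeping; the `FourFingerSwipeDetector` class and its threshold are my own invention, not part of any WP7 API, and you would feed it the ids and X positions from `TouchPanel.GetState()` each frame:

```csharp
using System.Collections.Generic;

// Hypothetical helper (not part of any WP7 API): remembers the starting X
// position of each touch id and reports a swipe once all four fingers have
// moved past a threshold in the same direction.
public class FourFingerSwipeDetector
{
    private readonly Dictionary<int, float> startX = new Dictionary<int, float>();
    private readonly float threshold;

    public FourFingerSwipeDetector(float threshold = 100f)
    {
        this.threshold = threshold;
    }

    // Call once per frame with (id, currentX) for every active touch point,
    // e.g. built from the TouchCollection returned by TouchPanel.GetState().
    // Returns -1 for a four-finger swipe left, +1 for right, 0 otherwise.
    public int Update(IReadOnlyList<(int Id, float X)> touches)
    {
        if (touches.Count != 4)
        {
            startX.Clear(); // gesture broken or not started yet
            return 0;
        }

        int movedLeft = 0, movedRight = 0;
        foreach (var (id, x) in touches)
        {
            if (!startX.ContainsKey(id))
                startX[id] = x; // first time we see this finger

            float dx = x - startX[id];
            if (dx <= -threshold) movedLeft++;
            else if (dx >= threshold) movedRight++;
        }

        if (movedLeft == 4) { startX.Clear(); return -1; }
        if (movedRight == 4) { startX.Clear(); return +1; }
        return 0;
    }
}
```

Resetting the stored positions whenever the finger count differs from four keeps a lifted-and-replaced finger from carrying stale state into the next gesture.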

The Silverlight WP7 Toolkit is awesome for doing Gesture stuff.
Download WP7 Toolkit
Then check out this awesome tutorial

Related

Display compass - ArcGIS Runtime SDK - Xamarin.Forms

I am using the 1st release of the ArcGIS Runtime SDK for .NET - Xamarin.Forms (NuGet package here).
One of the requirements is to display a compass that indicates north. I haven't found any built-in feature for this so far. Can someone point me to how to implement this functionality?
After some research, I've implemented a custom solution:
Find a compass icon that can rotate (see this article on adding an image resource to Xamarin.Forms)
Add the image on top of the map:
<Image x:Name="NorthArrow" />
Rotate the image when the viewpoint changes:
MapView.ViewpointChanged += (sender, args) =>
{
    NorthArrow.Rotation = -MapView.MapRotation;
};
Complete solution here.

NAudio set Left and Right speakers levels through my code (Balance control)

I'm trying to control the volume of the left speaker and right speaker separately through my app, using the NAudio library. At the moment I am trying to write a Windows Forms program that changes the system's master volume depending on buttons in my app, but I am unable to understand how to control the volume. I need it to be specific to the master volume, i.e. what you reach through:
Volume mixer > Device > Speakers properties > Levels > Balance
Is there any class to do so? Regards
I'm not sure if this is what you're looking for but you could do something like this...
MMDeviceEnumerator deviceEnumerator = new MMDeviceEnumerator();
MMDeviceCollection devices = deviceEnumerator.EnumerateAudioEndPoints(DataFlow.Render, DeviceState.Active);
foreach (MMDevice device in devices)
{
    // Go through the devices you want to update and set the volume.
    device.AudioEndpointVolume.MasterVolumeLevelScalar = 0.8f;
}
According to NAudio, the maximum volume as a scalar is 1.0f, so setting it to 0.8f would essentially change your master volume to 80%.
Hope it helps. I'm still trying to figure out NAudio myself so good luck :-).
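Since the question is really about balance (left vs. right) rather than just the master level, note that NAudio's `AudioEndpointVolume` also exposes per-channel levels through its `Channels` collection. Below is a sketch; the `BalanceHelper` class is my own invention, and the assumption that channel 0 is the left speaker and channel 1 the right is something you should verify on your device:

```csharp
// Hypothetical helper: map a balance value in [-1, 1]
// (-1 = full left, +1 = full right) to per-channel scalars in [0, 1].
public static class BalanceHelper
{
    public static (float Left, float Right) BalanceToChannelLevels(
        float balance, float master = 1.0f)
    {
        // The louder side stays at the master level; the other side
        // is attenuated in proportion to the balance.
        float left = master * (balance <= 0 ? 1f : 1f - balance);
        float right = master * (balance >= 0 ? 1f : 1f + balance);
        return (left, right);
    }
}

// Applying it with NAudio (channel order 0 = left, 1 = right is an
// assumption; check device.AudioEndpointVolume.Channels.Count first):
// var (l, r) = BalanceHelper.BalanceToChannelLevels(-0.5f);
// device.AudioEndpointVolume.Channels[0].VolumeLevelScalar = l;
// device.AudioEndpointVolume.Channels[1].VolumeLevelScalar = r;
```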

Dynamic-Data-Display / Getting x-coordinate of user input

My program is basically about analyzing videos.
A major part is to plot a diagram showing (e.g.) brightness per frame on the y-axis and the frame number on the x-axis. Because the program is written in C# and uses WPF, D³ (Dynamic Data Display) was the way to go for plotting.
Now the user might see a peak signal in the diagram and wants to look on that single frame to understand why it's so bright (it might be just natural, or an encoding-artifact).
Here comes my question: the most intuitive way is for the user to click on the diagram where the peak is, which jumps the video preview (another GUI element) right to that frame. So I need the x-coordinate (= frame number) of the user's click on the diagram.
It is possible to manually analyze the mouse-input event, but that would take much work (because the x-axis is different for each video and the entire diagram can be resized, so absolute coordinates are a no go).
But maybe something similar is already implemented by D³. I searched the documentation but didn't find anything useful. The only piece of information I found was about using a "DraggablePoint", but that's where the trail goes cold.
Does someone of you know how to get the x-coordinate without much work?
It sure is possible! The way that I have done it in the past is to add a CursorCoordinateGraph object to my plotters children, and it automatically tracks the mouse position on the graph in relation to the data. You can turn off the visual features of the CursorCoordinateGraph and use it for tracking only. Here's what it would look like:
CursorCoordinateGraph mouseTrack = new CursorCoordinateGraph();
plotter.Children.Add(mouseTrack);
mouseTrack.ShowHorizontalLine = false;
mouseTrack.ShowVerticalLine = false;
And your mouse click event would look like this:
private void plotter_MouseLeftButtonDown(object sender, MouseButtonEventArgs e)
{
    Point mousePos = mouseTrack.Position;
    var transform = plotter.Viewport.Transform;
    Point mousePosInData = mousePos.ScreenToData(transform);
    double xValue = mousePosInData.X;
}
You can then use xValue and manipulate it however you would like to achieve your desired effect.
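For this particular use case, "manipulating" xValue mostly means turning the data coordinate into a valid frame index, since a click slightly left of frame 0 or right of the last frame should still land on a real frame. A tiny sketch (the `FrameMapper` helper is my own, not part of D³):

```csharp
using System;

// Hypothetical helper: convert a data-space x coordinate from the plot
// into a valid frame index for the video preview.
public static class FrameMapper
{
    public static int XValueToFrame(double xValue, int frameCount)
    {
        int frame = (int)Math.Round(xValue);
        // Clamp clicks that land outside the plotted frame range.
        return Math.Max(0, Math.Min(frameCount - 1, frame));
    }
}
```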

Application, improve performance of touch events

Basically, I have an application which is 8000px by 8000px. We can zoom in to view a specific part, for example the radio, or we can zoom out to view everything.
Each part of the car is a control that we can manipulate with fingers on a dual-touch or multitouch monitor.
My problem: to manipulate a control, for example the volume button, the user needs to move exactly like in real life, i.e. with a circular movement.
With the mouse everything is perfect; it responds instantly without any delay. I use OnMouseLeftButtonDown, OnMouseMove, etc.
With touch, it seems to be very difficult for the computer to get the touch position and there is a huge lag, especially when the user moves 2 different buttons with 2 fingers at the same time. I use OnTouchDown, OnTouchMove, etc.
The only difference between the mouse and the touch is when we need to get the position, with the Mouse I use: (e is a MouseButtonEventArgs)
Point currentPosition = e.GetPosition(this);
With the Touch I use: (e is a TouchEventArgs)
Point currentPosition = e.GetTouchPoint(this).Position;
Everything after this is the same.
I don't know if it's because I have too many controls in my application (over 5000 that we can manipulate, though when we zoom in on only 2 controls it's the same thing) or because it is really difficult for the computer to get the position from a touch event.
Can someone help me with this? I need to find a solution to eliminate the lag.
I use Visual Studio 2010, Blend 4, .NET 4.0
Windows 7 64-bit
7 Gb RAM
Xeon 2.13 Ghz, 2 core, 8 thread
Screen: ELO technology, in a NEC 2490WUXi2 screen
This seems to be a bug. Take a look at this post.
http://social.msdn.microsoft.com/Forums/en-US/wpf/thread/756a2ccf-6cf1-4e4a-a101-30b6f3d9c2c9/#ed51226a-6319-49f8-be45-cf5ff48d56bb
Or
http://www.codeproject.com/Articles/692286/WPF-and-multi-touch
It's hard to say why you have such an issue. Maybe OnTouchMove is triggered more often than MouseMove, and you should add some additional processing to smooth the touch position data.
I'd try commenting out all the code below
Point currentPosition = e.GetTouchPoint(this).Position;
and looking at the performance.
Another approach is to count how often OnTouchMove is triggered.
The problem may also be the touch device itself; try another one to see if the lag is still there.
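One common mitigation, whatever the root cause, is to stop doing heavy work inside every OnTouchMove and instead just record the latest position per touch device, processing them once per render frame (in WPF, for example, from a CompositionTarget.Rendering handler). A minimal sketch of that coalescing; the `TouchMoveCoalescer` class is my own invention:

```csharp
using System.Collections.Generic;

// Hypothetical helper: coalesces high-frequency move events by keeping only
// the latest position per touch id; the consumer drains them once per render
// frame instead of doing expensive work in every OnTouchMove call.
public class TouchMoveCoalescer
{
    private readonly Dictionary<int, (double X, double Y)> latest =
        new Dictionary<int, (double X, double Y)>();

    // Called from OnTouchMove: cheap, just overwrites the stored position.
    public void Record(int touchId, double x, double y)
    {
        latest[touchId] = (x, y);
    }

    // Called once per frame (e.g. from CompositionTarget.Rendering):
    // returns the pending positions and clears the buffer.
    public Dictionary<int, (double X, double Y)> Drain()
    {
        var snapshot = new Dictionary<int, (double X, double Y)>(latest);
        latest.Clear();
        return snapshot;
    }
}
```

This way a burst of ten move events for the same finger costs ten dictionary writes but only one layout/render update.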

WPF 4 touch events getting the center of a pinch gesture

I am using the .NET 4 Beta 2 touch libraries, and I'm trying to implement a zooming feature in my WPF application.
I can get the zooming working just fine, but what I want is to zoom in on the center of the pinch gesture, and I cannot see anything in the APIs about how to accomplish this.
Are there methods or properties that expose the 2 contacts being used in the pinch gesture so that I can get the center of them?
EDIT:
I just investigated using the GetIntermediateTouchPoints method of TouchEventArgs, which did not seem to give me what I want.
Thanks a lot
Mark
Assuming you are using the new TouchXxxx routed events, they all take TouchEventArgs, which has a TouchDevice property. If you call its GetTouchPoint() method, you get the TouchPoint that represents the point of touch.
More details and sample code from Jaime Rodriguez.
Turns out that the properties I was after were right in front of me all along.
The Manipulation2DDeltaEventArgs class has OriginX and OriginY properties that specify the center point of the pinch gesture, so I'm just using those and it's all good :)
This answer is for .Net 4 release version.
In the case of using ManipulationDelta event the ManipulationDeltaEventArgs.ManipulationOrigin contains center point of the pinch gesture. The coordinates are relative to the ManipulationDeltaEventArgs.ManipulationContainer, which could be set in the handler of ManipulationStarting event.
Example code:
private void object_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    ...
    // If this was a multitouch gesture
    // (like pinch/zoom - indicated by e.DeltaManipulation.Scale),
    // the following line will get us the point between the fingers.
    Point position = e.ManipulationOrigin;
    ...
}
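Once you have that origin, zooming "in on" it comes down to scaling and then translating the content so the point under the fingers stays fixed. The arithmetic is independent of any WPF type; here is a sketch (the `ZoomMath` helper is my own, not part of the manipulation API):

```csharp
// Hypothetical helper: pure arithmetic for zooming about a fixed point.
// The content is rendered at (offsetX, offsetY) with a uniform scale; after
// zooming by zoomFactor, the content point under the pinch center stays put.
public static class ZoomMath
{
    public static (double OffsetX, double OffsetY, double Scale) ZoomAboutPoint(
        double offsetX, double offsetY, double scale,
        double centerX, double centerY, double zoomFactor)
    {
        // A content point p maps to the screen as offset + p * scale.
        // Keeping the pinch center fixed means:
        //   center = newOffset + ((center - offset) / scale) * (scale * zoomFactor)
        // which solves to the two lines below.
        double newOffsetX = centerX - (centerX - offsetX) * zoomFactor;
        double newOffsetY = centerY - (centerY - offsetY) * zoomFactor;
        return (newOffsetX, newOffsetY, scale * zoomFactor);
    }
}
```

In WPF you would pass e.ManipulationOrigin as the center (with the manipulation container as the coordinate space) and apply the result via a MatrixTransform or a ScaleTransform/TranslateTransform pair.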