Does anybody know how to embed UWP/WPF in Unity? - c#

I am building a disability service in Unity that will let paralyzed people move the mouse cursor with their head movement only. What I have in Unity is just a single function, so I'm wondering whether it is possible to embed it in UWP software.
Someone would launch the UWP software, click a "Start Unity" button, and the software would then wrap itself into Unity so that the function becomes accessible (the embedded software stays clickable). There are a few examples of embedding browsers in Unity (example below), but can this be done with an application, assuming I have access to the application's code?
Browser example
Thank you!

Related

How to use native android file-open-dialog in Unity?

I've seen this dialog for picking/opening a file on Android in some apps, and it seems to be the native one, but I can't find a way to use it in my own apps. The language of the attached screenshot is German, but I'm sure someone will recognize it. Screenshot of the file dialog
Is it possible in Unity3D to open a native file dialog in-game with Unity3D C# code?
Thanks for your guidance. (Sorry for my bad English.)
I have found the solution to my problem, and I want to guide others who run into the same issue.
Go to the site below and use the free plugin:
Unity Native Gallery Plugin

Scanning for Bluetooth in Unity

I am trying to make an app in Unity and I am struggling to find any good resources on how to scan for Bluetooth Low Energy (BLE) devices.
Basically, all I want to achieve right now is to press a button that calls a function which scans for devices and lists them along with their respective RSSI values.
I managed to do this in Ionic, but it is proving a lot more difficult in Unity, so I am wondering whether someone has managed to do something like this before.
Also, is there a way to integrate a Unity app into an Ionic project?
Thank you

Using Unity3D to render a character onto a Kinect WPF application

I am working on a project which requires me to render a virtual character onto the Kinect video feed in which the player appears.
I am attempting to use Unity3D to accomplish this. I have looked at Zigfu, but I don't think it directly helps. I still want to be able to send data from my C# WPF program to the game engine (I am forking my project off from Kinect Fusion Explorer). Ideally, Unity would render the character and its movement, while my WPF program would send Unity information about the landscape and run the Kinect feed.
Has anyone attempted this or have any idea how this could be achieved?
If this is not possible with Unity, are there other game dev libraries I could use to render a character onto the Kinect feed?
Thanks
If you want to send data over the network (sockets) you will face problems with the size of the frames, so my suggestion is to use WCF. I'm not sure whether it will work for you, but this is how I managed it in my project (sending position and orientation).
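As a rough sketch of that WCF approach: the contract name (IPoseService), the SendPose operation and the net.pipe://localhost/pose endpoint below are my own illustrative choices, not the original poster's code. The receiving process hosts the service; the sending process calls SendPose through a channel.

    // Minimal one-way WCF contract for streaming position/orientation.
    // All names here are illustrative assumptions; adapt them to your project.
    using System;
    using System.ServiceModel;

    [ServiceContract]
    public interface IPoseService
    {
        [OperationContract(IsOneWay = true)]
        void SendPose(float x, float y, float z,
                      float qx, float qy, float qz, float qw);
    }

    public class PoseService : IPoseService
    {
        public void SendPose(float x, float y, float z,
                             float qx, float qy, float qz, float qw)
        {
            // Hand the received position/orientation to whatever consumes it here.
        }
    }

    public static class PoseChannel
    {
        // Receiving side: host the service over a local named pipe.
        public static ServiceHost StartHost()
        {
            var host = new ServiceHost(typeof(PoseService),
                new Uri("net.pipe://localhost/pose"));
            host.AddServiceEndpoint(typeof(IPoseService),
                new NetNamedPipeBinding(), string.Empty);
            host.Open();
            return host;
        }

        // Sending side: open a client channel against the same endpoint.
        public static IPoseService Connect()
        {
            var factory = new ChannelFactory<IPoseService>(
                new NetNamedPipeBinding(),
                new EndpointAddress("net.pipe://localhost/pose"));
            return factory.CreateChannel();
        }
    }

Whether the full WCF stack is available inside Unity's scripting runtime depends on the Mono/.NET profile you target, so treat this purely as a sketch of the idea rather than a drop-in solution.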

Integrating WPF, Unity3D (and Kinect)

I created a WPF project in Visual Studio. The XAML markup is managed by C# code-behind. What I want to do is create a component on the user interface which will show a 3D scene. I would like this 3D scene to be managed by Unity, because I need to take advantage of Unity's physics engine. The user must be able to interact with this 3D scene via gestures recognized by Kinect (for example, tossing a ball).
Is there any way I can connect WPF, Unity3D and Kinect so that the user can manipulate the 3D scene in this manner? If so, could you provide me with some examples/tutorials? If not, what is the best approach for letting the user manipulate a 3D scene with Kinect gestures?
For using Unity3D with WPF, I would look at How get the Unity3D View In wpf winform. The tricky part will be the Kinect interaction.
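For the Unity-inside-WPF part, one common route is to launch a built Unity standalone player with Unity's -parentHWND command-line argument and parent it to a WinForms panel hosted in the WPF window through WindowsFormsHost. A minimal sketch; the executable path and the control setup are my own assumptions, not taken from the linked answer:

    // Sketch: embed a built Unity player in a WPF window via -parentHWND.
    // Requires references to System.Windows.Forms and WindowsFormsIntegration.
    // The build path "UnityBuild\MyGame.exe" is an assumed placeholder.
    using System;
    using System.Diagnostics;
    using System.Windows;
    using System.Windows.Forms.Integration;

    public partial class MainWindow : Window
    {
        private Process unityProcess;

        public MainWindow()
        {
            InitializeComponent();

            // A WinForms panel provides a raw HWND the Unity player can attach to.
            var panel = new System.Windows.Forms.Panel
            {
                Dock = System.Windows.Forms.DockStyle.Fill
            };
            Content = new WindowsFormsHost { Child = panel };

            // Start the player once the window (and the panel's handle) exist.
            Loaded += (s, e) =>
            {
                unityProcess = Process.Start(new ProcessStartInfo
                {
                    FileName = @"UnityBuild\MyGame.exe",
                    Arguments = $"-parentHWND {panel.Handle} delayed",
                    UseShellExecute = true
                });
            };
        }

        protected override void OnClosed(EventArgs e)
        {
            base.OnClosed(e);
            if (unityProcess != null && !unityProcess.HasExited)
                unityProcess.Kill();
        }
    }

The embedded player then renders inside the panel's area; getting gesture or scene data in and out of it is a separate problem (see the WCF sketch in the previous question).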
You can interface with the Kinect through WPF and use Zigfu for Unity. The downside of using WPF for the Kinect interaction is that you won't be able to send data to Unity unless you use the new Kinect Client Server System, send all of your information to a web server, and then retrieve it in Unity. That is a very bad idea: it will probably not be fast enough and will suffer from serious lag.
The second option, using ZigFu in Unity for the Kinect support, is worth considering. The issue with it is that if you also want to use the Kinect in WPF, you would have to disconnect it from Unity. This is the only possible way I can see to implement it.
Overall, this is achievable but will be very difficult. I suggest you use ZigFu, but there are drawbacks to both methods.
This is quite a big and generic question, but I suggest you use a web deployment of your Unity project: under your project's References/COM there is a UnityWebPlayerAXLib control that will help you a lot.
