I'm adding a model (an STL file) to my Viewport3D using HelixToolkit's ModelImporter, but at first nothing shows up. To see the added 3D model I have to zoom out. My question is: is there any way to dynamically/programmatically zoom out or rotate the camera to the newly added model, or anything else needed to show it immediately?
I am using the Helix Toolkit with C# WPF for this project.
Update: There is a method called ZoomExtents for exactly this purpose. The problem is resolved.
Update #2: Thanks for the warning, @sideshowbarker:
//in this case currModel= Model3D and mainViewport=HelixViewport3D
ModelVisual3D device3D = new ModelVisual3D();
device3D.Content = currModel;
mainViewport.Children.Add(device3D);
mainViewport.ZoomExtents();
//after loading the 3D model into the viewport you can just call the ZoomExtents() method (HelixToolkit.Wpf namespace)
//Then your 3D models will automatically fit into the viewport frame.
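For reference, a minimal sketch of the whole flow including the ModelImporter step (the file path here is a placeholder):

// load the STL file; ModelImporter returns a Model3DGroup
var importer = new HelixToolkit.Wpf.ModelImporter();
Model3D currModel = importer.Load(@"C:\models\part.stl"); // placeholder path

ModelVisual3D device3D = new ModelVisual3D();
device3D.Content = currModel;
mainViewport.Children.Add(device3D);
mainViewport.ZoomExtents(); // fit the camera to everything currently in the viewport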
I am creating a mobile app using Unity. I would like to place a photo which fills the screen and then place some 3D objects on top of this photo. The photo is created during runtime and is saved to the persistent data folder.
What is the best method to achieve this? The options I see are Raw Image/Image/Sprite. However, I have been unable to achieve the above-mentioned goal using any of these component types.
What about making a panel on a canvas, setting the image as its background, and setting the canvas to fit the screen size?
Then place the objects in front of it, manually or as children of said canvas but higher in the hierarchy.
I don't fully understand what you intend to do; I'm just trying to help.
Editing my original answer: you need to use RawImage to be able to easily load images that are not marked as Sprite in the editor.
You load images from a file like this:
Texture2D texture = new Texture2D(1, 1); // LoadImage needs a non-null starting point
var path = System.IO.Path.Combine(Application.streamingAssetsPath, "your_file.jpg");
byte[] bytes = System.IO.File.ReadAllBytes(path);
texture.LoadImage(bytes);
rawImage.texture = texture;
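Since the question mentions the photo is written at runtime, the same pattern should work against the persistent data folder; only the path changes (the file name is a placeholder):

var path = System.IO.Path.Combine(Application.persistentDataPath, "photo.jpg");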
As far as keeping it filling the screen, the AspectRatioFitter component is likely to do the job.
To get 3D objects rendering in front, set your canvas to 'Screen Space - Camera' and point it to your camera. This will behave similarly to world space with regard to rendering order (3D objects will be sorted with it), but the canvas size will match the camera viewport, so the AspectRatioFitter will be able to do its job.
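As a rough sketch of that setup done from code (the field and class names are assumptions; the same thing can be configured in the Inspector):

using UnityEngine;
using UnityEngine.UI;

public class PhotoBackground : MonoBehaviour
{
    public Canvas canvas;      // canvas holding the RawImage
    public RawImage rawImage;  // RawImage showing the runtime photo
    public Camera mainCamera;

    void Start()
    {
        // render the canvas in camera space so 3D objects can sit in front of it
        canvas.renderMode = RenderMode.ScreenSpaceCamera;
        canvas.worldCamera = mainCamera;
        canvas.planeDistance = 100f; // push the photo back behind the 3D objects

        // keep the photo filling the screen without distortion
        var fitter = rawImage.gameObject.AddComponent<AspectRatioFitter>();
        fitter.aspectMode = AspectRatioFitter.AspectMode.EnvelopeParent;
        fitter.aspectRatio = (float)rawImage.texture.width / rawImage.texture.height;
    }
}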
I'm having a translucency issue using SharpDX for WinRT.
First of all, my code:
In MainPage.xaml, I've added a SwapChainPanel like this:
<Grid>
<SwapChainPanel x:Name="Panel" />
</Grid>
<!-- There is nothing else... -->
I've used the same class as the sample from GitHub for the 3D model rendering:
// Initialisation part
graphicsDeviceManager = new GraphicsDeviceManager(this);
graphicsDeviceManager.PreferredGraphicsProfile = new FeatureLevel[] { FeatureLevel.Level_11_0, };
graphicsDeviceManager.DepthBufferShaderResource = true;
Still in the same class, when loading contents:
models = new List<Model>();
foreach (var modelName in new[] {"dude"})
{
model = Content.Load<Model>(modelName);
BasicEffect.EnableDefaultLighting(model, true);
models.Add(model);
}
model = models[0];
// Instantiate a SpriteBatch
spriteBatch = ToDisposeContent(new SpriteBatch(GraphicsDevice));
base.LoadContent();
And finally, the draw part:
// Clears the screen with the Color.CornflowerBlue
GraphicsDevice.Clear(Color.CornflowerBlue);
model.Draw(GraphicsDevice, world, view, projection);
base.Draw(gameTime);
Everything renders fine, but this is what I get for a textured model (it's a bit tricky to notice the translucency on a screenshot...)
Here is what I have with a non-textured model...
I think it comes from rendering in alpha mode. I've tried to set the BlendState, but I get some exceptions since I'm not using the right parameters.
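For reference, forcing one of the predefined blend states before drawing looks roughly like this (a sketch, assuming the Toolkit exposes the XNA-style GraphicsDevice.BlendStates collection):

// force opaque blending for the model draw (predefined states assumed available)
GraphicsDevice.SetBlendState(GraphicsDevice.BlendStates.Opaque);
model.Draw(GraphicsDevice, world, view, projection);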
OK, after several gradual checks as suggested by @xoofx (thanks :) ), I found that there are some issues using SharpDX.
The ModelRendering sample comes with .dae models. When I launch this sample, everything works fine and there is no issue.
When I load my own model, exported as .dae, .fbx and .obj, translucency appears with .dae and .fbx, while the toolkit compiler can't compile .obj files that have an associated .mtl, since some primitives and file tags aren't in the correct format (even if I check the XNA compatibility option in Blender when exporting).
So what have I done to solve my problem ?
1 - Lower my graphics profile, or at least make sure it is the right one, since I'm on an Intel HD Graphics 4000 that doesn't support DirectX 11.1 yet...
graphicsDeviceManager.PreferredGraphicsProfile = new FeatureLevel[] { FeatureLevel.Level_11_0, };
2 - Delete everything the toolkit compiler generated in the Debug directory. I think the compiler doesn't regenerate a file that has already been compiled and still exists.
3 - Finally, I noticed that all the parts that appear white/transparent in the pictures in my question are textured. That kind of color comes from this combination:
I've applied a texture to a model or model part.
Unfortunately, the toolkit compiler can't find the texture, or the texture value is empty.
But when you apply a texture to a model, specular highlighting is activated on the surface, in particular MaterialSpecular (which you can see in a model part's properties in Visual Studio). MaterialSpecular takes four values: R, G, B and A; always set the alpha to 1. My alpha was sometimes 0 :(.
So the light is reflected strangely and tadaaa, we get a beautiful translucent model :)
Also, I export my model to .fbx and make sure the default plain gray color is shown on parts that are not textured.
4 - Last but not least, enable default lighting on the model.
BasicEffect.EnableDefaultLighting(model, true);
Okay, so here we go. I'm attempting to make an application using XNA as the base, because of its renderer. One of the things necessary in this project is to open a new window (as a dialog) with a separate XNA render panel embedded in it. I'm using this as an interactive preview panel, so I absolutely need XNA to render in there. However, it seems XNA is not very well equipped to do this. I have tried various things myself, but to no avail (either producing errors and not rendering correctly, or rendering in the wrong aspect ratio, etc.). Normally I would post code here, but since I have had so little luck, there isn't much to post.
My application currently consists of an XNA application embedded within a Form, and I have a button to open the preview panel, which in theory should pop up as a Form Dialog, containing the XNA renderer, to allow me to draw the preview. I have been trying this for several hours, and got nowhere, so I'm asking for a bit of help here.
Thanks, anyway.
EDIT: Okay, I've made a little progress, but I have two problems. Firstly, any textures drawn with a SpriteBatch appear at the right dimensions, but are filled with solid black. Also, when I open the dialog, then close it, and then close the application, I get an AccessViolationException. I strongly suspect the two errors are linked in some way.
Here is my code initialising the preview dialog (a is an instance of a custom class that essentially consists of a LinkedList of Texture2D objects).
animPrev = new AnimationPreview(a);
animPrev.Show();
My AnimationPreview class is an extension of the Form class, and contains a PreviewControl object, which is an extension of the GraphicsDeviceControl found in the XNA Winforms sample. Note that my main form extends the XNA Game class, for various reasons.
The PreviewControl object is set up like this:
protected override void Initialize()
{
sb = new SpriteBatch(GraphicsDevice);
Application.Idle += delegate { Invalidate(); };
}
And the Draw method contains:
protected override void Draw()
{
GraphicsDevice.Clear(Microsoft.Xna.Framework.Graphics.Color.Violet);
if (frame != null)
{
sb.Begin();
sb.Draw(Image, Vector2.Zero, Color.White);
sb.End();
}
}
This clears the background of the form to violet, as expected, and draws a black box the same size as Image, which is not expected. Hopefully someone can help me out here.
NOTE: An acceptable alternative would be to convert XNA Texture2D objects to System.Drawing.Image objects. However, I am using XNA 3.1, so I can't just save the texture to a stream and reload it.
Actually, having tried this, it's a bit dodgy and very slow, so I'd rather not do it this way.
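If anyone does want to go that route, here is a minimal sketch of copying the pixel data across without going through a file (assuming the texture uses the 32-bit Color surface format; SetPixel keeps the sketch simple but is also part of why this approach is slow):

// copy an XNA Texture2D into a System.Drawing.Bitmap (sketch)
System.Drawing.Bitmap TextureToBitmap(Microsoft.Xna.Framework.Graphics.Texture2D texture)
{
    var pixels = new Microsoft.Xna.Framework.Graphics.Color[texture.Width * texture.Height];
    texture.GetData(pixels); // read the raw RGBA pixels back from the texture

    var bitmap = new System.Drawing.Bitmap(texture.Width, texture.Height);
    for (int y = 0; y < texture.Height; y++)
        for (int x = 0; x < texture.Width; x++)
        {
            var c = pixels[y * texture.Width + x];
            bitmap.SetPixel(x, y, System.Drawing.Color.FromArgb(c.A, c.R, c.G, c.B));
        }
    return bitmap;
}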
Did you take a look at the following official tutorials / samples?
XNA WinForms Series 1: Graphics Device
XNA WinForms Series 2: Content Loading
They should explain everything, in my opinion. You can even find downloadable source for the samples.
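The relevant part of the Content Loading sample is that the control creates its own ContentManager from its service container, so anything it draws is loaded on the same GraphicsDevice it renders with. A rough sketch under that assumption (the sample's GraphicsDeviceControl exposes a Services property; "frame" is a placeholder asset name):

// inside PreviewControl, following the WinForms Series 2 sample
ContentManager content;

protected override void Initialize()
{
    content = new ContentManager(Services, "Content");
    sb = new SpriteBatch(GraphicsDevice);
    Image = content.Load<Texture2D>("frame"); // placeholder asset
    Application.Idle += delegate { Invalidate(); };
}

If the textures being passed into the dialog were created on the main Game's GraphicsDevice rather than on the control's device, that mismatch could also explain the solid black output.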
I've come across a problem which seems like a bug to me. I'm making an app that visualizes atoms in a crystal. The problem is that when it draws a transparent object, the objects behind it are hidden.
Here is the code:
foreach (var atom in filteredAtoms)
{
var color = new Color();
color.ScR = (float)atom.AluminiumProbability;
//color.G = 50;
color.ScB = (float)atom.MagnesiumProbability;
//setting the alpha channel, but Opacity doesn't work either
color.ScA = (float)(1.0 - atom.VacancyProbability); //(float)1.0;//
DiffuseMaterial material = new DiffuseMaterial(new SolidColorBrush(color));
//material.Brush.Opacity = 1.0 - atom.VacancyProbability;
// make visuals and add them to the builder
atomBuldier.Add(new Point3D(atom.X * Atom.ToAngstrom, atom.Y * Atom.ToAngstrom, atom.Z * Atom.ToAngstrom), material);
}
When I change the material to e.g. EmissiveMaterial there are no "cut" atoms. I googled and found this post, but the advice given doesn't apply to this case.
Is this a bug with a 2D brush applied to 3D?
The full source code can be found at http://alloysvisualisation.codeplex.com; the DLL and a test file are at http://alloysvisualisation.codeplex.com/releases (beta link).
Steps to reproduce:
Launch the app
Click the Open file button
Open the test file (xyzT2000.chmc)
Click the Mask button
Check 11 (this series of atoms is almost transparent)
Click Redraw
For the transparent atoms, you must disable z-buffer-writing. I'm unfamiliar with WPF, but you can probably set this in an Appearance or Material object or so.
The problem occurs because of the following:
When a transparent atom is rendered, it writes its depth to the z-buffer. Non-transparent atoms rendered afterwards, which should still appear, never reach the frame buffer, because their z-values fail the z-test against the transparent atom's values already in the z-buffer.
In short, the graphics card treats the transparent atom as opaque and hides anything behind it.
Edit: Upon looking into WPF it seems pretty high-level, without direct control of z-buffer behavior.
According to this link, the emissive and specular materials do not write to the z-buffer, so using those is your solution when working with transparent objects.
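In WPF terms, here is a sketch of applying that to the loop from the question (it keeps the question's atomBuldier call and assumes it accepts any Material; System.Linq is needed for the ordering, and the ordering assumes a higher VacancyProbability means a more transparent atom):

// add opaque atoms first, then transparent ones, and use an emissive layer for
// the color so the transparent material no longer hides atoms drawn after it
var ordered = filteredAtoms.OrderBy(a => a.VacancyProbability);

foreach (var atom in ordered)
{
    var color = new Color();
    color.ScR = (float)atom.AluminiumProbability;
    color.ScB = (float)atom.MagnesiumProbability;
    color.ScA = (float)(1.0 - atom.VacancyProbability);

    // black diffuse base plus an emissive layer; EmissiveMaterial does not occlude
    // geometry behind it the way a transparent DiffuseMaterial does
    var material = new MaterialGroup();
    material.Children.Add(new DiffuseMaterial(new SolidColorBrush(Colors.Black)));
    material.Children.Add(new EmissiveMaterial(new SolidColorBrush(color)));

    atomBuldier.Add(new Point3D(atom.X * Atom.ToAngstrom, atom.Y * Atom.ToAngstrom, atom.Z * Atom.ToAngstrom), material);
}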
I have been following the MSDN documentation on how to render a model with a basic effect.
Which is neat. I can change the rendering to display in wireframe by adding the following line to the example code, before the DrawModel method's double loop:
GraphicsDevice.RasterizerState = WIREFRAME_RASTERIZER_STATE;
Where I've initialized WIREFRAME_RASTERIZER_STATE in the constructor as
RasterizerState WIREFRAME_RASTERIZER_STATE = new RasterizerState() { CullMode = CullMode.None, FillMode = FillMode.WireFrame };
Is there an equally easy addition/modification I can make to display just the vertices of the imported model? From my understanding, wireframe mode tells XNA to render lines instead of triangles, but unfortunately the RasterizerState I used above doesn't have a fill mode that displays just the vertices (it's either Solid or WireFrame).
Given that most of the draw functionality is hidden in the MSDN example, I was hoping someone could direct me as to how to simply render the points without connecting them.
FillMode.Point was removed in XNA 4.0. This blog post describes why, and provides work-arounds.
In your situation you will probably find that you have to process the model data to generate actual triangles to render in place of points.
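As a rough illustration of that kind of processing, one option is to read the vertex positions back from the model and draw a small dot sprite at each projected vertex. This is only a sketch: the dot texture and the SpriteBatch are assumed to exist, bone transforms are ignored, and the position is assumed to be the first element of each vertex (the stride comes from the vertex declaration):

void DrawModelVertices(Model model, Matrix world, Matrix view, Matrix projection, Texture2D dot)
{
    spriteBatch.Begin();
    foreach (ModelMesh mesh in model.Meshes)
    {
        foreach (ModelMeshPart part in mesh.MeshParts)
        {
            int stride = part.VertexBuffer.VertexDeclaration.VertexStride;
            var positions = new Vector3[part.NumVertices];
            part.VertexBuffer.GetData(part.VertexOffset * stride, positions, 0, part.NumVertices, stride);

            foreach (Vector3 position in positions)
            {
                // project the model-space vertex to screen space and draw a marker there
                Vector3 screen = GraphicsDevice.Viewport.Project(position, projection, view, world);
                if (screen.Z >= 0f && screen.Z <= 1f)
                    spriteBatch.Draw(dot, new Vector2(screen.X, screen.Y), Color.White);
            }
        }
    }
    spriteBatch.End();
}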