I'd like to rotate a Texture in XNA. I know I can rotate it when it is drawn, but I would like the Texture2D variable to be the rotated texture. Is there any way to do this?
Use a RenderTarget2D: draw your texture rotated into the render target, then take that as your texture (and save it if you need to). In XNA 4.0 RenderTarget2D derives from Texture2D, so the render target can be used directly wherever a Texture2D is expected.
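A minimal sketch of that approach (assumes XNA 4.0; the names `source`, `rotatedTarget` and the 90-degree angle are illustrative):

```csharp
// Create a render target big enough to hold the rotated image
// (width/height swapped for a 90-degree rotation).
RenderTarget2D rotatedTarget = new RenderTarget2D(
    GraphicsDevice, source.Height, source.Width);

GraphicsDevice.SetRenderTarget(rotatedTarget);
GraphicsDevice.Clear(Color.Transparent);

spriteBatch.Begin();
// Rotate around the texture's centre, drawn at the target's centre.
spriteBatch.Draw(source,
    new Vector2(rotatedTarget.Width / 2f, rotatedTarget.Height / 2f),
    null, Color.White, MathHelper.PiOver2,
    new Vector2(source.Width / 2f, source.Height / 2f),
    1f, SpriteEffects.None, 0f);
spriteBatch.End();

GraphicsDevice.SetRenderTarget(null);

// In XNA 4.0 a RenderTarget2D *is* a Texture2D:
Texture2D rotated = rotatedTarget;
```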
You should provide a new shader that manages texture-coordinate rotation. Since the HLSL source of BasicEffect is public, it should be fairly easy to add this behaviour.
Basic Effect HLSL code
Passing an angle parameter alpha to the shader, the transform would be:
newU = U*cos(alpha) - V*sin(alpha);
newV = U*sin(alpha) + V*cos(alpha);
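In HLSL that could look like the following sketch. Note that it offsets the coordinates by 0.5 so the rotation pivots around the texture's centre rather than its corner (an assumption; drop the offsets to rotate around (0,0)):

```hlsl
float alpha; // rotation angle in radians, set from C# via effect.Parameters["alpha"]

float2 RotateUV(float2 uv)
{
    float s, c;
    sincos(alpha, s, c);
    uv -= 0.5;                       // move pivot to texture centre
    float2 rotated = float2(uv.x * c - uv.y * s,
                            uv.x * s + uv.y * c);
    return rotated + 0.5;            // move back
}
```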
One way would be to pass a rotation matrix to your shader and multiply your texcoords by that before calling the texture sampler.
I'm not sure whether XNA/DirectX has an equivalent of OpenGL's texture matrix.
I have a prefab cube in Unity. I already run a script to position it and move some of its vertices to change its shape. Now, I know that to texture it, I have to do something like:
Texture2D myTexture = Resources.Load("sample") as Texture2D;
cube.GetComponent<Renderer>().material.mainTexture = (Texture)myTexture;
And that works very well. But now I want to understand how to use UV mapping to assign two different textures to my cube (one for the top and one for the sides).
Texture2D topTexture = Resources.Load("topTex") as Texture2D;
Texture2D sideTexture = Resources.Load("sideTex") as Texture2D;
//And now, how do I say to only apply to which side?
cube.GetComponent<Renderer>().material.mainTexture = (Texture)topTexture;
cube.GetComponent<Renderer>().material.mainTexture = (Texture)sideTexture;
Your vertices and UVs match up so that each position in the vertices array corresponds to the UV vector at the same index:
vertices[0] // will map to...
uvs[0]
Your vertices describe each corner of your cube; the UVs then essentially pin points of your texture to them. Remember that texture coordinates are normalized, so they range from a minimum of 0 to a maximum of 1.
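So one way to get two images onto the cube with a single material is to put both images into one atlas texture and remap the UVs per face. A sketch (the vertex indices are purely illustrative - which indices belong to which face depends entirely on how your particular mesh was built):

```csharp
using UnityEngine;

public class CubeUVExample : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector2[] uvs = mesh.uv;

        // Suppose the atlas has the top image in its left half and the
        // side image in its right half. Map the four vertices of the
        // cube's top face to the left half of the atlas:
        uvs[4] = new Vector2(0.0f, 0.0f);
        uvs[5] = new Vector2(0.5f, 0.0f);
        uvs[8] = new Vector2(0.0f, 1.0f);
        uvs[9] = new Vector2(0.5f, 1.0f);

        // ...map the side faces to the right half (0.5..1.0) similarly...

        mesh.uv = uvs; // assign back: mesh.uv returns a copy
    }
}
```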
My main plane is Rectangle(0,0,10000,10000) for example.
My screen plane (ie virtual position) is Rectangle(1000,1000,1920,1080).
My Texture2D is Rectangle(1500,1200,200,100) in main plane.
I need to translate my Texture2D coordinates to my screen plane. I tried with Matrix.Translate without success.
I must get Texture2D = Rectangle(500,200,200,100) in screen plane.
In order to get the Texture2D from (1500, 1200) to (500, 200) you have to use a translation of (-1000, -1000) which are the inverse numbers from your screen plane's coordinates. In code your translation would be something like this
Matrix transform = Matrix.CreateTranslation(-screenPlane.X, -screenPlane.Y, 0);
The idea is that you want to move the texture as if your camera were at (0, 0) instead of (1000, 1000), so you have to move the texture by (-1000, -1000).
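Applying that matrix to the texture's position gives the expected result. A sketch using the numbers from the question:

```csharp
Rectangle screenPlane = new Rectangle(1000, 1000, 1920, 1080);
Rectangle texture = new Rectangle(1500, 1200, 200, 100);

// Translate main-plane coordinates into screen-plane coordinates.
Matrix transform = Matrix.CreateTranslation(-screenPlane.X, -screenPlane.Y, 0);
Vector2 screenPos = Vector2.Transform(
    new Vector2(texture.X, texture.Y), transform);
// screenPos is now (500, 200), i.e. Rectangle(500, 200, 200, 100) in screen space.
```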
Check the web for 2D camera classes; it's always useful to know how cameras work :)
This one for example: http://www.david-amador.com/2009/10/xna-camera-2d-with-zoom-and-rotation/
How can I render text in the form of an isometric projection? I understand the principle but I'm not sure how to actually transform a SpriteFont programmatically to do this.
Example of what I mean:
I'm not even sure what I should be searching for. It seems I could accomplish this by using an isometric projection matrix and a 3D mesh font, but that seems overcomplicated considering I'm working in 2D.
Any ideas?
SpriteBatch.Begin takes a Matrix parameter that transforms the sprites you draw (including SpriteFont text) onto whichever plane you desire.
Unfortunately Matrix does not provide Create* methods for skew matrices, but it is simple enough to build one by hand. The following piece of code is tested and is pretty close to what you want:
Matrix skew = Matrix.Identity;
skew.M12 = (float)Math.Tan(MathHelper.ToRadians(36.87f));
Matrix rotate = Matrix.CreateRotationZ(MathHelper.ToRadians(270+26.565f));
sb.Begin(SpriteSortMode.Deferred, null, null, null, null, null, skew * rotate);
// ... draw your sprites here ...
sb.End();
The only difference to your diagram is that Y and Y' point in the opposite direction, because XNA's SpriteBatch works in "client" coordinates ((0,0) at top left, and Y+ is down).
You can use a matrix transformation together with a sprite batch to achieve this. You can read more about matrix translation here.
You all know the layerDepth parameter of the SpriteBatch.Draw() call. I'm using 3D vectors for my 2D game. Is it possible to get the layerDepth as the z value within the vertex shader? Or can I call the draw function with 3D vectors?
I need the depth of a sprite for postprocessing.
Yes, the layerDepth is passed as the Z position of each vertex.
Normally this parameter is limited to the range 0 to 1, although I think that is only due to the positions of the near/far planes in SpriteBatch's default projection matrix. So it might not apply to you - I don't think the values are clamped or anything.
I need a simple shader for my XNA application which just draws a given texture to the screen. Essentially, I'm trying to do post-processing, but my aim here is not to apply any post-processing effect - just to display the texture as itself.
I tried the following pixel shader code but I'm getting some problem (explained below):
sampler textureSampler;
float4 PostProcessingPS(float2 texCoord: TEXCOORD0, float4 Position : POSITION0) : COLOR0
{
float4 color = tex2D(textureSampler,texCoord);
return color;
}
technique PostProcessingEffect
{
pass Pass1
{
PixelShader = compile ps_2_0 PostProcessingPS();
}
}
The issue I'm getting is that the entire texture is not drawn; only the borders are being drawn for some reason. And even regarding the borders, I'm not sure they are being drawn properly - I only say they are drawn because the pixel values there change as the scene itself changes.
This is what it should be like:
And this is what I see:
Any ideas ?
My best advice is to just display the texture in your draw code. Why exactly can't you do this in the application instead of in a pixel shader?
If you really want to use the pixel shader, you'll need to give more code. What is drawn depends on the texture coordinates, so I would need to see your draw code to answer this.
How are you drawing the full-screen quad? You may have the vertex winding opposite to the current cull mode so the triangles are being backface culled. You can check this by using GraphicsDevice.RenderState.CullMode = CullMode.None in XNA 3.x, or GraphicsDevice.RasterizerState = RasterizerState.CullNone in 4.0.
The diagonal line you're seeing down the center of the screen is because you're missing the half-pixel offset to correctly align texels to pixels on screen. Try adding an offset to your vertex positions when you pass them from the vertex shader to the pixel shader like this:
Position.x -= 1.0 / ViewportSize.x;
Position.y += 1.0 / ViewportSize.y;
Where ViewportSize contains the pixel dimensions of your viewport, (1024,768) for example.
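In context, a vertex shader applying that offset might look like this sketch (`ViewportSize` and `PostProcessingVS` are illustrative names; `ViewportSize` is assumed to be set from C# as an effect parameter):

```hlsl
float2 ViewportSize; // e.g. (1024, 768), set via effect.Parameters["ViewportSize"]

float4 PostProcessingVS(float4 Position : POSITION0) : POSITION0
{
    // Direct3D 9 maps texel centres half a pixel away from pixel centres;
    // shift the clip-space position to compensate (clip space spans 2 units,
    // hence 1.0 / ViewportSize rather than 0.5 / ViewportSize).
    Position.x -= 1.0 / ViewportSize.x;
    Position.y += 1.0 / ViewportSize.y;
    return Position;
}
```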