I'm trying to draw a large graph (~3,000,000 vertices, ~5,000,000 edges) using OpenTK.
However, I can't seem to get it working.
I create a VBO containing the positions of all the vertices like so:
// build the coords list
float[] coords = new float[vertices.Length * 3];
Dictionary<int, int> vertexIndexMap = new Dictionary<int, int>();
int count = 0, i = 0;
foreach (Vertex v in vertices) {
    vertexIndexMap[v.Id] = i++;
    coords[count++] = v.x;
    coords[count++] = v.y;
    coords[count++] = v.z;
}
// build the index list
int[] indices = new int[edges.Length * 2];
count = 0;
foreach (Edge e in edges) {
    indices[count++] = vertexIndexMap[e.First.Id];
    indices[count++] = vertexIndexMap[e.Second.Id];
}
// bind the buffers
int[] bufferPtrs = new int[2];
GL.GenBuffers(2, bufferPtrs);
GL.EnableClientState(ArrayCap.VertexArray);
GL.EnableClientState(ArrayCap.IndexArray);
// buffer the vertex data
GL.BindBuffer(BufferTarget.ArrayBuffer, bufferPtrs[0]);
GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)(coords.Length * sizeof(float)), coords, BufferUsageHint.StaticDraw);
GL.VertexPointer(3, VertexPointerType.Float, 0, IntPtr.Zero); // tell opengl we have a closely packed vertex array
GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
// buffer the index data
GL.BindBuffer(BufferTarget.ElementArrayBuffer, bufferPtrs[1]);
GL.BufferData(BufferTarget.ElementArrayBuffer, (IntPtr)(indices.Length * sizeof(int)), indices, BufferUsageHint.StaticDraw);
GL.BindBuffer(BufferTarget.ElementArrayBuffer, 0);
And I attempt to draw the buffers like so:
// draw the vertices
GL.BindBuffer(BufferTarget.ArrayBuffer, bufferPtrs[0]);
GL.Color3(Color.Blue);
GL.DrawArrays(PrimitiveType.Points, 0, coords.Length);
// draw the edges
GL.BindBuffer(BufferTarget.ElementArrayBuffer, bufferPtrs[1]);
GL.Color3(Color.Red);
GL.DrawElements(PrimitiveType.Lines, indices.Length, DrawElementsType.UnsignedInt, bufferPtrs[1]);
When I run this, all of the vertices draw as expected in their correct locations.
However, about half of the edges are drawn joining a vertex to the origin.
To sanity check I tried drawing the edges with a Begin/End block, and they all drew correctly.
Could someone please point out how I am misusing the VBOs?
The last argument to your DrawElements() call is wrong:
GL.DrawElements(PrimitiveType.Lines, indices.Length, DrawElementsType.UnsignedInt,
bufferPtrs[1]);
Without an element array buffer bound, the last argument to DrawElements() is a pointer to the indices. If an element array buffer is bound (which is the case in your code), the last argument is an offset into that buffer. To use the whole buffer, the offset is 0:
GL.DrawElements(PrimitiveType.Lines, indices.Length, DrawElementsType.UnsignedInt, 0);
You probably also want to remove this call:
GL.EnableClientState(ArrayCap.IndexArray);
This is not for enabling vertex indices, but for color indices. This would be for color index mode, which is a very obsolete feature.
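For reference, the corrected draw path might look like this (a sketch based on the question's code; note also that DrawArrays takes a vertex count, so coords.Length / 3 rather than coords.Length is assumed here):

```csharp
// Sketch: assumes the buffers and vertex pointer were set up as in the question,
// with ArrayCap.VertexArray still enabled.
GL.BindBuffer(BufferTarget.ArrayBuffer, bufferPtrs[0]);
GL.VertexPointer(3, VertexPointerType.Float, 0, IntPtr.Zero);

// draw the vertices (the count is the number of vertices, not floats)
GL.Color3(Color.Blue);
GL.DrawArrays(PrimitiveType.Points, 0, coords.Length / 3);

// draw the edges; with an element array buffer bound, the last argument
// is a byte offset into that buffer, so 0 means "start of the buffer"
GL.BindBuffer(BufferTarget.ElementArrayBuffer, bufferPtrs[1]);
GL.Color3(Color.Red);
GL.DrawElements(PrimitiveType.Lines, indices.Length, DrawElementsType.UnsignedInt, 0);
```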
I'm trying to adjust the RGB values to essentially darken or lighten a picture in my image processor project. It would be nice to incorporate both darken and lighten on one TrackBar control, with the neutral, unchanged image being when the slider is set to 0 in the middle.
However, I'm currently still trying to figure out how to do just one feature, for example, darkening, on one TrackBar.
After putting a TrackBar control (Min 0, Max 10) in my project I use the trackbar_scroll event to detect when the TrackBar is scrolled.
I've also coded it to darken the image by subtracting a certain byte value from RGB of each pixel of the image.
The scrollbar successfully darkens the image when you slide it right, however I want to also un-darken it when you slide the TrackBar to the left back to its original position.
private void trbBrightness_Scroll(object sender, EventArgs e)
{
    if (bitmapImage == null) return;
    Byte Red, Green, Blue;
    int iWidth = 320;
    int iHeight = 240;
    for (int i = 0; i < iWidth; i++)
    {
        for (int j = 0; j < iHeight; j++)
        {
            Color pixel = ImageArray[i, j];
            Red = pixel.R;
            Green = pixel.G;
            Blue = pixel.B;
            Color newColor =
                Color.FromArgb(255,
                    Red - Convert.ToByte(Red * (trbBrightness.Value * 0.1)),
                    Green - Convert.ToByte(Green * (trbBrightness.Value * 0.1)),
                    Blue - Convert.ToByte(Blue * (trbBrightness.Value * 0.1)));
            ImageArray[i, j] = newColor;
        }
    }
}
Right now it just darkens the image like I want it to; however, I expect the slider, when slid left, to basically undo the darkening that was applied when sliding right.
Is there a way to detect when the value of the slider is increasing, do this, and when the slider value is decreasing, do this?
I'm assuming I'd have to somehow store the old trackbar value so I can compare it with the new one?
Where would I put something like:
if (trackbar value is increasing)
{
}
else if (trackbar value is decreasing)
{
}
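For reference, detecting the direction is just a matter of caching the previous value in a field and comparing, something like this sketch (prevValue is a hypothetical field; the answer below avoids needing this at all by always re-applying the adjustment to the original image):

```csharp
private int prevValue = 0; // hypothetical field caching the last TrackBar value

private void trbBrightness_Scroll(object sender, EventArgs e)
{
    if (trbBrightness.Value > prevValue)
    {
        // trackbar value is increasing
    }
    else if (trbBrightness.Value < prevValue)
    {
        // trackbar value is decreasing
    }
    prevValue = trbBrightness.Value;
}
```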
You just need to keep the original Bitmap safe and apply a ColorMatrix, through an ImageAttributes object, to the original bitmap when you adjust the values.
Of course, you don't adjust what has already been adjusted, otherwise, you'll never get back to the original Brightness/Contrast values of your source bitmap.
Set the new values to the original Bitmap, then show only the results, preserving the original.
The Contrast component can be set to a range of -1.00f to 2.00f.
When the Contrast value goes below 0, you are inverting the colors, making a negative image. You need to decide whether to allow this behavior; otherwise, you can limit the Contrast to a minimum of 0.00f, which, applied to all color components, generates a gray blob (no contrast at all).
The Brightness component can be set to a range of -1.00f to +1.00f. Above and below that you get all-white and all-black results.
Using the identity Matrix as reference:
C = Contrast = 1 : B = Brightness = 0
C, 0, 0, 0, 0 1, 0, 0, 0, 0
0, C, 0, 0, 0 0, 1, 0, 0, 0
0, 0, C, 0, 0 0, 0, 1, 0, 0
0, 0, 0, 1, 0 0, 0, 0, 1, 0
B, B, B, 1, 1 0, 0, 0, 1, 1
(If you are used to the math definition of a Matrix or the way other platforms define it, you may think it's wrong. This is just the way a ColorMatrix is defined in the .Net/GDI dialect).
Below is a sample of the code needed to adjust the Brightness and Contrast of a Bitmap, using a ColorMatrix and a standard PictureBox control to present the results.
Procedure:
Assign an Image to a PictureBox control, then assign the same Image to a Bitmap object, here a field named adjustBitmap:
Bitmap adjustBitmap = null;
Somewhere (Form.Load(), maybe), assign an Image to a PictureBox and a copy of the same Image to the adjustBitmap field, which will preserve the original Image color values.
Note: Dispose() of the adjustBitmap object in the Form.FormClosed event.
Add 2 TrackBar controls, one to adjust the Brightness and one for the Contrast (named trkContrast and trkBrightness here).
The Brightness trackbar will have: Minimum = -100, Maximum = 100, Value = 0
The Contrast trackbar will have: Minimum = -100, Maximum = 200, Value = 100
Subscribe to and assign to both the same event handler for the Scroll event.
The handler code calls the method responsible for adjusting the Bitmap's Brightness and Contrast, using the current values of the 2 TrackBar controls and the reference of the original Bitmap:
// Somewhere... assign the Bitmap to be altered to the Field
adjustBitmap = [Some Bitmap];
// Use a copy of the above as the PictureBox image
pictureBox1.Image = [A copy of the above];
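Concretely, that setup step could look like this sketch (the file path and handler names are assumptions):

```csharp
private void Form1_Load(object sender, EventArgs e)
{
    // keep the original, untouched bitmap in the field...
    adjustBitmap = new Bitmap(@"C:\path\to\image.png"); // hypothetical path
    // ...and show a copy, so the original color values are preserved
    pictureBox1.Image = new Bitmap(adjustBitmap);
}

private void Form1_FormClosed(object sender, FormClosedEventArgs e)
{
    adjustBitmap?.Dispose();
}
```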
private void trackBar_Scroll(object sender, EventArgs e)
{
    pictureBox1.Image?.Dispose();
    pictureBox1.Image = AdjustBrightnessContrast(adjustBitmap, trkContrast.Value, trkBrightness.Value);
}
The main method converts the int values of the TrackBars to floats in the ranges previously described and assigns them to the Matrix array.
The new ColorMatrix is then assigned to an ImageAttributes object, which is used as a parameter in the Graphics.DrawImage overload that accepts an ImageAttributes.
using System.Drawing;
using System.Drawing.Imaging;
public Bitmap AdjustBrightnessContrast(Image image, int contrastValue, int brightnessValue)
{
    float brightness = -(brightnessValue / 100.0f);
    float contrast = contrastValue / 100.0f;
    var bitmap = new Bitmap(image.Width, image.Height, PixelFormat.Format32bppArgb);
    using (var g = Graphics.FromImage(bitmap))
    using (var attributes = new ImageAttributes())
    {
        float[][] matrix = {
            new float[] { contrast, 0, 0, 0, 0 },
            new float[] { 0, contrast, 0, 0, 0 },
            new float[] { 0, 0, contrast, 0, 0 },
            new float[] { 0, 0, 0, 1, 0 },
            new float[] { brightness, brightness, brightness, 1, 1 }
        };
        ColorMatrix colorMatrix = new ColorMatrix(matrix);
        attributes.SetColorMatrix(colorMatrix);
        g.DrawImage(image, new Rectangle(0, 0, bitmap.Width, bitmap.Height),
            0, 0, bitmap.Width, bitmap.Height, GraphicsUnit.Pixel, attributes);
        return bitmap;
    }
}
I'm making 2D games in OpenTK (a C# wrapper for OpenGL 4), and all was well except for jagged edges of polygons and things jumping and stuttering instead of moving smoothly - so I'm trying to add in multisampling to antialias my textures.
My setup has several Cameras which render all their scene objects onto a FrameBufferObject texture (I would like this to be MSAA), which are then all drawn to the screen (no multisampling needed), one on top of the other.
Without multisampling it worked fine, but when I change all my Texture2D calls to Texture2DMultisample etc. I get FBO Not Complete errors and it draws wrong. I believe I need to change my shaders too, but I want to solve this first.
The code below references a few classes like Texture that I've made, but I don't think that should impact this, and I don't want to clutter the post - will give more details if needed.
I set up the FBO for each camera with:
private void SetUpFBOTex()
{
    _frameBufferTexture = new Texture(GL.GenTexture(), Window.W, Window.H);
    GL.BindTexture(TextureTarget.Texture2DMultisample, _frameBufferTexture.ID);
    GL.TexImage2DMultisample(TextureTargetMultisample.Texture2DMultisample, 0, PixelInternalFormat.Rgba8, Window.W, Window.H, true);
    _frameBufferID = GL.GenFramebuffer();
}
and draw with:
public void Render(Matrix4 matrix)
{
    GL.Enable(EnableCap.Multisample);
    //Bind FBO to be the draw destination and clear it
    GL.BindFramebuffer(FramebufferTarget.Framebuffer, _frameBufferID);
    GL.FramebufferTexture2D(FramebufferTarget.Framebuffer, FramebufferAttachment.ColorAttachment0, TextureTarget.Texture2DMultisample, _frameBufferTexture.ID, 0);
    GL.ClearColor(new Color4(0, 0, 0, 0));
    GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);
    //draw stuff here
    foreach (Layer l in Layers)
        l.Render(Matrix);
    GL.BindFramebuffer(FramebufferTarget.Framebuffer, 0);
    //Bind the FBO to be drawn
    _frameBufferTexture.Bind();
    //Translate to camera window position
    Matrix4 fbomatrix = matrix * Matrix4.CreateTranslation(_window.x, _window.y, 0) * FBOMatrix;
    //Bind shader
    shader.Bind(ref fbomatrix, DrawType);
    //Some OpenGL setup nonsense, binding vertices and index buffer and telling OpenGL where in the vertex struct things are, pointers &c
    GL.BindBuffer(BufferTarget.ArrayBuffer, vertexBuffer);
    GL.EnableVertexAttribArray(shader.LocationPosition);
    GL.VertexAttribPointer(shader.LocationPosition, 2, VertexAttribPointerType.Float, false, Stride, 0);
    if (shader.LocationTexture != -1)
    {
        GL.EnableVertexAttribArray(shader.LocationTexture);
        GL.VertexAttribPointer(shader.LocationTexture, 2, VertexAttribPointerType.Float, false, Stride, 8);
    }
    GL.EnableVertexAttribArray(shader.LocationColour);
    GL.VertexAttribPointer(shader.LocationColour, 4, VertexAttribPointerType.UnsignedByte, true, Stride, 16);
    GL.BindBuffer(BufferTarget.ElementArrayBuffer, indexBuffer);
    //Draw the damn quad
    GL.DrawArrays(DrawType, 0, Vertices.Length);
    //Cleanup
    GL.DisableVertexAttribArray(shader.LocationPosition);
    if (shader.LocationTexture != -1)
        GL.DisableVertexAttribArray(shader.LocationTexture);
    GL.DisableVertexAttribArray(shader.LocationColour);
}
OK, @Andon gets credit for this - if you write it as an answer I'll mark it as the solution. I was indeed doing antialiasing with 0 samples!
I'm posting the working antialiased drawing to multiple FBOS code for future OpenTK googlers.
private void SetUpFBOTex()
{
    _frameBufferTexture = new Texture(GL.GenTexture(), Window.W, Window.H);
    GL.BindTexture(TextureTarget.Texture2DMultisample, _frameBufferTexture.ID);
    GL.TexImage2DMultisample(TextureTargetMultisample.Texture2DMultisample, 8, PixelInternalFormat.Rgba8, Window.W, Window.H, false);
    _frameBufferID = GL.GenFramebuffer();
}
public void Render(Matrix4 matrix)
{
    //Bind FBO to be the draw destination and clear it
    GL.BindFramebuffer(FramebufferTarget.Framebuffer, _frameBufferID);
    GL.FramebufferTexture2D(FramebufferTarget.Framebuffer, FramebufferAttachment.ColorAttachment0, TextureTarget.Texture2DMultisample, _frameBufferTexture.ID, 0);
    GL.ClearColor(new Color4(0, 0, 0, 0));
    GL.Clear(ClearBufferMask.ColorBufferBit);
    //draw stuff here
    foreach (Layer l in Layers)
        l.Render(Matrix);
    //unbind FBO to allow drawing to screen again
    GL.BindFramebuffer(FramebufferTarget.Framebuffer, 0);
    //Bind the FBO to be drawn
    GL.BindTexture(TextureTarget.Texture2DMultisample, _frameBufferTexture.ID);
    //Translate to camera window position
    Matrix4 fbomatrix = matrix * Matrix4.CreateTranslation(_window.x, _window.y, 0) * FBOMatrix;
    //Rotate camera FBO texture
    if (_rotationAngle != 0f)
    {
        fbomatrix = Matrix4.CreateTranslation(RotationCentre.x, RotationCentre.y, 0) * fbomatrix;
        fbomatrix = Matrix4.CreateRotationZ(_rotationAngle) * fbomatrix;
        fbomatrix = Matrix4.CreateTranslation(-RotationCentre.x, -RotationCentre.y, 0) * fbomatrix;
    }
    shader.Bind(ref fbomatrix, DrawType);
    //Some OpenGL setup nonsense, binding vertices and index buffer and telling OpenGL where in the vertex struct things are, pointers &c
    GL.BindBuffer(BufferTarget.ArrayBuffer, vertexBuffer);
    GL.EnableVertexAttribArray(shader.LocationPosition);
    GL.VertexAttribPointer(shader.LocationPosition, 2, VertexAttribPointerType.Float, false, Stride, 0);
    if (shader.LocationTexture != -1)
    {
        GL.EnableVertexAttribArray(shader.LocationTexture);
        GL.VertexAttribPointer(shader.LocationTexture, 2, VertexAttribPointerType.Float, false, Stride, 8);
    }
    GL.EnableVertexAttribArray(shader.LocationColour);
    GL.VertexAttribPointer(shader.LocationColour, 4, VertexAttribPointerType.UnsignedByte, true, Stride, 16);
    GL.BindBuffer(BufferTarget.ElementArrayBuffer, indexBuffer);
    //Draw the damn quad
    GL.DrawArrays(DrawType, 0, Vertices.Length);
    //Cleanup
    GL.DisableVertexAttribArray(shader.LocationPosition);
    if (shader.LocationTexture != -1)
        GL.DisableVertexAttribArray(shader.LocationTexture);
    GL.DisableVertexAttribArray(shader.LocationColour);
}
I have a wrapper class to control Shader code, here's the bind call:
internal void Bind(ref Matrix4 matrixMVP)
{
    //Set this shader as active shader
    GL.UseProgram(programID);
    //Load position matrix into vertex shaders
    GL.UniformMatrix4(LocationMVPMatrix, false, ref matrixMVP);
    //Load active texture into fragment shaders
    GL.Uniform1(LocationSampler, 0);
}
Fragment shader:
/// <summary>
/// Test for a Multisampled fragment shader - http://www.opentk.com/node/2251
/// </summary>
public const string fragmentShaderTestSrc =
@"
#version 330

uniform sampler2DMS Sampler;

in vec2 InTexture;
in vec4 OutColour;

out vec4 OutFragColor;

int samples = 8; // must match the sample count given to TexImage2DMultisample
float div = 1.0 / samples;

void main()
{
    OutFragColor = vec4(0.0);
    ivec2 texcoord = ivec2(textureSize(Sampler) * InTexture); // used to fetch msaa texel location
    for (int i = 0; i < samples; i++)
    {
        OutFragColor += texelFetch(Sampler, texcoord, i) * OutColour; // add color samples together
    }
    OutFragColor *= div; // divide by num of samples to get color avg.
}
";
Vertex shader:
/// <summary>
/// Default vertex shader that only applies specified matrix transformation
/// </summary>
public const string vertexShaderDefaultSrc =
@"
#version 330

uniform mat4 MVPMatrix;

layout (location = 0) in vec2 Position;
layout (location = 1) in vec2 Texture;
layout (location = 2) in vec4 Colour;

out vec2 InVTexture;
out vec4 vFragColorVs;

void main()
{
    gl_Position = MVPMatrix * vec4(Position, 0, 1);
    InVTexture = Texture;
    vFragColorVs = Colour;
}";
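As an aside, instead of averaging the samples manually in a sampler2DMS shader, a multisampled FBO can also be resolved into an ordinary single-sample texture with a framebuffer blit and then drawn with a plain sampler2D shader. A minimal sketch, assuming resolveFBO is a second, non-multisampled FBO with a regular Texture2D color attachment (both hypothetical names):

```csharp
// Resolve the multisampled FBO into a regular single-sample FBO; the blit
// averages the samples. resolveFBO's color attachment can then be drawn
// with an ordinary sampler2D shader.
GL.BindFramebuffer(FramebufferTarget.ReadFramebuffer, _frameBufferID);
GL.BindFramebuffer(FramebufferTarget.DrawFramebuffer, resolveFBO);
GL.BlitFramebuffer(0, 0, Window.W, Window.H,
                   0, 0, Window.W, Window.H,
                   ClearBufferMask.ColorBufferBit, BlitFramebufferFilter.Nearest);
GL.BindFramebuffer(FramebufferTarget.Framebuffer, 0);
```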
I want to draw a wireframe of a 3D object using OpenTK,
but I don't want to draw it with the OpenTK method; I want to do it my own way. I guess I have to draw into the OpenTK buffer, but I don't know how to start. I mean, I have a method to draw, but how do I override the drawing from OpenTK?
// Push current Array Buffer state so we can restore it later
GL.PushClientAttrib(ClientAttribMask.ClientVertexArrayBit);
GL.Color3(Color.SkyBlue);
//GL.ClientActiveTexture(TextureUnit.Texture0);
//GL.BindTexture(TextureTarget.Texture2D, tex);
GL.BindBuffer(BufferTarget.ArrayBuffer, dataBuffer);
// Normal buffer
GL.NormalPointer(NormalPointerType.Float, 0, (IntPtr)(normOffset * sizeof(float)));
// TexCoord buffer
GL.TexCoordPointer(2, TexCoordPointerType.Float, 0, (IntPtr)(texcoordOffset * sizeof(float)));
// Vertex buffer
GL.VertexPointer(3, VertexPointerType.Float, 0, (IntPtr)(vertOffset * sizeof(float)));
// Index array
GL.BindBuffer(BufferTarget.ElementArrayBuffer, indexBuffer);
GL.DrawElements(BeginMode.Triangles, mesh.Tris.Length * 3, DrawElementsType.UnsignedInt, IntPtr.Zero);
// GL.DrawElements(BeginMode.Lines, mesh.Tris.Length * 3, DrawElementsType.UnsignedByte, IntPtr.Zero);
// Restore the state
GL.PopClientAttrib();
As of now I'm drawing my debug performance graphs with 1px-wide rectangles stretched to the necessary height, but drawing a lot of data this way causes a significant performance loss.
Currently the logic is: collect all timings for current frame, place them into the Queue<float>s and draw a graph for each queue by drawing 300 stretched 1px sprites. There are 4 graphs, so it's 1200 sprites in debug overlay alone, which is resource consuming.
Is there a better way to draw graphs that at least won't require drawing so many sprites?
Line List
You could use VertexPositionColor arrays to store individual graph values, then use GraphicsDevice.DrawUserIndexedPrimitives<VertexPositionColor> together with a defined line list (indices) to draw them with orthographic projection.
I'm drawing these samples (4 graphs with 300 value points/pixels each) at 60fps.
Triangle Strip
If you need to fill the graphs below the line, you could draw a triangle strip instead (with points at bottom of the graph).
Line List Code
Here is the relevant code for the first graph rendered above:
Matrix worldMatrix;
Matrix viewMatrix;
Matrix projectionMatrix;
BasicEffect basicEffect;
VertexPositionColor[] pointList;
short[] lineListIndices;
protected override void Initialize()
{
    int n = 300;
    // generate a sample sine-wave graph; height and minY are fields
    // describing the plot area, implementation irrelevant
    pointList = new VertexPositionColor[n];
    for (int i = 0; i < n; i++)
        pointList[i] = new VertexPositionColor() { Position = new Vector3(i, (float)(Math.Sin(i / 15.0) * height / 2.0 + height / 2.0 + minY), 0), Color = Color.Blue };

    // link the points into a line list: one index pair per segment
    lineListIndices = new short[(n * 2) - 2];
    for (int i = 0; i < n - 1; i++)
    {
        lineListIndices[i * 2] = (short)i;
        lineListIndices[(i * 2) + 1] = (short)(i + 1);
    }

    worldMatrix = Matrix.Identity;
    viewMatrix = Matrix.CreateLookAt(new Vector3(0.0f, 0.0f, 1.0f), Vector3.Zero, Vector3.Up);
    projectionMatrix = Matrix.CreateOrthographicOffCenter(0, (float)GraphicsDevice.Viewport.Width, (float)GraphicsDevice.Viewport.Height, 0, 1.0f, 1000.0f);

    basicEffect = new BasicEffect(graphics.GraphicsDevice);
    basicEffect.World = worldMatrix;
    basicEffect.View = viewMatrix;
    basicEffect.Projection = projectionMatrix;
    basicEffect.VertexColorEnabled = true; // important for color

    base.Initialize();
}
To draw it:
foreach (EffectPass pass in basicEffect.CurrentTechnique.Passes)
{
    pass.Apply();
    GraphicsDevice.DrawUserIndexedPrimitives<VertexPositionColor>(
        PrimitiveType.LineList,
        pointList,
        0,
        pointList.Length,
        lineListIndices,
        0,
        pointList.Length - 1
    );
}
For the triangle-strip graph, modify the code to draw a triangle strip and, for each point on the graph curve, add a matching point at the bottom of the graph.
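That variant could be sketched like this (graphBottom is an assumed field giving the y coordinate of the graph baseline):

```csharp
// Build a strip alternating curve point / baseline point, so each
// graph sample contributes one vertical "slice" of the filled area.
VertexPositionColor[] stripList = new VertexPositionColor[n * 2];
for (int i = 0; i < n; i++)
{
    stripList[i * 2] = pointList[i]; // point on the curve
    stripList[(i * 2) + 1] = new VertexPositionColor(
        new Vector3(pointList[i].Position.X, graphBottom, 0), Color.Blue);
}

// 2n vertices in a strip yield 2n - 2 triangles; no index buffer needed:
// GraphicsDevice.DrawUserPrimitives(PrimitiveType.TriangleStrip, stripList, 0, (n * 2) - 2);
```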
In my application I have a problem with passing my world, view and projection matrices to the shader. I have set up a little engine to perform those tasks, and I am using PIX and Visual Studio to debug what I get as output.
First, I post the code that relates to the vertices and indices:
Rendering.Geometry.MeshGeometry<uint> geom = Rendering.Geometry.MeshGeometry<uint>.Create(device);
var elem = Rendering.Geometry.VertexElement.CreatePosition3D(device);
float[] vertices = new float[9]
{
0, 0, -3,
0, 0, 3,
0, 5, 0,
};
elem.DataStream.WriteRange(vertices);
geom.AddVertexElement(elem);
var triangle = geom.Triangles.AddFace();
triangle.P1 = 0;
triangle.P2 = 1;
triangle.P3 = 2;
The geometry seems to be correct, because when I debug my draw call in PIX I get the correct values for the vertices (0/0/-3)/(0/0/3)/(0/5/0), so I think the index buffer, vertex buffer, input layout and polygon topology are all set up correctly.
Now, PIX has that interesting Pre-VS / Post-VS view. Pre-VS, as I said, everything looks fine; the vertices are correct and in the right order. When I go to Post-VS and debug a vertex, I end up in my shader, where I can step through the instructions.
What is not correct are the matrices passed to it in the constant buffer. Here is my shader:
cbuffer MatrixBuffer
{
    float4x4 worldMatrix;
    float4x4 viewMatrix;
    float4x4 projectionMatrix;
};

struct VertexInputType
{
    float4 position : POSITION;
};

struct PixelInputType
{
    float4 position : SV_POSITION;
};

PixelInputType BasicEffectVS(VertexInputType input)
{
    PixelInputType output = (PixelInputType)0;
    float4x4 worldViewProj = worldMatrix * viewMatrix * projectionMatrix;
    output.position = mul(input.position, worldViewProj);
    output.position.w = 1.0f;
    return output;
}
When I look at the three matrices in PIX, I see that the viewMatrix and projectionMatrix have completely wrong values (they even contain NaN); only the worldMatrix is correct. The way I set the matrices in my application is the following:
basicEffect.WorldMatrix = SlimDX.Matrix.Identity;
basicEffect.ViewMatrix = SlimDX.Matrix.Transpose(SlimDX.Matrix.LookAtLH(new SlimDX.Vector3(20, 5, 0), new SlimDX.Vector3(0, 5, 0), new SlimDX.Vector3(0, 1, 0)));
basicEffect.ProjectionMatrix = SlimDX.Matrix.Transpose(SlimDX.Matrix.PerspectiveFovLH((float)Math.PI / 4, ((float)f.ClientSize.Width / f.ClientSize.Height), 1.0f, 100.0f));
Debugging them in VS gives me the correct values. I then follow the SetValue call on the shader until I get to the actual writing of bytes. Everything is fine there!
The buffer is created the following way:
holder.buffer = new SlimDX.Direct3D11.Buffer(mShader.Device, new BufferDescription()
{
    BindFlags = BindFlags.ConstantBuffer,
    SizeInBytes = buffer.Description.Size,
    Usage = ResourceUsage.Dynamic,
    CpuAccessFlags = CpuAccessFlags.Write
});
Even worse:
If I add another matrix parameter to my shader and set a hardcoded matrix for that value, like:
Matrix mat = new Matrix()
{
    M11 = 1,
    M12 = 2,
    ...
};
then in PIX I get exactly the values I expect, so my function that sets values on the shader must be right.
Does anyone have an idea where this comes from?
Make sure you remove this line:
output.position.w = 1.0f;
This sets the projective component; since you already multiplied by your projection matrix, you need to send the position as it is to the pixel shader.
Also, I would be quite careful with all the transposes; I'm not sure they are really needed.
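As an aside, note that in HLSL the * operator on two matrices multiplies them component-wise; an actual matrix product needs mul(). A more conventional version of the vertex shader body would be something like this sketch (using the question's names):

```hlsl
PixelInputType BasicEffectVS(VertexInputType input)
{
    PixelInputType output = (PixelInputType)0;
    // mul() performs the matrix product; '*' on two float4x4s is component-wise
    float4x4 worldViewProj = mul(mul(worldMatrix, viewMatrix), projectionMatrix);
    output.position = mul(input.position, worldViewProj);
    // no manual write to output.position.w - keep the projective component
    return output;
}
```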