My app displays an image full screen using the OpenGL shader code shown below. The vertex shader I'm using was copied from somewhere. Can anyone explain why it uses vTexCoord = (a_position.xy+1)/2;? When I try vTexCoord = a_position.xy, my OpenGL output is split into four rectangles and only the top-right one shows the image; the other three appear blurred. What change do I need to make for it to work with vTexCoord = a_position.xy?
The important functions used in the project are shown below. Please check them and help me correct the problem.
float[] vertices = {
// Left bottom triangle
-1f, -1f, 0f,
1f, -1f, 0f,
1f, 1f, 0f,
// Right top triangle
1f, 1f, 0f,
-1f, 1f, 0f,
-1f, -1f, 0f
};
private void CreateShaders()
{
/***********Vert Shader********************/
vertShader = GL.CreateShader(ShaderType.VertexShader);
GL.ShaderSource(vertShader, @"attribute vec3 a_position;
varying vec2 vTexCoord;
void main() {
vTexCoord = (a_position.xy+1)/2;
gl_Position = vec4(a_position, 1);
}");
GL.CompileShader(vertShader);
/***********Frag Shader ****************/
fragShader = GL.CreateShader(ShaderType.FragmentShader);
GL.ShaderSource(fragShader, @"precision highp float;
uniform sampler2D sTexture;varying vec2 vTexCoord;
void main ()
{
vec4 color= texture2D (sTexture, vTexCoord);
gl_FragColor =color;
}");
GL.CompileShader(fragShader);
}
private void InitBuffers()
{
buffer = GL.GenBuffer();
positionLocation = GL.GetAttribLocation(program, "a_position");
positionLocation1 = GL.GetUniformLocation(program, "sTexture");
GL.EnableVertexAttribArray(positionLocation);
GL.BindBuffer(BufferTarget.ArrayBuffer, buffer);
GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)(vertices.Length * sizeof(float)), vertices, BufferUsageHint.StaticDraw);
GL.VertexAttribPointer(positionLocation, 3, VertexAttribPointerType.Float, false, 0, 0);
}
public void DrawImage(int image)
{
GL.Viewport(new Rectangle(0, 0, ScreenWidth, ScreenHeight));
GL.MatrixMode(MatrixMode.Projection);
GL.PushMatrix();
GL.LoadIdentity();
//GL.Ortho(0, 1920, 0, 1080, 0, 1);
GL.MatrixMode(MatrixMode.Modelview);
GL.PushMatrix();
GL.LoadIdentity();
GL.Disable(EnableCap.Lighting);
GL.Enable(EnableCap.Texture2D);
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, image);
GL.Uniform1(positionLocation1, 0);
GL.Begin(PrimitiveType.Quads);
GL.TexCoord2(0, 1);
GL.Vertex3(0, 0, 0);
GL.TexCoord2(0, 0);
GL.Vertex3(1920, 0, 0);
GL.TexCoord2(1, 1);
GL.Vertex3(1920, 1080, 0);
GL.TexCoord2(1, 0);
GL.Vertex3(0, 1080, 0);
GL.End();
RunShaders();
GL.Disable(EnableCap.Texture2D);
GL.PopMatrix();
GL.MatrixMode(MatrixMode.Projection);
GL.PopMatrix();
GL.MatrixMode(MatrixMode.Modelview);
glControl1.SwapBuffers();
}
private void RunShaders()
{
GL.UseProgram(program);
GL.DrawArrays(PrimitiveType.Triangles, 0, vertices.Length / 3);
}
Can anyone explain why vTexCoord = (a_position.xy+1)/2; is used here?
The vertex coordinates in your example are in the range [-1, 1] for both the x and the y component. This matches normalized device space: a cube from the left-lower-front corner (-1, -1, -1) to the right-top-back corner (1, 1, 1). This volume is what is "visible", and it is mapped to the viewport.
As a result, the vertex coordinates in your example form a rectangle that covers the entire viewport.
If the texture coordinates are meant to map the entire texture onto the quad, then the texture coordinates (u, v) have to be in the range [0, 1]: (0, 0) is the lower left of the texture and (1, 1) the upper right.
See also How do OpenGL texture coordinates work?
So the x and y components of a_position have to be mapped from the range [-1, 1] to the range [0, 1] to be used as uv coordinates for the texture lookup:
u = (a_position.x + 1) / 2
v = (a_position.y + 1) / 2
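This is exactly the mapping the copied shader line performs. For what it's worth, the same thing can be written as a single multiply-add in the vertex shader:
vTexCoord = a_position.xy * 0.5 + 0.5; // identical to (a_position.xy + 1.0) / 2.0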
What change should I make for it to work with vTexCoord = a_position.xy?
This is not possible with the position attribute alone, but you can add a separate texture coordinate attribute, which is the common approach:
attribute vec3 a_position;
attribute vec2 a_texture;
varying vec2 vTexCoord;
void main() {
vTexCoord = a_texture;
gl_Position = vec4(a_position, 1);
}
float[] vertices = {
// x y z u v
// Left bottom triangle
-1f, -1f, 0f, 0f, 0f,
1f, -1f, 0f, 1f, 0f,
1f, 1f, 0f, 1f, 1f,
// Right top triangle
1f, 1f, 0f, 1f, 1f,
-1f, 1f, 0f, 0f, 1f,
-1f, -1f, 0f, 0f, 0f
};
buffer = GL.GenBuffer();
GL.BindBuffer(BufferTarget.ArrayBuffer, buffer);
GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)(vertices.Length * sizeof(float)), vertices, BufferUsageHint.StaticDraw);
positionLocation = GL.GetAttribLocation(program, "a_position");
textureLocation = GL.GetAttribLocation(program, "a_texture");
GL.EnableVertexAttribArray(positionLocation);
GL.EnableVertexAttribArray(textureLocation);
int stride = sizeof(float) * 5; // 5 because of (x, y, z, u, v)
int offsetUV = sizeof(float) * 3; // 3 because the u and v coordinates are the 4th and 5th components
GL.VertexAttribPointer(positionLocation, 3, VertexAttribPointerType.Float, false, stride, 0);
GL.VertexAttribPointer(textureLocation, 2, VertexAttribPointerType.Float, false, stride, (IntPtr)(offsetUV));
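With that buffer layout in place, the draw call itself is unchanged. A minimal usage sketch, assuming program has been linked and the texture bound to unit 0 as in the question (positionLocation1 is the sampler uniform location from the original code):
GL.UseProgram(program);
GL.Uniform1(positionLocation1, 0); // sampler sTexture reads from texture unit 0
GL.DrawArrays(PrimitiveType.Triangles, 0, 6); // 6 vertices = 2 triangles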
Related
I've been looking at various examples of texturing in OpenTK; however, little to no code uses the approach I want, or it requires a lot of pointless workarounds that don't fit my needs. I am simply trying to draw images in OpenTK without their UVs being distorted or malformed. Or rather: how do I deform them to fit the primitive (in this case a quad/square) wherever it is positioned in the 2D world?
Consider this image (It's my texture I'm trying to fit inside a quad primitive):
This is the unwanted result. As you can see, it is cropped. I don't care about the wrapping, because I plan to fit the whole image inside the square (no aspect ratio needed). Different wrapping settings did nothing; the image's center is still outside the square.
The transparency and palette are mine to worry about, I only need help fitting the whole image inside the square!
This is my code for loading textures:
public Texture(Bitmap image)
{
ID = GL.GenTexture();
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, ID);
BitmapData data = image.LockBits(new Rectangle(0, 0, image.Width, image.Height), ImageLockMode.ReadOnly, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba, image.Width, image.Height, 0, OpenTK.Graphics.OpenGL.PixelFormat.Bgra, PixelType.UnsignedByte, data.Scan0);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Linear);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Linear);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapS, (int)TextureWrapMode.Repeat);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureWrapT, (int)TextureWrapMode.Repeat);
GL.GenerateMipmap(GenerateMipmapTarget.Texture2D);
}
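(One small aside about this loader, unrelated to the UV problem: LockBits is never paired with UnlockBits, so the bitmap stays locked after the upload. Presumably the constructor should end with something like the line below.)
image.UnlockBits(data); // release the bitmap lock once the pixels are uploaded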
Then here's my code for loading the vertex data into the shaders and drawing the primitive:
List<float> vertex_data = new List<float> {
-.25f, -.25f,
-.25f, .25f,
.25f, .25f,
.25f, -.25f
};
// Load the 2D object
GL.UseProgram(program);
GL.BindBuffer(BufferTarget.ArrayBuffer, vbo);
GL.BufferData(BufferTarget.ArrayBuffer, vertex_data.Count * sizeof(float), vertex_data.ToArray(), BufferUsageHint.DynamicDraw);
GL.EnableVertexAttribArray(attr_pos);
GL.VertexAttribPointer(attr_pos, 2, VertexAttribPointerType.Float, false, 2 * sizeof(float), 0);
GL.EnableVertexAttribArray(attr_uv);
GL.VertexAttribPointer(attr_uv, 2, VertexAttribPointerType.Float, false, 2 * sizeof(float), 0);
// ^^ Using the same position data for the UV as well.
// ...
// Drawing the 2D object
GL.UseProgram(program);
GL.BindBuffer(BufferTarget.ArrayBuffer, vbo);
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, ID);
GL.DrawArrays(PrimitiveType.Quads, 0, 4);
And the vertex and fragment shaders:
vert:
#version 330 core
layout(location = 1) in vec3 vertex_pos;
layout(location = 2) in vec2 vertex_uv;
out vec2 uv;
uniform mat4 mvp;
void main() {
gl_Position = mvp * vec4(vertex_pos, 1);
uv = vertex_uv;
}
frag:
#version 330 core
in vec2 uv;
out vec4 color;
uniform sampler2D texture0;
void main() {
color = texture(texture0, uv);
}
All you need to do is specify texture coordinates in the range [0.0, 1.0]:
List<float> vertex_data = new List<float> {
// x y u v
-.25f, -.25f, 0.0f, 0.0f,
-.25f, .25f, 0.0f, 1.0f,
.25f, .25f, 1.0f, 1.0f,
.25f, -.25f, 1.0f, 0.0f,
};
// Load the 2D object
GL.UseProgram(program);
GL.BindBuffer(BufferTarget.ArrayBuffer, vbo);
GL.BufferData(BufferTarget.ArrayBuffer, vertex_data.Count * sizeof(float),
vertex_data.ToArray(), BufferUsageHint.DynamicDraw);
GL.EnableVertexAttribArray(attr_pos);
GL.VertexAttribPointer(attr_pos, 2, VertexAttribPointerType.Float, false,
4 * sizeof(float), 0);
GL.EnableVertexAttribArray(attr_uv);
GL.VertexAttribPointer(attr_uv, 2, VertexAttribPointerType.Float, false,
4 * sizeof(float), 2 * sizeof(float));
The texture coordinate (0, 0) addresses the bottom-left corner of the texture and the texture coordinate (1, 1) addresses the top-right corner.
You must associate the texture coordinate (0, 0) with the bottom left of the quad and the texture coordinate (1, 1) with the top right of the quad.
The 5th argument of GL.VertexAttribPointer (stride) specifies the byte offset between consecutive generic vertex attributes. Since each vertex consists of 4 elements of type float (x, y, u, v), this is 4*sizeof(float). The last argument is the byte offset of the attribute within a vertex: the offset of the vertex coordinates is 0 and the offset of the texture coordinates is 2*sizeof(float).
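As a worked example of that arithmetic (assuming 4-byte floats), the values passed above come out to:
int stride = 4 * sizeof(float); // 16 bytes from the start of one vertex to the next
int posOffset = 0; // x, y start at byte 0 of each vertex
int uvOffset = 2 * sizeof(float); // u, v start at byte 8 of each vertex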
Of course you can also use 2 separate attribute arrays. In that case the stride is 2*sizeof(float) for both the vertex coordinates and the texture coordinates. The offset of the vertex coordinates is 0, and the offset of the texture coordinates is the size of all the vertex coordinates (8*sizeof(float)):
Use GL.BufferSubData to initialize buffer data:
List<float> vertex_data = new List<float> {
// x y
-.25f, -.25f,
-.25f, .25f,
.25f, .25f,
.25f, -.25f,
};
List<float> texture_data = new List<float> {
// u v
0.0f, 0.0f,
0.0f, 1.0f,
1.0f, 1.0f,
1.0f, 0.0f,
};
// Load the 2D object
GL.UseProgram(program);
GL.BindBuffer(BufferTarget.ArrayBuffer, vbo);
GL.BufferData(BufferTarget.ArrayBuffer,
vertex_data.Count * sizeof(float) + texture_data.Count * sizeof(float),
IntPtr.Zero, BufferUsageHint.DynamicDraw);
GL.BufferSubData(BufferTarget.ArrayBuffer, (IntPtr)0,
vertex_data.Count * sizeof(float), vertex_data.ToArray());
GL.BufferSubData(BufferTarget.ArrayBuffer, (IntPtr)(vertex_data.Count * sizeof(float)),
texture_data.Count * sizeof(float), texture_data.ToArray());
GL.EnableVertexAttribArray(attr_pos);
GL.VertexAttribPointer(attr_pos, 2, VertexAttribPointerType.Float, false,
2 * sizeof(float), 0);
GL.EnableVertexAttribArray(attr_uv);
GL.VertexAttribPointer(attr_uv, 2, VertexAttribPointerType.Float, false,
2 * sizeof(float), vertex_data.Count * sizeof(float));
I'm having trouble setting up my camera in 3D space.
Here's my code:
private void SetupViewPort()
{
GL.Viewport(0, 0, glControl1.Width, glControl1.Height);
GL.MatrixMode(MatrixMode.Projection);
GL.LoadIdentity();
GL.Ortho(0, 1000,0,1000, 0, 1);
GL.MatrixMode(MatrixMode.Modelview);
GL.LoadIdentity();
Vector3d eyePos = new Vector3d(0, 0, 1);
Vector3d point = new Vector3d(500, 500, 0.01);
Vector3d up = new Vector3d(0, 0 , 1);
Matrix4d mat = Matrix4d.LookAt(eyePos, point, up);
//mat.Invert();
GL.LoadMatrix(ref mat);
}
I'm expecting to see shapes that I've drawn onto the 2D plane. But I get a blank screen every time.
Here's the code where my shapes are drawn:
private void glControl1_Paint(object sender, PaintEventArgs e)
{
if (!loaded)
return;
GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);
GL.Enable(EnableCap.DepthTest);
GL.DepthMask(true);
GL.ClearDepth(1.0);
GL.Color3(Color.Yellow);
GL.Begin(PrimitiveType.Triangles);
GL.Vertex2(0, 0);
GL.Vertex2(0, 600);
GL.Vertex2(600, 600);
GL.Vertex2(100, 100);
GL.Vertex2(50, 70);
GL.Vertex2(200, 100);
GL.End();
glControl1.SwapBuffers();
}
An orthographic projection matrix and a LookAt view matrix don't play well together here: the LookAt rotation moves your z = 0 geometry out of the fixed depth range of Ortho(0, 1000, 0, 1000, 0, 1), and the chosen up vector (0, 0, 1) is almost parallel to the viewing direction, which degenerates the matrix. Since you are drawing in 2D, just leave the LookAt matrix out and use the identity matrix.
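A minimal sketch of the corrected setup under that advice (identity modelview, orthographic projection only; glControl1 is the control from the question, and the near/far range is widened to [-1, 1] so the z = 0 geometry sits strictly inside the volume):
private void SetupViewPort()
{
GL.Viewport(0, 0, glControl1.Width, glControl1.Height);
GL.MatrixMode(MatrixMode.Projection);
GL.LoadIdentity();
GL.Ortho(0, 1000, 0, 1000, -1, 1); // left, right, bottom, top, near, far
GL.MatrixMode(MatrixMode.Modelview);
GL.LoadIdentity(); // no LookAt: the identity view is fine for 2D drawing
}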
I cannot render triangles for the life of me with a VBO in OpenTK. I load my data into the VBO in the glControl_Load() event, but I get only the background color, with no triangles, when running. The data comes from a mesh: m.OpenGLArrays(out data, out indices) outputs a list of floats and a list of ints. The float list holds the vertices T1v1, T1v2, T1v3, T2v1, T2v2, T2v3, ..., i.e. all three vertices of each triangle back to back.
However, while the VBO code gives me a blank screen, everything renders fine when I use the commented-out "immediate mode" code instead...??? What am I doing wrong?
private void glControl_Load(object sender, EventArgs e)
{
loaded = true;
glControl.MouseMove += new MouseEventHandler(glControl_MouseMove);
glControl.MouseWheel += new MouseEventHandler(glControl_MouseWheel);
GL.ClearColor(Color.DarkSlateGray);
GL.Color3(1f, 1f, 1f);
m.OpenGLArrays(out data, out indices);
this.indicesSize = (uint)indices.Length;
GL.GenBuffers(1, out VBOid[0]);
GL.GenBuffers(1, out VBOid[1]);
SetupViewport();
}
private void SetupViewport()
{
if (this.WindowState == FormWindowState.Minimized) return;
glControl.Width = this.Width - 32;
glControl.Height = this.Height - 80;
Frame_label.Location = new System.Drawing.Point(glControl.Width / 2, glControl.Height + 25);
GL.MatrixMode(MatrixMode.Projection);
//GL.LoadIdentity();
GL.Ortho(0, glControl.Width, 0, glControl.Height, -1, 1); // Bottom-left corner pixel has coordinate (0, 0)
GL.Viewport(0, 0, glControl.Width, glControl.Height); // Use all of the glControl painting area
GL.Enable(EnableCap.DepthTest);
GL.BindBuffer(BufferTarget.ArrayBuffer, VBOid[0]);
GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)(data.Length * sizeof(float)), data, BufferUsageHint.StaticDraw);
GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
float aspect_ratio = this.Width / (float)this.Height;
projection = Matrix4.CreatePerspectiveFieldOfView(MathHelper.PiOver4, aspect_ratio, 1, 1024);
GL.MatrixMode(MatrixMode.Projection);
GL.LoadMatrix(ref projection);
}
private void glControl_Paint(object sender, PaintEventArgs e)
{
if (loaded)
{
GL.Clear(ClearBufferMask.ColorBufferBit |
ClearBufferMask.DepthBufferBit |
ClearBufferMask.StencilBufferBit);
modelview = Matrix4.LookAt(0f, 0f, -200f + zoomFactor, 0, 0, 0, 0.0f, 1.0f, 0.0f);
var aspect_ratio = Width / (float)Height;
projection = Matrix4.CreatePerspectiveFieldOfView(MathHelper.PiOver4, aspect_ratio, 1, 512);
GL.MatrixMode(MatrixMode.Projection);
GL.LoadMatrix(ref projection);
GL.MatrixMode(MatrixMode.Modelview);
GL.LoadMatrix(ref modelview);
GL.Rotate(angleY, 1.0f, 0, 0);
GL.Rotate(angleX, 0, 1.0f, 0);
GL.EnableClientState(ArrayCap.VertexArray);
GL.BindBuffer(BufferTarget.ArrayBuffer, VBOid[0]);
GL.Color3(Color.Yellow);
GL.VertexPointer(3, VertexPointerType.Float, Vector3.SizeInBytes, new IntPtr(0));
GL.DrawArrays(PrimitiveType.Triangles, 0, data.Length);
GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
GL.DisableClientState(ArrayCap.VertexArray);
//GL.Color3(Color.Yellow);
//GL.PolygonMode(MaterialFace.Front, PolygonMode.Fill);
//GL.Begin(PrimitiveType.Triangles);
//for (int i = 0; i < this.md.mesh.Count; i++)
//{
// GL.Normal3(this.md.mesh[i].normal);
// GL.Vertex3(this.md.mesh[i].vertices[0]);
// GL.Vertex3(this.md.mesh[i].vertices[1]);
// GL.Vertex3(this.md.mesh[i].vertices[2]);
//}
//GL.End();
//GL.EndList();
glControl.SwapBuffers();
Frame_label.Text = "Frame: " + frameNum++;
}
}
If something doesn't seem right, then it probably isn't. I seriously questioned my understanding of OpenGL and spent hours looking at this. However, it was just a simple error: I forgot to increment a counter variable in the for loop that transfers the mesh from one object to another, so every triangle had identical vertices! Always expect the unexpected when it comes to debugging.
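As an aside, one more thing worth checking in the posted Paint handler: GL.DrawArrays takes a vertex count, not a float count, so with tightly packed XYZ data the call would presumably need to be:
GL.DrawArrays(PrimitiveType.Triangles, 0, data.Length / 3); // 3 floats per vertex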
Alright, I searched other people's questions and could not find a solution to my problem. I am using OpenTK in C# with GLSL 330. It is producing the error message
error c0000: syntax error, unexpected '?' at token '?'
For some reason it doesn't like something I'm doing. So here is my code; I hope someone can tell me what I'm doing wrong.
public static string vertexShaderSource = @"
#version 330
uniform mat4 pvm;
in vec4 Position;
in vec2 texCoord;
out vec2 texCoordV;
void main()
{
texCoordV = texCoord;
gl_Position = Position * pvm;
}";
public static string fragmentShaderSource = @"
#version 330
in vec2 texCoordV;
out vec4 colorOut;
void main()
{
colorOut = vec4(texCoord, 0.0, 0.0);
}";
public void Initalize()
{
style = GUI_Skin.styles[0];
vertices = new Vector3[6];
vertices[0] = new Vector3(0, 0, 0f);
vertices[1] = new Vector3(100, 0, 0f);
vertices[2] = new Vector3(0, 100, 0f);
vertices[3] = new Vector3(100, 0, 0f);
vertices[4] = new Vector3(0, 100, 0f);
vertices[5] = new Vector3(100, 100, 0f);
GL.GenBuffers(1, out vertHandle);
GL.BindBuffer(BufferTarget.ArrayBuffer, vertHandle);
GL.BufferData<Vector3>(BufferTarget.ArrayBuffer,
new IntPtr(vertices.Length * Vector3.SizeInBytes),
vertices, BufferUsageHint.StaticDraw);
texCoords = new Vector2[6];
texCoords[0] = new Vector2(0,0);
texCoords[1] = new Vector2(1, 0);
texCoords[2] = new Vector2(0, 1);
texCoords[3] = new Vector2(1, 0);
texCoords[4] = new Vector2(0, 1);
texCoords[5] = new Vector2(1, 1);
GL.GenBuffers(1, out texHandle);
GL.BindBuffer(BufferTarget.ArrayBuffer, texHandle);
GL.BufferData<Vector2>(BufferTarget.ArrayBuffer,
new IntPtr(texCoords.Length * Vector2.SizeInBytes),
texCoords, BufferUsageHint.StaticDraw);
}
public void Draw()
{
GL.EnableVertexAttribArray(vertHandle);
GL.BindBuffer(BufferTarget.ArrayBuffer, vertHandle);
GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, Vector3.SizeInBytes, 0);
GL.EnableVertexAttribArray(texHandle);
GL.BindBuffer(BufferTarget.ArrayBuffer, texHandle);
GL.VertexAttribPointer(0, 2, VertexAttribPointerType.Float, false, Vector2.SizeInBytes, 0);
GL.DrawArrays(PrimitiveType.Triangles, 0, 6);
GL.DisableVertexAttribArray(vertHandle);
GL.DisableVertexAttribArray(texHandle);
}
Alright, the issues have been fixed, thanks to the helpful comments above.
Let's start with the shader. The @ verbatim-string prefix had to be removed and a \n inserted after every line. Also, I was telling GL to transpose the matrix when drawing with the shader; this could be fixed by changing the order of the multiplication instead.
public static void Run()
{
int uniformLocation = GL.GetUniformLocation(shaderProgramHandle, "pvm");
Matrix4 mat;
GL.GetFloat(GetPName.ProjectionMatrix, out mat);
GL.UniformMatrix4(uniformLocation, false, ref mat);
GL.UseProgram(shaderProgramHandle);
}
I changed from GL.UniformMatrix4(uniformLocation, true, ref mat); to GL.UniformMatrix4(uniformLocation, false, ref mat); and in the shader itself the order of gl_Position was changed from Position * pvm; to pvm * Position;
public static string vertexShaderSource = "#version 330\n" +
"uniform mat4 pvm;\n" +
"in vec4 Position;\n" +
"in vec2 texCoord;\n" +
"out vec2 texCoordV;\n" +
"void main()\n" +
"{\n" +
"texCoordV = texCoord;\n" +
"gl_Position = pvm * Position;\n" +
"}\n";
public static string fragmentShaderSource = "#version 330\n" +
"in vec2 texCoordV;\n" +
"out vec4 colorOut;" +
"void main()\n" +
"{\n" +
"colorOut = vec4(texCoordV, 0.0, 0.0);\n" +
"}\n" ;
After this was fixed I was getting an error where the rendering surface went white. The error was located within the Draw() function. Basically I wasn't assigning the array locations properly.
public void Draw()
{
GL.EnableVertexAttribArray(0);
GL.BindBuffer(BufferTarget.ArrayBuffer, vertHandle);
GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, Vector3.SizeInBytes, 0);
GL.EnableVertexAttribArray(1);
GL.BindBuffer(BufferTarget.ArrayBuffer, texHandle);
GL.VertexAttribPointer(1, 2, VertexAttribPointerType.Float, false, Vector2.SizeInBytes, 0);
GL.DrawArrays(PrimitiveType.Triangles, 0, 6);
GL.DisableVertexAttribArray(0);
GL.DisableVertexAttribArray(1);
}
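Note that this version relies on Position and texCoord actually resolving to attribute locations 0 and 1. Since the shader declares them without layout qualifiers, one way to guarantee that (an assumption on my part, not shown in the original post) is to bind the locations before linking:
GL.BindAttribLocation(shaderProgramHandle, 0, "Position");
GL.BindAttribLocation(shaderProgramHandle, 1, "texCoord");
GL.LinkProgram(shaderProgramHandle);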
I want to draw a texture in my shader but get an exception (see below).
I have following code:
int vertexArray;
//Pointer to Buffers
int vertexBuffer;
int colorBuffer;
int coordBuffer;
int texUniform; //Pointer to Uniform
int texture; //Pointer to Texture
Init
GL.Enable(EnableCap.Texture2D);
texture = LoadPNG("Resources\\Test.png");
//...
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (float)All.Nearest);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (float)All.Nearest);
GL.Hint(HintTarget.PerspectiveCorrectionHint, HintMode.Nicest);
vertexArray = GL.GenVertexArray();
GL.BindVertexArray(vertexArray);
float[] TexCoords = new float[] {
0.0f, 0.0f,
1.0f, 0.0f,
0.0f, 1.0f,
}; //(Array.Length = 2*3)
//Arrays for Vertex (3*3) and Color (4*3)
//GenBuffer, BindBuffer and BufferData for Color and Vertex
coordBuffer = GL.GenBuffer();
GL.BindBuffer(BufferTarget.TextureBuffer, coordBuffer);
GL.BufferData(BufferTarget.TextureBuffer, (IntPtr)(sizeof(float) * TexCoords.Length), TexCoords, BufferUsageHint.StaticDraw);
//Load shader
texUniform = GL.GetUniformLocation(shaderProgram, "tex");
GL.Uniform1(texUniform, 0);
GL.ActiveTexture(TextureUnit.Texture0);
Draw
GL.UseProgram(shaderProgram);
GL.BindTexture(TextureTarget.Texture2D, texture);
GL.EnableVertexAttribArray(0);
GL.BindBuffer(BufferTarget.ArrayBuffer, vertexBuffer);
GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, 0, 0);
GL.EnableClientState(ArrayCap.VertexArray);
GL.EnableVertexAttribArray(1);
GL.BindBuffer(BufferTarget.ArrayBuffer, colorBuffer);
GL.VertexAttribPointer(1, 4, VertexAttribPointerType.Float, false, 0, 0);
GL.EnableClientState(ArrayCap.ColorArray);
GL.BindBuffer(BufferTarget.TextureBuffer, coordBuffer);
GL.TexCoordPointer(2, TexCoordPointerType.Float, Vector2.SizeInBytes, 0);
GL.EnableClientState(ArrayCap.TextureCoordArray);
GL.DrawArrays(PrimitiveType.Triangles, 0, 6); //<------ Exception
GL.DisableVertexAttribArray(0);
GL.DisableVertexAttribArray(1);
I get a System.AccessViolationException at GL.DrawArrays(...). I suspect that I haven't loaded a buffer correctly or have used a pointer in an incorrect way. The exception was caused by the changes I made to get a texture with texture coordinates into the shader; the vertex and color buffers were working before that.
I'm not sure what I'm doing wrong. I've tried different things with the shader, but it seems it doesn't matter what I do with it...
At my last try:
Vertex Shader
#version 330 core
layout(location = 0) in vec3 position;
layout(location = 1) in vec4 color;
layout(location = 2) in vec2 texCoord;
out vec4 vColor;
out vec2 texCoords[];
void main(){
gl_Position = vec4(position, 1.0);
texCoords[0] = texCoord;
vColor = color;
}
Fragment Shader
#version 330 core
in vec4 vColor;
in vec2 texCoords[];
uniform sampler2D tex;
out vec4 fColor;
void main(void)
{
//fColor = vColor;
fColor = texture2D(Texture0, texCoords[0].st);
}
GetShaderInfoLog and GetProgramInfoLog do not report any errors when I comment out GL.DrawArrays(...) and run the application.
What is wrong with my code?
Do not enable client state vertex arrays.
Replace the following:
GL.EnableClientState(ArrayCap.VertexArray);
...
GL.EnableClientState(ArrayCap.ColorArray);
...
GL.EnableClientState(ArrayCap.TextureCoordArray);
With:
GL.EnableVertexAttribArray(0);
...
GL.EnableVertexAttribArray(1);
...
GL.EnableVertexAttribArray(2);
At present, you are telling GL to source vertex attributes from glVertexPointer (...), glColorPointer (...) and glTexCoordPointer (...), none of which you have actually set up.
You might be able to get away with enabling ArrayCap.VertexArray, because many drivers alias it to attribute 0, but the others are a recipe for disaster. In any case, until you remove the EnableClientState (...) calls you are going to keep crashing.
Update:
I missed something in your texture coordinate setup...
You also need to replace this line:
GL.TexCoordPointer(2, TexCoordPointerType.Float, Vector2.SizeInBytes, 0);
With this:
GL.VertexAttribPointer(2, 2, VertexAttribPointerType.Float, false, 0, 0);
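Putting both fixes together, the draw block might look like the sketch below. Note two assumptions beyond the answer itself: coordBuffer is bound to ArrayBuffer here, since GL.VertexAttribPointer sources from the ARRAY_BUFFER binding rather than TextureBuffer, and the count passed to GL.DrawArrays is 3, matching the three texture coordinates in the posted setup:
GL.UseProgram(shaderProgram);
GL.BindTexture(TextureTarget.Texture2D, texture);
GL.EnableVertexAttribArray(0);
GL.BindBuffer(BufferTarget.ArrayBuffer, vertexBuffer);
GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, 0, 0);
GL.EnableVertexAttribArray(1);
GL.BindBuffer(BufferTarget.ArrayBuffer, colorBuffer);
GL.VertexAttribPointer(1, 4, VertexAttribPointerType.Float, false, 0, 0);
GL.EnableVertexAttribArray(2);
GL.BindBuffer(BufferTarget.ArrayBuffer, coordBuffer);
GL.VertexAttribPointer(2, 2, VertexAttribPointerType.Float, false, 0, 0);
GL.DrawArrays(PrimitiveType.Triangles, 0, 3); // one triangle: 3 vertices
GL.DisableVertexAttribArray(0);
GL.DisableVertexAttribArray(1);
GL.DisableVertexAttribArray(2);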