Hopefully this is a simple question. I finally figured out how to render things in 3D in OpenTK. Great! The only problem is that it doesn't quite look the way I expect. I'm drawing a sphere using the polar method, and rendering it with PrimitiveType.Polygon.
Here's the algorithm for calculating the coordinates. What I'm doing is stepping through each phi then theta in the sphere, incrementally adding more adjacent quads to my final point list:
Point 1: Theta1, Phi1
Point 2: Theta1, Phi2
Point 3: Theta2, Phi2
Point 4: Theta2, Phi1
protected static RegularPolygon3D _create_unit(int n)
{
    List<Vector3> pts = new List<Vector3>();
    float segments = n;

    for (float lat = 0; lat < segments; lat++)
    {
        // phi sweeps 0..PI from pole to pole
        float phi = (float)Math.PI * (lat / segments);
        float phi2 = (float)Math.PI * ((lat + 1.0f) / segments);
        float cosP = (float)Math.Cos(phi);
        float cosP2 = (float)Math.Cos(phi2);
        float sinP = (float)Math.Sin(phi);
        float sinP2 = (float)Math.Sin(phi2);

        for (float lon = 0; lon < segments; lon++)
        {
            // theta sweeps 0..2*PI around the axis
            float theta = TWO_PI * (lon / segments);
            float theta2 = TWO_PI * ((lon + 1.0f) / segments);
            float cosT = (float)Math.Cos(theta);
            float cosT2 = (float)Math.Cos(theta2);
            float sinT = (float)Math.Sin(theta);
            float sinT2 = (float)Math.Sin(theta2);

            // One quad: (theta, phi), (theta, phi2), (theta2, phi2), (theta2, phi)
            List<Vector3> current = new List<Vector3>(4);
            current.Add(new Vector3(cosT * sinP, sinT * sinP, cosP));
            current.Add(new Vector3(cosT * sinP2, sinT * sinP2, cosP2));
            current.Add(new Vector3(cosT2 * sinP2, sinT2 * sinP2, cosP2));
            current.Add(new Vector3(cosT2 * sinP, sinT2 * sinP, cosP));
            pts.AddRange(current);
        }
    }

    var rtn = new RegularPolygon3D(pts);
    rtn.Translation = Vector3.Zero;
    rtn.Scale = Vector3.One;
    return rtn;
}
And so my Sphere class looks like this:
public class Sphere : RegularPolygon3D
{
    public static Sphere Create(Vector3 center, float radius)
    {
        var rp = RegularPolygon3D.Create(30, center, radius);
        return new Sphere(rp);
    }

    private Sphere(RegularPolygon3D polygon) : base(polygon)
    {
    }
}
I should also mention that the color of this sphere is not constant. In 2 dimensions, I have code that works great for gradients. In 3D... not so much. That's why my sphere has multiple colors. The way the 2D gradient code works is that there is a list of colors coming from a class I created called GeometryColor. When the polygon is rendered, every vertex gets colored based on the list of colors within GeometryColor. So if there are 3 colors the user wishes to gradient between, and there are 6 vertices (a hexagon), then the code would assign the first 2 vertices color 1, the next 2 color 2, and the last 2 color 3. The following code shows how the color for a vertex is calculated.
public ColorLibrary.sRGB GetVertexFillColor(int index)
{
    var pct = ((float)index + 1.0f) / (float)Vertices.Count;
    var colorIdx = (int)Math.Round((FillColor.Colors.Count - 1.0f) * pct);
    return FillColor.Colors[colorIdx];
}
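Tracing that formula with the 3-color, 6-vertex hexagon above (a quick worked example, not from the original post): for index = 0..5, pct runs through 1/6, 2/6, 3/6, 4/6, 5/6, 6/6, so colorIdx = Round(2 * pct) comes out to 0, 1, 1, 1, 2, 2. The rounding therefore doesn't split the vertices into exact pairs; the first color gets one vertex, the middle color three, and the last color two.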
Anyway, here's the output I'm getting...hope somebody can see my error...
Thanks.
Edit: If I only use ONE vertex color (i.e. instead of my array of 4 different colors), then I get a completely smooth sphere... although without lighting and such, it's hard to tell it's anything but a circle, lol.
Edit: ...so somehow my sphere is slightly see-through, even though all my alphas are set to 1.0f and I'm doing depth testing:
GL.DepthMask(true);
GL.Enable(EnableCap.DepthTest);
GL.ClearDepth(1.0f);
GL.DepthFunc(DepthFunction.Lequal);
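Worth noting: depth testing also silently does nothing if the GL context was created without any depth bits, which can produce exactly this kind of see-through look. A hypothetical OpenTK window setup that explicitly requests a depth buffer (a sketch, assuming OpenTK's GraphicsMode constructor; adjust to your own window creation):

using OpenTK;
using OpenTK.Graphics;

// Request 32-bit color, 24-bit depth, 8-bit stencil, no multisampling.
var window = new GameWindow(800, 600, new GraphicsMode(new ColorFormat(32), 24, 8, 0));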
Final edit: OK, it has SOMETHING to do with my vertices, I'm guessing, because when I use PrimitiveType.Quads it works perfectly...
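That outcome is consistent with how the primitive modes work: PrimitiveType.Polygon treats the entire vertex list as one single polygon, while PrimitiveType.Quads cuts it into independent 4-vertex faces, which is what the generator above actually produces. A sketch of the difference, assuming a GL.DrawArrays-style draw over the generated points:

// Each consecutive group of 4 vertices becomes its own quad face:
GL.DrawArrays(PrimitiveType.Quads, 0, pts.Count);

// By contrast, this stitches ALL vertices into one giant self-overlapping polygon,
// which produces the broken, partially see-through rendering described above:
// GL.DrawArrays(PrimitiveType.Polygon, 0, pts.Count);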
Related
This question sits between computer graphics, probability, and programming, but since I am coding it for a Unity project in C#, I decided to post it here. Sorry if it's not appropriate.
I need to solve this problem: given an object on a 3D sphere at a certain position, and given a range of degrees, sample points on the sphere uniformly within the given range.
For example:
Left picture: the cube represents the center of the sphere, and the green sphere is the starting position. I want to uniformly cover the surface of the sphere within a certain degree range, for example from -90 to 90 degrees around the green sphere. My approach (right picture) doesn't work, as it over-samples points that are close to the starting position.
My sampler:
Vector3 getRandomEulerAngles(float min, float max)
{
    float degree = Random.Range(min, max);
    return degree * Vector3.Normalize(new Vector3(Random.Range(min, max), Random.Range(min, max), Random.Range(min, max)));
}
and for covering the top half of the sphere I would call getRandomEulerAngles(-90, 90).
Any idea?
Try this:
public class Sphere : MonoBehaviour
{
    public float Radius = 10f;
    public float Angle = 90f;

    private void Start()
    {
        for (int i = 0; i < 10000; i++)
        {
            var randomPosition = GetRandomPosition(Angle, Radius);
            Debug.DrawLine(transform.position, randomPosition, Color.green, 100f);
        }
    }

    private Vector3 GetRandomPosition(float angle, float radius)
    {
        var rotationX = Quaternion.AngleAxis(Random.Range(-angle, angle), transform.right);
        var rotationZ = Quaternion.AngleAxis(Random.Range(-angle, angle), transform.forward);
        var position = rotationZ * rotationX * transform.up * radius + transform.position;
        return position;
    }
}
We can use a uniform sphere sampling for that. Given two random variables u and v (uniformly distributed), we can calculate a random point (p, q, r) on the sphere (also uniformly distributed) with:
float azimuth = v * 2.0 * PI;
float cosDistFromZenith = 1.0 - u;
float sinDistFromZenith = sqrt(1.0 - cosDistFromZenith * cosDistFromZenith);
(p, q, r) = (cos(azimuth) * sinDistFromZenith, sin(azimuth) * sinDistFromZenith, cosDistFromZenith);
If we put our reference direction (your object location) into zenithal position, we need to sample v from [0, 1] to get all directions around the object and u in [cos(minDistance), cos(maxDistance)], where minDistance and maxDistance are the angle distances from the object you want to allow. A distance of 90° or Pi/2 will give you a hemisphere. A distance of 180° or Pi will give you the full sphere.
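For example, a cap reaching 60° out from the object corresponds to u ranging over [cos 60°, cos 0°] = [0.5, 1], while the full hemisphere case is [cos 90°, cos 0°] = [0, 1].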
Now that we can sample the region around the object in zenithal position, we need to account for other object locations as well. Let the object be positioned at (ox, oy, oz), which is a unit vector describing the direction from the sphere center.
We then build a local coordinate system:
rAxis = (ox, oy, oz)
pAxis = if |ox| < 0.9 : (1, 0, 0)
else : (0, 1, 0)
qAxis = normalize(cross(rAxis, pAxis))
pAxis = cross(qAxis, rAxis)
And finally, we can get our random point (x, y, z) on the sphere surface:
(x, y, z) = p * pAxis + q * qAxis + r * rAxis
Adapted from Nico Schertler's answer, this is the code I am using:
Vector3 GetRandomAroundSphere(float angleA, float angleB, Vector3 aroundPosition)
{
    Assert.IsTrue(angleA >= 0 && angleB >= 0 && angleA <= 180 && angleB <= 180, "Both angles should be in [0, 180]");
    var v = Random.Range(0F, 1F);
    var a = Mathf.Cos(Mathf.Deg2Rad * angleA);
    var b = Mathf.Cos(Mathf.Deg2Rad * angleB);

    float azimuth = v * 2.0F * Mathf.PI;
    float cosDistFromZenith = Random.Range(Mathf.Min(a, b), Mathf.Max(a, b));
    float sinDistFromZenith = Mathf.Sqrt(1.0F - cosDistFromZenith * cosDistFromZenith);
    Vector3 pqr = new Vector3(Mathf.Cos(azimuth) * sinDistFromZenith, Mathf.Sin(azimuth) * sinDistFromZenith, cosDistFromZenith);

    // Build a local coordinate system around the reference direction
    Vector3 rAxis = aroundPosition; // Vector3.up when around zenith
    Vector3 pAxis = Mathf.Abs(rAxis[0]) < 0.9 ? new Vector3(1F, 0F, 0F) : new Vector3(0F, 1F, 0F);
    Vector3 qAxis = Vector3.Normalize(Vector3.Cross(rAxis, pAxis));
    pAxis = Vector3.Cross(qAxis, rAxis);

    Vector3 position = pqr[0] * pAxis + pqr[1] * qAxis + pqr[2] * rAxis;
    return position;
}
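A minimal usage sketch (hypothetical names: assumes dir is a unit direction from the sphere center toward the reference object, and scales the unit-length result up to the sphere radius):

// Scatter 1000 debug points within 45 degrees of 'dir' on a sphere of radius 10.
Vector3 center = transform.position;
Vector3 dir = Vector3.up;
float radius = 10f;
for (int i = 0; i < 1000; i++)
{
    Vector3 unitPoint = GetRandomAroundSphere(0f, 45f, dir);
    Debug.DrawLine(center, center + unitPoint * radius, Color.cyan, 100f);
}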
I'm in the process of setting up a relatively simple voxel-based world for a game. The high-level idea is to first generate voxel locations following a Fibonacci grid, then rotate the cubes such that the outer surface of the Fibonacci grid resembles a sphere, and finally size the cubes such that they roughly cover the surface of the sphere (overlap is fine). Below is the code for generating the voxels along the Fibonacci grid:
public static Voxel[] CreateInitialVoxels(int numberOfPoints, int radius)
{
    float goldenRatio = (1 + Mathf.Sqrt(5)) / 2;
    Voxel[] voxels = new Voxel[numberOfPoints];
    for (int i = 0; i < numberOfPoints; i++)
    {
        float n = i - numberOfPoints / 2; // Center at zero
        float theta = 2 * Mathf.PI * n / goldenRatio;
        float phi = (Mathf.PI / 2) + Mathf.Asin(2 * n / numberOfPoints);
        voxels[i] = new Voxel(new Location(theta, phi, radius));
    }
    return voxels;
}
This generates a sphere that looks roughly like a staircase.
So, my current approach to get this looking a bit more spherical is to basically rotate each cube in each pair of axes, then combine all of the rotations:
private void DrawVoxel(Voxel voxel, GameObject voxelContainer)
{
    GameObject voxelObject = Instantiate<GameObject>(GetVoxelPrefab());
    voxelObject.transform.position = voxel.location.cartesianCoordinates;
    voxelObject.transform.parent = voxelContainer.transform;

    Vector3 norm = voxel.location.cartesianCoordinates.normalized;

    float xyRotationDegree = Mathf.Atan(norm.y / norm.x) * (180 / Mathf.PI);
    float zxRotationDegree = Mathf.Atan(norm.z / norm.x) * (180 / Mathf.PI);
    float yzRotationDegree = Mathf.Atan(norm.z / norm.y) * (180 / Mathf.PI);

    Quaternion xyRotation = Quaternion.AngleAxis(xyRotationDegree, new Vector3(0, 0, 1));
    Quaternion zxRotation = Quaternion.AngleAxis(zxRotationDegree, new Vector3(0, 1, 0));
    Quaternion yzRotation = Quaternion.AngleAxis(yzRotationDegree, new Vector3(1, 0, 0));

    voxelObject.transform.rotation = zxRotation * yzRotation * xyRotation;
}
The primary thing I am getting caught on is that each of these rotations seems to work fine in isolation, but when I combine them, things tend to go a bit haywire (pictures below). I'm not sure exactly what the issue is. My best guess is that I've made some sign/rotation mismatch, so the rotations don't combine right. I can get two working, but never all three together.
Above are the pictures of one and two successful rotations, followed by the error mode when I attempt to combine them. Any help, whether telling me the approach I'm following is too convoluted or explaining the right way to combine these rotations, would be much appreciated. The Cartesian coordinate conversion is below for reference.
[System.Serializable]
public struct Location
{
    public float theta, phi, r;
    public Vector3 polarCoordinates;
    public float x, y, z;
    public Vector3 cartesianCoordinates;

    public Location(float theta, float phi, float r)
    {
        this.theta = theta;
        this.phi = phi;
        this.r = r;
        this.polarCoordinates = new Vector3(theta, phi, r);
        this.x = r * Mathf.Sin(phi) * Mathf.Cos(theta);
        this.y = r * Mathf.Sin(phi) * Mathf.Sin(theta);
        this.z = r * Mathf.Cos(phi);
        this.cartesianCoordinates = new Vector3(x, y, z);
    }
}
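As a quick sanity check of that conversion (illustrative values, not from the original post):

// theta = 0, phi = PI/2, r = 5 lands on the +X axis:
// x = 5 * sin(PI/2) * cos(0) = 5, y = 5 * sin(PI/2) * sin(0) = 0, z = 5 * cos(PI/2) = 0
var loc = new Location(0f, Mathf.PI / 2f, 5f); // loc.cartesianCoordinates == (5, 0, 0)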
I managed to find a solution to this problem, though it's still not clear to me what the issue with the above code is.
Unity has an extremely handy function called Quaternion.FromToRotation that will generate the appropriate rotation if you simply pass in the appropriate destination vector.
In my case I was able to just do:
voxelObject.transform.rotation = Quaternion.FromToRotation(new Vector3(0, 0, 1), voxel.location.cartesianCoordinates);
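One plausible reason the manual approach misbehaved, for what it's worth: Mathf.Atan discards quadrant information (unlike Mathf.Atan2), and rotations about fixed world axes don't commute, so the per-axis errors compound. FromToRotation sidesteps all of that; conceptually it is just an axis-angle rotation about the cross product of the two directions. A rough sketch of that idea (not Unity's actual implementation, and it ignores the degenerate parallel and anti-parallel cases):

Quaternion FromToSketch(Vector3 from, Vector3 to)
{
    Vector3 axis = Vector3.Cross(from, to).normalized; // rotation axis perpendicular to both
    float angle = Vector3.Angle(from, to);             // angle between them, in degrees
    return Quaternion.AngleAxis(angle, axis);
}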
I am using Eigen to calculate the best fit of a set of points to a plane. What I need to do with this data, is then rotate the set of points so they lie flat, negating the rotation value.
My code is:
cv::Point2f plane_from_points(const std::vector<Vector3> & c)
{
    // copy coordinates to matrix in Eigen format
    size_t num_atoms = c.size();
    Eigen::Matrix<Vector3::Scalar, Eigen::Dynamic, Eigen::Dynamic> coord(3, num_atoms);
    for (size_t i = 0; i < num_atoms; ++i) coord.col(i) = c[i];

    // calculate centroid
    Vector3 centroid(coord.row(0).mean(), coord.row(1).mean(), coord.row(2).mean());

    // subtract centroid
    coord.row(0).array() -= centroid(0);
    coord.row(1).array() -= centroid(1);
    coord.row(2).array() -= centroid(2);

    // we only need the left-singular matrix here
    // http://math.stackexchange.com/questions/99299/best-fitting-plane-given-a-set-of-points
    auto svd = coord.jacobiSvd(Eigen::ComputeThinU | Eigen::ComputeThinV);
    Vector3 plane_normal = svd.matrixU().rightCols<1>();

    float x = plane_normal[0];
    float y = plane_normal[1];
    float z = plane_normal[2];

    float angle = atan2(x, z) * 180 / PI;
    float angle2 = atan2(y, z) * 180 / PI;

    cv::Point2f ret(angle, angle2);
    return ret;
}
Then, in C#, I convert the angle values to a quaternion, to rotate my object:
public static Quaternion QuatFromEuler(double yaw, double pitch, double roll)
{
    yaw = Deg2Rad(yaw);
    pitch = Deg2Rad(pitch);
    roll = Deg2Rad(roll);

    double rollOver2 = roll * 0.5;
    double sinRollOver2 = Math.Sin(rollOver2);
    double cosRollOver2 = Math.Cos(rollOver2);
    double pitchOver2 = pitch * 0.5;
    double sinPitchOver2 = Math.Sin(pitchOver2);
    double cosPitchOver2 = Math.Cos(pitchOver2);
    double yawOver2 = yaw * 0.5;
    double sinYawOver2 = Math.Sin(yawOver2);
    double cosYawOver2 = Math.Cos(yawOver2);

    Quaternion result = new Quaternion();
    result.W = cosYawOver2 * cosPitchOver2 * cosRollOver2 + sinYawOver2 * sinPitchOver2 * sinRollOver2;
    result.X = cosYawOver2 * sinPitchOver2 * cosRollOver2 + sinYawOver2 * cosPitchOver2 * sinRollOver2;
    result.Y = sinYawOver2 * cosPitchOver2 * cosRollOver2 - cosYawOver2 * sinPitchOver2 * sinRollOver2;
    result.Z = cosYawOver2 * cosPitchOver2 * sinRollOver2 - sinYawOver2 * sinPitchOver2 * cosRollOver2;
    return result;
}
This gives me:
angles: -177 -126
quat: -0.453834928533952,-0.890701198505913,-0.0233238317256566,0.0118840858439476
Which, when I apply it, looks nothing like it should. (I expect a roughly 45 degree rotation in one axis; I get a 180 degree flip.)
I have tried switching the axes to check for a coordinate space mismatch (which is likely), but I cannot get this to work. Am I doing something wrong?
I have checked the 3D points that I pass into the algorithm, and they are correct, so my issue is either in the point-to-plane code or in the quaternion conversion.
Any help would be much appreciated. Thank you.
If you want to calculate the quaternion which rotates one plane to another, simply compute the quaternion that rotates one plane's normal to the other's:
#include <Eigen/Geometry>
#include <iostream>

int main() {
    using namespace Eigen;
    // replace this by your actual plane normal:
    Vector3d plane_normal = Vector3d::Random().normalized();
    // Quaternion which rotates plane_normal to UnitZ, i.e., the plane to the XY-plane:
    Quaterniond rotQ = Quaterniond::FromTwoVectors(plane_normal, Vector3d::UnitZ());
    std::cout << "Random plane_normal: " << plane_normal.transpose() << '\n';
    std::cout << "Rotated plane_normal: " << (rotQ * plane_normal).transpose() << '\n';
}
Also, don't store your angles in degrees, ever (it may sometimes make sense to output them in degrees ...).
And more importantly: Stop using Euler Angles!
I am having some issues with the way I do my frustum culling. It does cull, but with a really odd effect: when I get too close to my main parent object (I am using a scenegraph to render everything), the objects start flickering really fast. This goes away when I move back out. I have tried a lot of things but have run out of ideas. Do you guys have any thoughts? Any help is greatly appreciated.
I first create a bounding box consisting of eight points around my models. These are correct as far as I am aware, after a lot of testing.
This is how I calculate the corner points of the frustum planes.
The camera position is the position in world space, and the orientation is the direction it is looking at. Furthermore, cameraUp and cameraRight are recalculated every time the camera is rotated.
Vector3 pos = Camera.staticPosition;
Vector3 view = Camera.staticOrientation;
Vector3 upVector3 = Camera.cameraUp;
Vector3 rightVector3 = Camera.cameraRight;
float toRadians = (float)Math.PI / 180.0f;
float nearDis = .1f;
float farDistance = 1000f;
float fov = Game.FOV;
float aspectRatio = 1.3f;
// Get the width and height of the near and far planes
float tanDiv = 2 * (float) Math.Tan(fov*toRadians / 2);
float heightNear = tanDiv * nearDis;
float widthNear = heightNear * aspectRatio;
float heightFar = tanDiv * farDistance;
float widthFar = heightFar * aspectRatio;
// get the centre points of the planes so they can be used to calculate the edge points
Vector3 centreNear = pos + view * nearDis;
Vector3 centreFar = pos + view * farDistance;
// get half of the width and height values so you can compute the corner points
float hNearHalf = heightNear / 2;
float wNearHalf = widthNear / 2;
float hFarHalf = heightFar / 2;
float wFarHalf = widthFar / 2;
Vector3 nearTopLeft = centreNear + (upVector3 * hNearHalf) - (rightVector3 * wNearHalf);
Vector3 nearTopRight = centreNear + (upVector3 * hNearHalf) + (rightVector3 * wNearHalf);
Vector3 nearBottomLeft = centreNear - (upVector3 * hNearHalf) - (rightVector3 * wNearHalf);
Vector3 nearBottomRight = centreNear - (upVector3 * hNearHalf) + (rightVector3 * wNearHalf);
Vector3 farTopLeft = centreFar + (upVector3 * hFarHalf) - (rightVector3 * wFarHalf);
Vector3 farTopRight = centreFar + (upVector3 * hFarHalf) + (rightVector3 * wFarHalf);
Vector3 farBottomLeft = centreFar - (upVector3 * hFarHalf) - (rightVector3 * wFarHalf);
Vector3 farBottomRight = centreFar - (upVector3 * hFarHalf) + (rightVector3 * wFarHalf);
I store my frustum points in an array: first the near-plane points, then the far plane, top plane, bottom plane, left plane, and lastly the right plane. Then I loop through all six planes and the 8 points of the bounding box; if a point is on the right side of a plane, I increment the value of that key in the dictionary.
Vector3[] frustumPoints = new Vector3[18]
{
    nearTopLeft, nearTopRight, nearBottomLeft,      // near plane
    farTopLeft, farTopRight, farBottomLeft,         // far plane
    nearTopLeft, farTopLeft, nearTopRight,          // top plane
    nearBottomLeft, farBottomLeft, nearBottomRight, // bottom plane
    nearTopLeft, nearBottomLeft, farTopLeft,        // left plane
    nearTopRight, nearBottomRight, farTopRight      // right plane
};
Dictionary<Vector3, int> count = new Dictionary<Vector3, int>(8);
for (int value = 0; value < 8; value++)
{
    count.Add(cubePositions[value], 0);
}

for (int x = 0; x < 18; x += 3)
{
    Vector3 normal = NormalPlane(frustumPoints[x], frustumPoints[x + 1], frustumPoints[x + 2]);
    for (int y = 0; y < 8; y++)
    {
        Vector3 pointPlane = frustumPoints[x] - cubePositions[y];
        float dot = Vector3.Dot(pointPlane, normal);
        if (dot <= 0 && x % 6 == 0)
        {
            count[cubePositions[y]]++;
        }
        else if (dot >= 0 && x % 3 == 0)
        {
            count[cubePositions[y]]++;
        }
    }
}
This is my method for getting the normal of a plane:
Vector3 NormalPlane(Vector3 pointOne, Vector3 pointTwo, Vector3 pointThree)
{
    Vector3 edgeOne = pointTwo - pointOne;   // vector from point one to point two
    Vector3 edgeTwo = pointThree - pointOne; // vector from point one to point three
    // Cross the two edge vectors and normalize to get the plane normal
    Vector3 normal = Vector3.Normalize(Vector3.Cross(edgeOne, edgeTwo));
    return normal;
}
If, for one of the points of the cube, the count is six, that point is inside all planes and thus inside the frustum, and I draw the object. If none of the points has a count of 6, the object doesn't get drawn.
The problem is, I have no idea where I am making a mistake. Do you guys have any ideas?
Thanks in advance,
Jeromer
If, for one of the points of the cube, the count is six, that point is inside all planes and thus inside the frustum, and I draw the object. If none of the points has a count of 6, the object doesn't get drawn.
Your logic here seems to be
"if there is no vertex of the cube lying inside the frustum, the cube is not visible."
However, this is just wrong. Even if all 8 vertices of the cube lie outside the frustum, the cube can still intersect the frustum, as demonstrated in this 2D sketch:
*----------*
| |
+-----------------+--+ |
\ | / |
\ |/ |
\ / |
\ /| |
\ / *----------*
\ /
\ /
+----+
Hence you cull stuff which is potentially visible.
The usual logic for frustum culling is to cull the box only if all of its vertices are rejected by the same plane. (This will result in some odd cases where the box is completely outside and not culled, but these situations are quite unlikely and usually not a big deal to worry about.)
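A minimal sketch of that per-plane rejection test, reusing the question's cubePositions; the planePoints and planeNormals arrays are hypothetical, holding one point and one outward-facing normal per frustum plane:

bool BoxOutsideFrustum(Vector3[] cubePositions, Vector3[] planePoints, Vector3[] planeNormals)
{
    for (int p = 0; p < 6; p++)
    {
        int rejected = 0;
        for (int v = 0; v < 8; v++)
        {
            // A positive dot product means the vertex lies on the outer side of this plane.
            if (Vector3.Dot(cubePositions[v] - planePoints[p], planeNormals[p]) > 0)
                rejected++;
        }
        if (rejected == 8)
            return true; // every vertex rejected by the same plane: safe to cull
    }
    return false; // conservatively treat the box as potentially visible
}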
I am using the following to create a circle using VertexPositionTexture:
public static ObjectData Circle(Vector2 origin, float radius, int slices)
{
/// See below
}
The texture that is applied to it doesn't look right; it spirals out from the center. I have tried some other things, but nothing does it how I want. I would like it to kind of just fan around the circle, or start in the top-left and finish in the bottom-right. Basically, I want it to be easier to create textures for it.
I know there are MUCH easier ways to do this without using meshes, but that is not what I am trying to accomplish right now.
This is the code that ended up working thanks to Pinckerman:
public static ObjectData Circle(Vector2 origin, float radius, int slices)
{
    VertexPositionTexture[] vertices = new VertexPositionTexture[slices + 2];
    int[] indices = new int[slices * 3];
    float x = origin.X;
    float y = origin.Y;
    float deltaRad = MathHelper.ToRadians(360) / slices;
    float delta = 0;

    // Center vertex, mapped to the middle of the texture
    vertices[0] = new VertexPositionTexture(new Vector3(x, y, 0), new Vector2(.5f, .5f));

    for (int i = 1; i < slices + 2; i++)
    {
        float newX = (float)Math.Cos(delta) * radius + x;
        float newY = (float)Math.Sin(delta) * radius + y;
        float textX = 0.5f + ((radius * (float)Math.Cos(delta)) / (radius * 2));
        float textY = 0.5f + ((radius * (float)Math.Sin(delta)) / (radius * 2));
        vertices[i] = new VertexPositionTexture(new Vector3(newX, newY, 0), new Vector2(textX, textY));
        delta += deltaRad;
    }

    // One triangle per slice, fanning out from the center vertex
    for (int i = 0; i < slices; i++)
    {
        indices[3 * i] = 0;
        indices[(3 * i) + 1] = i + 1;
        indices[(3 * i) + 2] = i + 2;
    }

    ObjectData thisData = new ObjectData()
    {
        Vertices = vertices,
        Indices = indices
    };
    return thisData;
}

public static ObjectData Ellipse()
{
    ObjectData thisData = new ObjectData()
    {
    };
    return thisData;
}
ObjectData is just a structure that contains an array of vertices & an array of indices.
Hope this helps others that may be trying to accomplish something similar.
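A quick usage sketch (hypothetical names; assumes an XNA/MonoGame GraphicsDevice with an effect pass already applied):

ObjectData circle = Circle(new Vector2(0f, 0f), 2f, 64);
graphicsDevice.DrawUserIndexedPrimitives(
    PrimitiveType.TriangleList,
    circle.Vertices, 0, circle.Vertices.Length,
    circle.Indices, 0, circle.Indices.Length / 3);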
It looks like a spiral because you've set the upper-left point of the texture, Vector2(0,0), at the center of your circle, and that's wrong. You need to set it on the top-left vertex of the top-left slice of your circle, because (0,0) in your UV map is the upper-left corner of your texture.
I think you need to set (0.5, 0) for the top vertex, (1, 0.5) for the right, (0.5, 1) for the bottom, and (0, 0.5) for the left, or something like this; for the others, use some trigonometry.
The center of your circle has to be Vector2(0.5, 0.5).
Regarding the trigonometry, I think you should do something like this.
The center of your circle has a UV value of Vector2(0.5, 0.5), and for the others (supposing the second point of the sequence is directly to the right of the center, with a UV value of Vector2(1, 0.5)), try something like this:
vertices[i] = new VertexPositionTexture(new Vector3(newX, newY, 0), new Vector2(0.5f + radius * (float)Math.Cos(delta), 0.5f - radius * (float)Math.Sin(delta)));
I've just edited the third line in your for-loop. This should give you the UV coordinates you need for each point. I hope so.
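Note that the version the asker settled on above divides by radius * 2, which keeps the UVs in [0, 1] regardless of the circle's size; multiplying by the raw radius as in the line here only works when the radius happens to be 0.5. The normalized form reduces to:

// Equivalent to the accepted code: the radius cancels, so UVs stay in [0, 1].
float textX = 0.5f + 0.5f * (float)Math.Cos(delta);
float textY = 0.5f + 0.5f * (float)Math.Sin(delta);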