C# (XNA Game Studio): Getting offscreen objects to not render

I am currently developing a game for Windows Phone 7. The map for my game consists of a two-dimensional array that holds a voxel in each element. The code for rendering the map is as follows:
//Draw Map
for (int i = 0; i < 499; i++)
{
    for (int j = 0; j < 499; j++)
    {
        spriteBatch.Draw(groundVoxelTexture, voxels[i, j].Position, Color.White);
    }
}
I should also mention that the player stays centered on the screen and the map moves around the player.
The problem is that this gives the phone a huge number of sprites to render, and it causes so much lag that the phone I run it on locks up. So, is there a way to keep the objects that are off screen from rendering?

Sounds like you want to do some frustum culling.
I'm not familiar with XNA for phone apps, but I'm sure it's the same process.
//Draw Map
BoundingFrustum bf = new BoundingFrustum(View * Projection);
for (int i = 0; i < 499; i++)
{
    for (int j = 0; j < 499; j++)
    {
        // Skip any voxel whose bounding sphere falls outside the view frustum.
        if (bf.Intersects(new BoundingSphere(voxels[i, j].Position, voxelRadius)))
            spriteBatch.Draw(groundVoxelTexture, voxels[i, j].Position, Color.White);
    }
}
If the view stays in place and the map moves around, you should be able to use the same BoundingFrustum from frame to frame.
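In a 2D SpriteBatch game you usually don't have View and Projection matrices lying around. A minimal sketch of how you might build them for the frustum, assuming an XNA Game class and a camera offset called cameraPosition (a name invented for this example):

// Hypothetical 2D camera: world units are pixels, camera scrolled to cameraPosition.
Matrix view = Matrix.CreateTranslation(-cameraPosition.X, -cameraPosition.Y, 0f);
Matrix projection = Matrix.CreateOrthographicOffCenter(
    0, GraphicsDevice.Viewport.Width,    // left, right
    GraphicsDevice.Viewport.Height, 0,   // bottom, top (y-down, like SpriteBatch)
    0f, 1f);                             // near, far
BoundingFrustum bf = new BoundingFrustum(view * projection);

Note that BoundingSphere takes a Vector3, so a 2D sprite position needs a zero z component when you construct the sphere.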

Using a BoundingFrustum is a bit heavyweight in this situation.
It is much better to calculate the range of grid indices that needs to be drawn.
You are using SpriteBatch, so you are working in 2D. The cleanest approach would be to use a view transform and work with its inverse to get the indices, but I think you are not using one.
So I will suppose that your voxels have a size "S", your player is at position P, the screen has size (W, H), and there is no zoom option.
MinX = (int)((P.X - W * 0.5) / S) - 1;
MinY = (int)((P.Y - H * 0.5) / S) - 1;
MaxX = MinX + W / S + 2;
MaxY = MinY + H / S + 2;
for (int i = MinX; i < MaxX; i++)
{
    for (int j = MinY; j < MaxY; j++)
    {
        spriteBatch.Draw(groundVoxelTexture, voxels[i, j].Position, Color.White);
    }
}
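One caveat with the index math above: near the edges of the map, MinX/MinY can go negative and MaxX/MaxY can run past the end of the array, which would make voxels[i, j] throw. Clamping to the array's own bounds before the loops fixes that:

// Clamp the visible index range to the bounds of the voxel array.
MinX = Math.Max(MinX, 0);
MinY = Math.Max(MinY, 0);
MaxX = Math.Min(MaxX, voxels.GetLength(0));
MaxY = Math.Min(MaxY, voxels.GetLength(1));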

Related

Centering cards on screen

I followed a Unity tutorial to make a memory game, but after adding more cards, they now run off the right side of the screen.
The code uses a single card placed in the Unity editor, and all other cards are instantiated and positioned from it (using startpos):
public const float offset = 2.5f;
Vector3 startpos = originalCard.transform.position;
for (int i = 0; i < numCols; i++)
{
    for (int j = 0; j < numRows; j++)
    {
        card = Instantiate(originalCard) as MemoryCard;
        float posX = (offset * i) + startpos.x;
        float posY = -(offset * j) + startpos.y;
        card.transform.position = new Vector3(posX, posY, startpos.z);
    }
}
I would like to center the cards on the screen, but I'm not sure how, given that the screen dimensions via Screen.width are in pixels while the position code works in world units, and the x component of the original card (startpos.x) is actually negative.
The easiest way is to move the original card in your scene view. If you click your camera you can see its bounds, so click the camera, then move your starting card as far left as you can while still seeing it. If the cards still don't all fit in the camera, you can change the card width and do some math to see how many cards fit.
The alternative, also easy, solution is to adjust your camera bounds to encompass the cards where they are.
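If you'd rather center the grid in code than by eye, you can compute startpos from the grid's extent instead of reading it from the original card. A minimal sketch, assuming the camera is centered on world (0, 0) (an assumption, not something the tutorial guarantees):

// Distance between the centers of the outermost cards, in world units.
float gridWidth = (numCols - 1) * offset;
float gridHeight = (numRows - 1) * offset;

// Start at the top-left card so the whole grid is centered on (0, 0).
Vector3 startpos = new Vector3(-gridWidth / 2f, gridHeight / 2f, originalCard.transform.position.z);

If the camera is not at the origin, Camera.main.ScreenToWorldPoint can convert a pixel coordinate such as (Screen.width / 2, Screen.height / 2) into the world-space screen center to use instead of (0, 0).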

How would I subdivide an arbitrary cube into smaller cubes?

I'm trying to write an algorithm that will split an arbitrary quad into smaller quads that all have the same x, y, and z scales (so, cubes). Right now I have code that splits quads into scaled down versions of themselves, but I'd like the ratio to be 1:1:1. How would I modify the code below to do that?
for (int x = 0; x < 2; x++) {
    for (int y = 0; y < 2; y++) {
        for (int z = 0; z < 2; z++) {
            GameObject newCube = Instantiate(gameObject);
            newCube.transform.localScale = new Vector3(
                newCube.transform.localScale.x / 2,
                newCube.transform.localScale.y / 2,
                newCube.transform.localScale.z / 2
            );
            newCube.transform.position = new Vector3(
                newCube.transform.position.x + ((x - 0.5f) * newCube.transform.localScale.x),
                newCube.transform.position.y + ((y - 0.5f) * newCube.transform.localScale.y),
                newCube.transform.position.z + ((z - 0.5f) * newCube.transform.localScale.z)
            );
        }
    }
}
Destroy(gameObject);
If I understood you correctly, you want to make squares from a rectangle (actually the 3D equivalents, cubes from a box, but the idea is the same).
Your inner cubes must have a side at most half of the SMALLEST side of the box, and, since they are cubes, all their sides must be the same size. So you must find the smallest of the x, y, and z sides and create your cubes with all sides set to half of that value.
Putting that into your code:
// The cube side is half of the smallest dimension of the original quad.
var cubeSize = Math.Min(oldQuad.x, Math.Min(oldQuad.y, oldQuad.z)) / 2;
for (int x = 0; x < 2; x++) {
    for (int y = 0; y < 2; y++) {
        for (int z = 0; z < 2; z++) {
            GameObject newCube = Instantiate(gameObject);
            newCube.transform.localScale = new Vector3(cubeSize, cubeSize, cubeSize);
            newCube.transform.position = new Vector3(
                newCube.transform.position.x + ((x - 0.5f) * newCube.transform.localScale.x),
                newCube.transform.position.y + ((y - 0.5f) * newCube.transform.localScale.y),
                newCube.transform.position.z + ((z - 0.5f) * newCube.transform.localScale.z)
            );
        }
    }
}
Destroy(gameObject);
Since you said nothing about how you want to position them, I kept that part the same.
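One hedged note on positioning: because every cube now has the same size, the (x - 0.5f) offsets gather all eight cubes into a small 2 × 2 × 2 block around the original position rather than spreading them over the original box. If you instead want one cube per octant of the box, the offsets would need the box's per-axis extents (reusing the assumed oldQuad scale vector from above):

// Place each cube at the center of one octant of the original box.
newCube.transform.position = new Vector3(
    transform.position.x + ((x - 0.5f) * oldQuad.x / 2),
    transform.position.y + ((y - 0.5f) * oldQuad.y / 2),
    transform.position.z + ((z - 0.5f) * oldQuad.z / 2)
);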

Using perlin noise field, to get random positions for objects to spawn

I am trying to make a system that uses a Perlin noise field to generate random positions and save them (either all at once, or as they are generated) to a list. The game will then use those positions to spawn objects within the level. I think I might be on the right track, but I may have a few things in the wrong places.
Here is a link to an image of all the things that I think are relevant:
The blue dots should be the saved locations that are added to the list. But I am only getting one position five times: five is the correct number of positions, but they are all the same position, and that is the problem.
// --- Random Generation of Objects --- //
Color[] colourMap = new Color[mapWidth * mapHeight];
for (int y = 0; y < mapHeight; y++) {
    for (int x = 0; x < mapWidth; x++) {
        float currentHeight = noiseMap[x, y];
        for (int i = 0; i < regions.Length; i++) {
            // --- Random Gen of Asteroids --- //
            if (currentHeight <= regions[i].height) {
                colourMap[y * mapWidth + x] = regions[i].colour;
                Vector3 ping = new Vector3(x, 0, y);
                asteroids.Add(ping);
                Debug.Log(asteroids[i]);
                break;
            }
        }
    }
}
Anyway, thank you for the help and for reading this far; please let me know if you need anything more.
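One thing stands out in the snippet above: Debug.Log(asteroids[i]) indexes the list with the region counter i, not with the entry that was just added, so if the same region matches each time, the log prints the same early list element over and over even though distinct positions may actually have been stored. A minimal sketch of logging the element that was really appended:

Vector3 ping = new Vector3(x, 0, y);
asteroids.Add(ping);
// Log the position that was just added; i is the region index, not a list index.
Debug.Log(asteroids[asteroids.Count - 1]);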

Creating a Gabor texture in C#

I am trying to create a texture (in a 3-D byte array) that is a coloured Gabor patch. I am using OpenTK to map the texture. The texture mapping is working fine, but the texture created by my code below is not what I need.
The code I have come up with is as follows:
for (int x = 0; x < size; x++)
{
    for (int y = 0; y < size; y++)
    {
        double sin_term = 0.5 * Math.Sin(10 * 3.14159 * ((double)x / (double)size));
        sin_term += 0.5;
        double gauss = 0.5 + Math.Exp(-((Math.Pow(x, 2) + Math.Pow(y, 2)) / (2 * Math.Pow(sigma, 2))));
        double gabor = sin_term * gauss;
        byteTexture2[y, x, 0] = (byte)((double)Colour.R * gabor);
        byteTexture2[y, x, 1] = (byte)((double)Colour.G * gabor);
        byteTexture2[y, x, 2] = (byte)((double)Colour.B * gabor);
    }
}
My maths isn't all that good, so I may be off track, but I was trying to multiply the sine wave by the Gaussian. The sine wave term seems to work OK by itself, but the Gaussian may be where the problems are.
Any help would be much appreciated.
I have found MATLAB code for this problem, but no C/C++/C# code.
Thanks.
I recently coded up a Gabor filter kernel for use in OpenCV (using C++). Here is my code for the kernel:
/// compute Gabor filter kernels
for (int i = 0; i < h; i++) {
    x = i - 0.5*(h - 1);
    for (int j = 0; j < h; j++) {
        y = j - 0.5*(h - 1);
        gaborKernelCos.at<float>(i, j) = exp((-16 / (h*h)) * (x*x + y*y)) * cos((2 * M_PI * w / h) * (x*cos(q) + y*sin(q))) / (h*h);
        gaborKernelSin.at<float>(i, j) = exp((-16 / (h*h)) * (x*x + y*y)) * sin((2 * M_PI * w / h) * (x*cos(q) + y*sin(q))) / (h*h);
    }
}
Where the input parameters are the kernel size h, wave number w, and filter orientation q. Note the wave number is related to the filter pixel wavelength by l = h/w. Also, my value for sigma is simply a constant multiple of h.
This shouldn't really produce anything wildly different from your code as far as I can tell. Does your value for sigma make sense? It should probably be at most sigma = 0.5*size.
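Translating that idea back into the question's C# loop: the Gaussian should be centered on the middle of the patch rather than on the (0, 0) corner, and nothing should be added to the Gaussian term. A minimal sketch, reusing the question's size, sigma, Colour, and byteTexture2 names:

for (int x = 0; x < size; x++)
{
    for (int y = 0; y < size; y++)
    {
        // Coordinates relative to the center of the patch.
        double cx = x - 0.5 * (size - 1);
        double cy = y - 0.5 * (size - 1);

        // Sinusoid rescaled into [0, 1], times a Gaussian centered on the patch.
        double sin_term = 0.5 + 0.5 * Math.Sin(10 * Math.PI * (cx / size));
        double gauss = Math.Exp(-(cx * cx + cy * cy) / (2 * sigma * sigma));
        double gabor = sin_term * gauss;

        byteTexture2[y, x, 0] = (byte)(Colour.R * gabor);
        byteTexture2[y, x, 1] = (byte)(Colour.G * gabor);
        byteTexture2[y, x, 2] = (byte)(Colour.B * gabor);
    }
}

Since gabor stays in [0, 1], the byte casts cannot overflow, and a sigma of at most 0.5 * size (as suggested above) keeps the falloff visible inside the patch.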

Isometric Tiles Of Different Sizes Not Rendering Correctly

I am using Unity 5 to create an isometric game. I have generated a grid of tiles and it works well. However, when I use two different tiles to fill in the grid (their image sizes are slightly different), I get gaps between the tiles. The obvious solution would be to make all the tiles the same image size, but that would prevent me from putting anything on a tile that is larger than the tile itself (e.g. a tree).
Here are some images to demonstrate:
With only one type of tile:
With two types of tile:
This is the code I use to create the map:
private void CreateMap() {
    float tileWidth;
    float tileHeight;
    int orderInLayer = 0;
    SpriteRenderer r = floorTiles[0].GetComponent<SpriteRenderer>();
    tileWidth = r.bounds.max.x - r.bounds.min.x;
    tileHeight = r.bounds.max.y - r.bounds.min.y;
    for (int i = 0; i < map.GetLength(0); i++) {
        orderInLayer += 1;
        for (int j = 0; j < map.GetLength(1); j++) {
            Vector2 position = new Vector2((j * tileWidth / 2) + (i * tileWidth / 2) + (tileWidth / 2), (j * tileHeight / 2) - (i * tileHeight / 2) + (tileHeight / 2));
            r = map[i, j].GetComponent<SpriteRenderer>();
            r.sortingOrder = orderInLayer;
            Instantiate(map[i, j], position, Quaternion.identity);
        }
    }
}
Any help would be greatly appreciated, I cannot seem to fix it!
You appear to be calculating a position for each of your tiles from scratch every time you create one. If you have two differently sized tiles, your calculation comes out different, hence the gaps between your tiles. This is because you're only using the width/height of the current tile, failing to take into account any previous tiles that may have a shorter/longer height/width.
Given that you have varying heights AND widths, you'll need to calculate the correct position in both to prevent gaps in the X and Y directions. I've mocked up something here, but it's untested; more of a concept than a finished implementation.
int orderInLayer = 0;
Vector2 position = new Vector2(0, 0);
SpriteRenderer r;
// Tracks the accumulated y position of each column across rows.
Dictionary<int, float> HeightMap = new Dictionary<int, float>();
for (int iRow = 0; iRow < map.GetLength(0); iRow++)
{
    position.x = 0;
    orderInLayer += 1;
    for (int jColumn = 0; jColumn < map.GetLength(1); jColumn++)
    {
        // Start from this column's accumulated height (0 on the first row).
        float columnHeight;
        if (!HeightMap.TryGetValue(jColumn, out columnHeight))
            columnHeight = 0;
        position.y = columnHeight;
        r = map[iRow, jColumn].GetComponent<SpriteRenderer>();
        float tileWidth = r.bounds.max.x - r.bounds.min.x;
        float tileHeight = r.bounds.max.y - r.bounds.min.y;
        r.sortingOrder = orderInLayer;
        position.x += tileWidth / 2;
        position.y += tileHeight / 2;
        Instantiate(map[iRow, jColumn], position, Quaternion.identity);
        HeightMap[jColumn] = position.y;
    }
}
I leave the best way of storing the heights, and of seeding the HeightMap dictionary, to however you see fit.
