Perlin Noise wave effect on a 3d sphere - c#

I have been trying to figure out how to create a wave effect on a 3D sphere using Perlin noise.
I have found some tutorials on how to do it on a plane, but none on a 3D object.
This code works just fine on a plane; does anyone know how to adapt it to a 3D sphere?
Thank you in advance for your help.
using UnityEngine;
using System.Collections;

public class PerlinTerrain : MonoBehaviour {

    public float perlinScale;
    public float waveSpeed;
    public float waveHeight;
    public float offset;

    void Update () {
        CalcNoise();
    }

    void CalcNoise() {
        MeshFilter mF = GetComponent<MeshFilter>();
        MeshCollider mC = GetComponent<MeshCollider>();
        mC.sharedMesh = mF.mesh;

        Vector3[] verts = mF.mesh.vertices;
        for (int i = 0; i < verts.Length; i++) {
            float pX = (verts[i].x * perlinScale) + (Time.timeSinceLevelLoad * waveSpeed) + offset;
            float pZ = (verts[i].z * perlinScale) + (Time.timeSinceLevelLoad * waveSpeed) + offset;
            verts[i].y = Mathf.PerlinNoise(pX, pZ) * waveHeight;
        }

        mF.mesh.vertices = verts;
        mF.mesh.RecalculateNormals();
        mF.mesh.RecalculateBounds();
    }
}

The easiest way would be to apply the 2D noise to the surface. You can do this by using the vertex normal to know in which direction to move the position.
In your code you only move the Y position. For a plane this means you move each vertex along its normal by the amount the noise function gives you. For a sphere you could do something like this:
verts[i] = verts[i] + (Mathf.PerlinNoise(pX, pZ) * waveHeight * mF.mesh.normals[i].normalized);
This works for any mesh because it takes the original position into account and just moves the vertex along its normal.
It will only push vertices outwards, so they always end up above their original position; you can subtract a simple offset from the noise value to let them move inwards as well:
verts[i] = verts[i] + ((Mathf.PerlinNoise(pX, pZ) - 0.5f) * waveHeight * mF.mesh.normals[i].normalized);
Since you are animating this noise, it's important to start from the original mesh each update rather than from the mesh that already had noise applied to it. So in the end it would look something like this:
using UnityEngine;
using System.Collections;

public class PerlinTerrain : MonoBehaviour
{
    public float perlinScale;
    public float waveSpeed;
    public float waveHeight;
    public float offset;

    Vector3[] baseVertices;

    private void OnEnable()
    {
        // Cache the undeformed vertices once.
        MeshFilter mF = GetComponent<MeshFilter>();
        baseVertices = mF.mesh.vertices;
    }

    void Update()
    {
        CalcNoise();
    }

    void CalcNoise()
    {
        MeshFilter mF = GetComponent<MeshFilter>();

        // Reset to the original shape before applying this frame's noise.
        mF.sharedMesh.vertices = baseVertices;
        mF.sharedMesh.RecalculateNormals();

        Vector3[] verts = mF.sharedMesh.vertices;
        for (int i = 0; i < verts.Length; i++)
        {
            float pX = (verts[i].x * perlinScale) + (Time.timeSinceLevelLoad * waveSpeed) + offset;
            float pZ = (verts[i].z * perlinScale) + (Time.timeSinceLevelLoad * waveSpeed) + offset;
            verts[i] = verts[i] + (Mathf.PerlinNoise(pX, pZ) * waveHeight * mF.sharedMesh.normals[i].normalized);
        }

        mF.sharedMesh.vertices = verts;
    }
}
But you still have one problem: a mesh can contain multiple vertices that share the same position but have different normals (along UV seams, for example). You will have to figure out which vertices are shared and give them a common normal.
Group the vertices by position, take the average of their normals, and displace each unique position only once.
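A minimal sketch of that grouping, assuming exact position equality is enough to identify shared vertices (seam vertices are usually exact duplicates); the class and method names here are just for illustration:

using System.Collections.Generic;
using UnityEngine;

public static class MeshNoiseHelper
{
    // Returns one normal per vertex; vertices that share the exact same position
    // all get the normalized average of their individual normals.
    public static Vector3[] GetSharedNormals(Vector3[] vertices, Vector3[] normals)
    {
        var summed = new Dictionary<Vector3, Vector3>();
        for (int i = 0; i < vertices.Length; i++)
        {
            Vector3 sum;
            summed.TryGetValue(vertices[i], out sum);
            summed[vertices[i]] = sum + normals[i];
        }

        var result = new Vector3[vertices.Length];
        for (int i = 0; i < vertices.Length; i++)
            result[i] = summed[vertices[i]].normalized;
        return result;
    }
}

In CalcNoise you would then displace along these shared normals instead of mF.sharedMesh.normals[i], so duplicated seam vertices move together and the sphere doesn't tear apart.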

Because a 3D version would be called simplex noise:
https://en.wikipedia.org/wiki/Simplex_noise
Perlin is the specific name of the 2D version (Unity's Mathf.PerlinNoise only takes two coordinates).
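If you do want the noise to vary with all three coordinates of a sphere vertex, one common workaround (just a sketch, not a built-in Unity API) is to blend several 2D samples:

using UnityEngine;

public static class Noise3D
{
    // Rough 3D noise built from three 2D Perlin samples.
    // Not true 3D Perlin/simplex noise, but often good enough for wobble effects.
    public static float Sample(Vector3 p)
    {
        float xy = Mathf.PerlinNoise(p.x, p.y);
        float yz = Mathf.PerlinNoise(p.y, p.z);
        float zx = Mathf.PerlinNoise(p.z, p.x);
        return (xy + yz + zx) / 3f;
    }
}

You could then feed verts[i] * perlinScale plus a time offset into Sample instead of building pX/pZ from only the x and z coordinates.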

Related

Unity3d lineRenderer is invisible on some parts

I have 3 spheres and a LineRenderer which is connected to the spheres. I'm using Bezier curves to get smooth curves between the spheres.
The code looks like this:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class LineController : MonoBehaviour
{
    private LineRenderer lineRenderer;
    public Transform p0;
    public Transform p1;
    public Transform p2;

    void Start()
    {
        lineRenderer = GetComponent<LineRenderer>();
        lineRenderer.sortingOrder = 1;
    }

    void Update()
    {
        DrawQuadraticBezierCurve(p0.position, p1.position, p2.position);
    }

    void DrawQuadraticBezierCurve(Vector3 point0, Vector3 point1, Vector3 point2)
    {
        lineRenderer.positionCount = 100;
        float t = 0f;
        Vector3 B = new Vector3(0, 0, 0);
        for (int i = 0; i < lineRenderer.positionCount; i++)
        {
            B = (1 - t) * (1 - t) * point0 + 2 * (1 - t) * t * point1 + t * t * point2;
            lineRenderer.SetPosition(i, B);
            t += (1 / (float)lineRenderer.positionCount);
        }
    }
}
The problem is that the line is invisible on some parts when I move the spheres.
I don't know why this is happening. I'm using the Mixed Reality Toolkit. Ideally the line should be displayed as a 3D object, like a long cube.
Maybe somebody is familiar with this kind of "error"?

How can I find a Plane size and position and spawn random objects over it?

I want to spawn random objects over the terrain and also over the plane, or only on one of them.
Spawning objects on the terrain works, but I'm not sure how to do it with the plane.
To start with, I can't even find how to get the plane's width and length.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class SideQuests : MonoBehaviour
{
    public Terrain terrain;
    public Plane plane;
    public GameObject prefab;
    public GameObject parent;

    [Range(10, 1000)]
    public int numberOfObjects = 10;
    public float yOffset = 10f;

    private float terrainWidth;
    private float terrainLength;
    private float xTerrainPos;
    private float zTerrainPos;

    private float planeWidth;
    private float planeLength;
    private float xPlanePos;
    private float zPlanePos;

    void Start()
    {
        //Get terrain size
        terrainWidth = terrain.terrainData.size.x;
        terrainLength = terrain.terrainData.size.z;

        //Get terrain position
        xTerrainPos = terrain.transform.position.x;
        zTerrainPos = terrain.transform.position.z;

        //Get plane size
        planeWidth = plane.   // <-- this is where I'm stuck

        generateObjectOnTerrain();
    }

    void generateObjectOnTerrain()
    {
        for (int i = 0; i < numberOfObjects; i++)
        {
            //Generate random x,z,y position on the terrain
            float randX = UnityEngine.Random.Range(xTerrainPos, xTerrainPos + terrainWidth);
            float randZ = UnityEngine.Random.Range(zTerrainPos, zTerrainPos + terrainLength);
            float yVal = Terrain.activeTerrain.SampleHeight(new Vector3(randX, 0, randZ));

            //Apply Offset if needed
            yVal = yVal + yOffset;

            //Generate the Prefab on the generated position
            GameObject objInstance = Instantiate(prefab, new Vector3(randX, yVal, randZ), Quaternion.identity);
            objInstance.name = "Quest";
            objInstance.tag = "Quest";
            objInstance.transform.parent = parent.transform;
        }
    }
}
Your code uses Plane, which has an infinite size:
"A plane is an infinitely large, flat surface that exists in 3D space and divides the space into two halves known as half-spaces."
So, to answer your question, the length and width of a Plane are effectively Mathf.Infinity.
You can project any position onto the plane by using Plane.ClosestPointOnPlane. If you can generate random positions, for example with Random.insideUnitSphere, you can generate random points on the plane that way:
float spawnRange = 50f;
Vector3 spawnCenter = Vector3.zero;

Vector3 randomPointOnPlane = plane.ClosestPointOnPlane(spawnCenter + spawnRange * Random.insideUnitSphere);
(Note that if you use insideUnitSphere, you will get more spawns near the center of the sphere than at the edges.)
Then, if you want the objects to spawn at some height above the plane, you can add that:
float spawnHeight = 1f;
Vector3 randomPointOverPlane = Vector3.up * spawnHeight + randomPointOnPlane;
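Putting those two snippets into a loop similar to your generateObjectOnTerrain, a rough sketch (spawnRange, spawnCenter and spawnHeight are made-up values, and plane/prefab/parent/numberOfObjects are the fields from your script):

void generateObjectsOnPlane()
{
    float spawnRange = 50f;             // how far from spawnCenter to scatter points
    float spawnHeight = 1f;             // extra height above the plane
    Vector3 spawnCenter = Vector3.zero;

    for (int i = 0; i < numberOfObjects; i++)
    {
        // Random point near spawnCenter, projected onto the (infinite) plane.
        Vector3 randomPointOnPlane = plane.ClosestPointOnPlane(spawnCenter + spawnRange * Random.insideUnitSphere);

        // Lift it off the plane; for a tilted plane you could use plane.normal instead of Vector3.up.
        Vector3 spawnPosition = randomPointOnPlane + Vector3.up * spawnHeight;

        GameObject objInstance = Instantiate(prefab, spawnPosition, Quaternion.identity);
        objInstance.transform.parent = parent.transform;
    }
}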

How would you lerp the positions of two matrices?

In my code, I create two lists of matrices that store information about two different meshes, using the following code:
Mesh mesh;
MeshFilter filter;

public SphereMesh SphereMesh;
public CubeMesh CubeMesh;

public Material pointMaterial;
public Mesh pointMesh;

public List<Matrix4x4> matrices1 = new List<Matrix4x4>();
public List<Matrix4x4> matrices2 = new List<Matrix4x4>();

[Space]
public float normalOffset = 0f;
public float globalScale = 0.1f;
public Vector3 scale = Vector3.one;
public int matricesNumber = 1; //determines which matrices to store info in

public void StorePoints()
{
    Vector3[] vertices = mesh.vertices;
    Vector3[] normals = mesh.normals;
    Vector3 scaleVector = scale * globalScale;

    // Initialize chunk indexes.
    int startIndex = 0;
    int endIndex = Mathf.Min(1023, mesh.vertexCount);
    int pointCount = 0;

    while (pointCount < mesh.vertexCount)
    {
        // Create points for the current chunk.
        for (int i = startIndex; i < endIndex; i++)
        {
            var position = transform.position + transform.rotation * vertices[i] + transform.rotation * (normals[i].normalized * normalOffset);
            var rotation = Quaternion.identity;
            pointCount++;

            if (matricesNumber == 1)
            {
                matrices1.Add(Matrix4x4.TRS(position, rotation, scaleVector));
            }
            if (matricesNumber == 2)
            {
                matrices2.Add(Matrix4x4.TRS(position, rotation, scaleVector));
            }

            rotation = transform.rotation * Quaternion.LookRotation(normals[i]);
        }

        // Modify start and end index to the range of the next chunk.
        startIndex = endIndex;
        endIndex = Mathf.Min(startIndex + 1023, mesh.vertexCount);
    }
}
The GameObject starts as a Cube mesh, and this code stores the info about the mesh in matrices1. Elsewhere, in code not shown, I change the Mesh to a Sphere, set matricesNumber to 2, and then trigger the code above to store the info of the new Sphere mesh in matrices2.
This seems to be working, as I'm able to use code like Graphics.DrawMesh(pointMesh, matrices1[i], pointMaterial, 0);
to draw 1 mesh per vertex of the Cube mesh. And I can use that same line (but with matrices2[i]) to draw 1 mesh per vertex of the Sphere mesh.
The Question: How would I draw a mesh for each vertex of the Cube mesh (info stored in matrices1) on the screen and then make those vertex meshes Lerp into the positions of the Sphere mesh's (info stored in matrices2) vertices?
I'm trying to hack at it with things like
float t = Mathf.Clamp((Time.time % 2f) / 2f, 0f, 1f);
matrices1.position.Lerp(matrices1.position, matrices2.position, t);
but obviously this is invalid code. What might be the solution? Thanks.
Matrix4x4 is usually only used in special cases, for directly calculating transformation matrices.
You rarely use matrices in scripts; most often Vector3s, Quaternions and the functionality of the Transform class are more straightforward. Plain matrices are used in special cases like setting up non-standard camera projections.
In your case it seems to me you actually only need to store and access position, rotation and scale.
I would suggest not using Matrix4x4 at all, but rather something simple like e.g.
// The [Serializable] attribute makes the class show up in the Inspector,
// which might be very helpful in your case (requires "using System;").
[Serializable]
public class TransformData
{
    public Vector3 position;
    public Quaternion rotation;
    public Vector3 scale;
}
Then with
public List<TransformData> matrices1 = new List<TransformData>();
public List<TransformData> matrices2 = new List<TransformData>();
you could then simply lerp, for example:
var lerpedPosition = Vector3.Lerp(matrices1[index].position, matrices2[index].position, lerpFactor);
And if you then need it as a Matrix4x4, you can still create it ad hoc, e.g.
var lerpedMatrix = Matrix4x4.Translate(lerpedPosition);
or Matrix4x4.TRS(lerpedPosition, lerpedRotation, lerpedScale) if you need all three components, or however else you want to use the values.
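For completeness, a sketch of how the per-vertex lerp and draw could then look, reusing Graphics.DrawMesh and Matrix4x4.TRS from your question (field names follow the TransformData suggestion above):

void Update()
{
    // 0..1 over two seconds, as in the question's attempt.
    float t = Mathf.Clamp01((Time.time % 2f) / 2f);

    int count = Mathf.Min(matrices1.Count, matrices2.Count);
    for (int i = 0; i < count; i++)
    {
        Vector3 position = Vector3.Lerp(matrices1[i].position, matrices2[i].position, t);
        Quaternion rotation = Quaternion.Slerp(matrices1[i].rotation, matrices2[i].rotation, t);
        Vector3 scale = Vector3.Lerp(matrices1[i].scale, matrices2[i].scale, t);

        // Rebuild a matrix only at the point where DrawMesh actually needs one.
        Graphics.DrawMesh(pointMesh, Matrix4x4.TRS(position, rotation, scale), pointMaterial, 0);
    }
}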

Controlling both angles of a Ray

I'm new to Unity and have a problem I can't figure out. I want objects to spawn randomly at a distance of 20 from an FPS player. You could say the objects need to spawn on the surface of a half sphere with the player as the center. But: not all of that sphere can be used. The "highest" part is too high for objects to spawn, so basically it's a sphere with the top cut off.
What I tried:
thePosition = Random.onUnitSphere * 20 + object2.transform.position;
Obviously, this takes into account the whole sphere (should be only half a sphere) and doesn't take into account the "cut off" part.
So I thought: I basically want to make a ray that can pivot on the ground (so the max angle is 360°) and can tilt up and down, with a max angle of 90°. Think of it like a cannon that can turn (pivot) and go up/down at an angle. Here's an image of what I mean:
So I tried:
Vector3 raydirection = new Vector3 (1f, 1f, 0);
raydirection = Quaternion.Euler (45, 0, 0) * raydirection;
Ray ray = new Ray (player.transform.position, raydirection);
thePosition = ray.GetPoint (20);
But that doesn't allow me to control the pivot angle (angle 1) and the "up-down" angle (angle 2) separately.
So my question is: how can I make it so that I can control both angles of this ray? Because if I can do that, I can just take a random number between 0 and 360 for the pivoting part, and between 0 and 90 for the up/down part.
Any help is much appreciated!
Coincidentally, I needed something very similar to this. The following behaviour will spawn a certain prefab (objectToSpawn) exactly spawnCount times within the set parameters.
The helper class (second snippet) generates a vector from a yaw, a pitch and an input vector (basically the distance, in your case).
What it does:
Pick a random direction (yaw and pitch) within set parameters
Pick a random distance (sounds like you can omit this step)
Calculate the vector
Spawn object
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class RandomSpawner : MonoBehaviour {

    public GameObject objectToSpawn;
    public int spawnCount;

    public float minDistance = 2;
    public float maxDistance = 10;
    public float minPitchDegrees = 0;
    public float maxPitchDegrees = 45;
    public float minYawDegrees = -180;
    public float maxYawDegrees = 180;

    void Start ()
    {
        for (int i = 0; i < spawnCount; i++)
        {
            float distance = minDistance + Random.value * (maxDistance - minDistance);
            float yaw = minYawDegrees + Random.value * (maxYawDegrees - minYawDegrees);
            float pitch = minPitchDegrees + Random.value * (maxPitchDegrees - minPitchDegrees);

            Vector3 position = RotationHelper.ConvertYawPitch (Vector3.forward * distance, yaw, pitch);
            Instantiate (objectToSpawn, position, Quaternion.identity);
        }
    }
}
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public static class RotationHelper {

    public static Vector3 ConvertYawPitch(Vector3 vector, float yaw, float pitch)
    {
        Quaternion yawRotation = Quaternion.AngleAxis (yaw, Vector3.up);
        Vector3 yawedZAxis = yawRotation * Vector3.left;
        Quaternion pitchRotation = Quaternion.AngleAxis (pitch, yawedZAxis);

        Vector3 yawedVector = yawRotation * vector;
        Vector3 position = pitchRotation * yawedVector;
        return position;
    }
}
In your specific case, the parameters should be:
minDistance = 20
maxDistance = 20
minPitchDegrees = 0
maxPitchDegrees = 0-90, whatever the angle is after you "cut off the top"
minYawDegrees = -180
maxYawDegrees = 180
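One thing to note: ConvertYawPitch returns an offset from the world origin, so to spawn around the player (as the question asks) you would presumably add the player's position before instantiating, along these lines (the player field is an assumption, not part of the original script):

// Assuming something like: public Transform player;
Vector3 offset = RotationHelper.ConvertYawPitch (Vector3.forward * distance, yaw, pitch);
Instantiate (objectToSpawn, player.position + offset, Quaternion.identity);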
"I want objects to spawn randomly at a distance of 20 from an FPS player."
What I understood from this is that you want to spawn objects on the ground, 20 units away from the player, in random directions.
"You could say the objects need to spawn on the surface of a half sphere with the player as the center."
Now, this is just another way of making things complex. There is no need to use a sphere to solve this.
If you want to spawn objects on the surface, the easiest solution is to pick a random angle around Vector3.up and walk 20 units in that direction to find the desired point.
Script:
using System.Collections;
using UnityEngine;

public class Spawner : MonoBehaviour
{
    public Transform player;
    public Transform prefab;

    [Range(10, 50)]
    public float distance = 20f;

    IEnumerator Start()
    {
        while (true)
        {
            yield return new WaitForSeconds(0.05f);
            Spawn();
        }
    }

    [ContextMenu("Spawn")]
    public void Spawn()
    {
        Vector3 spawnPoint = FindPoint(player.position, distance, Random.Range(0, 360));
        Instantiate(prefab, spawnPoint, Quaternion.identity, transform);
    }

    [ContextMenu("Clear")]
    public void Clear()
    {
        foreach (var item in transform.GetComponentsInChildren<Transform>())
        {
            if (item != transform)
                DestroyImmediate(item.gameObject);
        }
    }

    Vector3 FindPoint(Vector3 center, float radius, int angle)
    {
        // Rotate a radius-long vector around the up axis by the given angle.
        return center + Quaternion.AngleAxis(angle, Vector3.up) * (Vector3.right * radius);
    }
}
Result: random spawn points are calculated based on the player's position.
Hope this helps :)

Unity 5 renderer.material.setTexture("_PARALLAXMAP"); Won't change the heightmap

So I have been working on a piece of code that generates a Perlin noise texture and applies it to a plane in order to create waves. But I can't get it to set the height map texture of the material. I have included material.EnableKeyword("_PARALLAXMAP"); but it does nothing. I have tried this with the normal map as well, without results. Here's the full code:
using UnityEngine;
using System.Collections;

public class NoiseGenerator : MonoBehaviour {

    private Texture2D noiseTex;
    private float x = 0.0F;
    private float y = 0.0F;
    public int scale = 10;
    private Color[] pixels;
    public float speed;
    public float move = 0.0F;

    void Start () {
        Renderer render = GetComponent<Renderer>();
        noiseTex = new Texture2D(scale, scale);

        render.material = new Material(Shader.Find("Standard"));
        render.material.EnableKeyword("_PARALLAXMAP");
        render.material.SetTexture("_PARALLAXMAP", noiseTex);

        pixels = new Color[noiseTex.width * noiseTex.height];
    }

    void Update () {
        float y = 0.0F;
        while (y < noiseTex.height)
        {
            float x = 0.0F;
            while (x < noiseTex.width)
            {
                float xCoord = move + x / noiseTex.width * scale;
                float yCoord = move + y / noiseTex.height * scale;
                float sample = Mathf.PerlinNoise(xCoord, yCoord);
                pixels[Mathf.RoundToInt(y) * noiseTex.width + Mathf.RoundToInt(x)] = new Color(sample, sample, sample);
                x++;
            }
            y++;
        }

        noiseTex.SetPixels(pixels);
        noiseTex.Apply();
        move = move + speed;
    }
}
You need to include a Material that uses this parallax variant somewhere in the project so Unity knows you need it.
It can be a material used in a scene or one placed in a Resources folder.
Otherwise Unity omits the shader variant from the build as unused.
To set the height scale, just use
ur_material.SetFloat("_Parallax", [value]);
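For reference, the Standard shader's keyword and property names differ slightly; a minimal sketch of the usual combination (assuming the Standard shader and the noiseTex texture from the question):

Material mat = GetComponent<Renderer>().material;

mat.EnableKeyword("_PARALLAXMAP");         // enables the parallax shader variant
mat.SetTexture("_ParallaxMap", noiseTex);  // the height map texture property (note the different casing)
mat.SetFloat("_Parallax", 0.05f);          // the height scale float the answer refers to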
