Unity circular sector prefab - C#

I need to create a prefab that will allow me to spawn many objects shaped like a circular sector with a given radius and angle. How do I do it?

Since the shapes you want are simple but chosen dynamically, I think the easiest solution is to have a script create the meshes on the fly when you instantiate your prefab.
Your prefab needs a MeshFilter and a custom script. This script has to compute the vertices you want (I assume you know how to compute the vertices from the radius and angle) and assign them to the Mesh.
See the documentation for a complete example of how to use the Mesh class.
You can also find an example project using procedural meshes here.
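For instance, here is a minimal sketch of such a sector-building script. My own assumptions: the sector lies flat in the XZ plane with its tip at the prefab's origin, and the field names (radius, angle, segments) are placeholders you would set before or right after instantiating.

using UnityEngine;

[RequireComponent(typeof(MeshFilter))]
public class CircularSector : MonoBehaviour
{
    public float radius = 2f;
    public float angle = 90f;   // opening angle in degrees
    public int segments = 32;   // more segments = smoother arc

    void Start()
    {
        // One center vertex plus one vertex per sample along the arc.
        var vertices = new Vector3[segments + 2];
        vertices[0] = Vector3.zero;
        for (int i = 0; i <= segments; i++)
        {
            float a = Mathf.Deg2Rad * angle * i / segments;
            vertices[i + 1] = new Vector3(Mathf.Cos(a), 0f, Mathf.Sin(a)) * radius;
        }

        // One triangle per segment, fanning out from the center vertex.
        var triangles = new int[segments * 3];
        for (int i = 0; i < segments; i++)
        {
            triangles[i * 3] = 0;
            triangles[i * 3 + 1] = i + 2;
            triangles[i * 3 + 2] = i + 1;
        }

        var mesh = new Mesh();
        mesh.vertices = vertices;
        mesh.triangles = triangles;
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}

Put this on the prefab next to the MeshFilter (plus a MeshRenderer with a material), set radius and angle on the instance right after Instantiate, and the mesh gets built in Start.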

Related

Unity. "Stretching" of an object instantiated from prefab

I am trying to spawn obstacles on the road. To do so, I generate a road by spawning its parts, and each part itself spawns several obstacles within its own bounds. But some prefabs get a strange "stretching" effect after being instantiated. I don't know how to explain it, so I recorded a small video (link). Also, if I add that same object to the scene by drag and drop, this bug never appears.
This is how I spawn the obstacles:
Vector3 size = GetComponent<Renderer>().bounds.size;
Vector3 pos = new Vector3(Random.Range(-0.3f * size.x, 0.3f * size.x), 30, Random.Range(-0.3f * size.z, 0.3f * size.z));
Debug.Log(pos + transform.position + "");
GameObject newBottle = Instantiate(
    minus_prefabs[Random.Range(0, minus_prefabs.Length)],
    new Vector3(transform.position.x + pos.x, transform.position.y + pos.y, transform.position.z + pos.z),
    Quaternion.Euler(0, 0, 0));
newBottle.transform.SetParent(transform);
This happens because the parent GameObject of the model has different scale values on X, Y and Z, which results in weird stretching. Try going into your prefab and setting the scale of your main GameObject to X:1 Y:1 Z:1, for example.
Also see: https://www.unity3dtips.com/how-to-fix-objects-stretching-when-rotated/
This also happens to me all the time. Thankfully it's easy to fix!
EDIT: When you instantiate the prefab, also make sure it has the same scale on every axis.
As a general rule, whenever possible, don't use any scale other than (1,1,1), and when you do need to, it should be on a leaf node of the hierarchy or prefab, not on a parent node. It will make things go much more smoothly. If you need to change the size of a mesh (because the modeler didn't follow life-sized scaling metrics in their modeling program, or you just want it smaller or larger), you can do that in the Import settings of the FBX.
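To make that concrete, here is a minimal sketch of the layout suggested above. Assumptions on my part: the road part's root keeps scale (1,1,1), the stretched road visuals live on a scaled child, "roadVisual" is a made-up reference to that child, and "minus_prefabs" matches the asker's field name.

using UnityEngine;

public class ObstacleSpawner : MonoBehaviour
{
    public GameObject[] minus_prefabs;
    public Renderer roadVisual;   // the scaled child that actually renders the road part

    void SpawnObstacle()
    {
        Vector3 size = roadVisual.bounds.size;
        Vector3 pos = new Vector3(
            Random.Range(-0.3f * size.x, 0.3f * size.x),
            30f,
            Random.Range(-0.3f * size.z, 0.3f * size.z));

        // The parent is passed straight to Instantiate; because this root keeps a
        // uniform (1,1,1) scale, the spawned obstacle is not stretched.
        Instantiate(
            minus_prefabs[Random.Range(0, minus_prefabs.Length)],
            transform.position + pos,
            Quaternion.identity,
            transform);
    }
}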

Differences between instantiate as GameObject and instantiate as Transform

So I'm following video tutorials on Unity. The first teacher, whom I had for the 2D part, always instantiates the bullets as a GameObject, while the second one, for the 3D part, always instantiates them as a Transform.
The result seems to be the same, but what are the differences? Is it just personal preference?
You can think of the GameObject as a container for different components, and Transform is one specific component.
In the image below, the GameObject is the cube itself, and to that cube we have attached different components such as Transform, Cube (Mesh Filter), Mesh Renderer and Box Collider.
So in your case you can Instantiate by referencing the GameObject as a whole or its Transform component. Because people will usually modify the Transform of the spawned object, they shorten the path a bit and Instantiate it by referring to its Transform directly. It is the same if you Instantiate it as a GameObject; in that case, if you want to modify the Transform of the object, you first need to access that component and then modify it.
There's no significant difference; it's partly personal preference and partly convenience, depending on which one you are going to be referring to more.
It can mean less typing to use Transform if you'll be referring to the transform more, or GameObject if you'll be referring to the GameObject more.
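A minimal sketch of the two variants side by side (the field and method names here are made up for illustration):

using UnityEngine;

public class Shooter : MonoBehaviour
{
    public GameObject bulletPrefabAsGameObject;  // prefab slot typed as GameObject
    public Transform bulletPrefabAsTransform;    // the same prefab, slot typed as Transform
    public Transform muzzle;

    void Fire()
    {
        // Instantiating via the GameObject reference returns a GameObject,
        // so you go through .transform whenever you touch position or rotation.
        GameObject go = Instantiate(bulletPrefabAsGameObject, muzzle.position, muzzle.rotation);
        go.transform.Translate(Vector3.forward);

        // Instantiating via the Transform reference returns a Transform,
        // so transform operations are one step shorter, and .gameObject gets
        // you back to the GameObject when you need it.
        Transform t = Instantiate(bulletPrefabAsTransform, muzzle.position, muzzle.rotation);
        t.Translate(Vector3.forward);
        t.gameObject.name = "Bullet";
    }
}

In both cases the entire GameObject is cloned; only the type of the reference you get back differs.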

Merge colliders of different GameObjects

I'm trying to create an auto-spawner of game objects out of a predefined prefab.
The prefab contains a polygon collider.
The spawner instantiates each new object right at the end of the previous one, so that it looks like a single piece.
I'm trying to understand how to merge their colliders so they act like a single polygon collider defined on the merged object.
Thank you for helping.
By the way, if there's code involved (which I'm sure there is), I would be glad if you could write it in C#. Thank you!
From reading your question and the information provided, I would say that you want one polygon collider to sit exactly on the other. If that is your goal, then when they collide you can use the Transform component and move one to the exact position of the other.
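If the goal really is one collider covering both pieces, here is a minimal sketch of one way to do it. Assumptions: these are 2D objects with a PolygonCollider2D each, and the class/method names are made up. It simply copies the paths of the second collider into the first (converted into its local space) rather than welding overlapping outlines into a single outline.

using UnityEngine;

public static class ColliderMergeUtil
{
    public static void Merge(PolygonCollider2D target, PolygonCollider2D source)
    {
        int firstNewPath = target.pathCount;
        target.pathCount = firstNewPath + source.pathCount;

        for (int p = 0; p < source.pathCount; p++)
        {
            Vector2[] path = source.GetPath(p);
            for (int i = 0; i < path.Length; i++)
            {
                // source local space -> world space -> target local space
                Vector3 world = source.transform.TransformPoint(path[i] + source.offset);
                path[i] = (Vector2)target.transform.InverseTransformPoint(world) - target.offset;
            }
            target.SetPath(firstNewPath + p, path);
        }

        // The source collider is now redundant on the merged object.
        Object.Destroy(source);
    }
}

You would call it once right after spawning the new piece, e.g. ColliderMergeUtil.Merge(existingPiece.GetComponent<PolygonCollider2D>(), newPiece.GetComponent<PolygonCollider2D>());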

How can I make multiple meshes in Unity by script?

I made a water swirl and surface mesh in Unity by script, and now I need to add a wall, which is a simple mesh. But I don't know how to add another mesh. I was looking for something like MeshFilter.AddMesh, but there is no such method. I'm wondering how I can add another mesh or MeshFilter for that wall. I use C#, by the way.
From the description of your problem (which is not entirely clear), it looks like you should create another GameObject with MeshFilter and MeshRenderer components for your wall (you can do it either by hand or by script; see the sketch below).
If for some reason you REALLY want to have several meshes in one MeshFilter, then your only option is to use submeshes (see Mesh.subMeshCount, Mesh.SetIndices, Mesh.SetTriangles, Renderer.materials and maybe Mesh.CombineMeshes).
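Here is a minimal sketch of the first option, building the wall as its own GameObject entirely from script. The quad in BuildWallMesh is just a placeholder for your real wall geometry, and wallMaterial is a made-up field.

using UnityEngine;

public class WallBuilder : MonoBehaviour
{
    public Material wallMaterial;

    void Start()
    {
        // A separate GameObject with its own MeshFilter/MeshRenderer for the wall.
        var wall = new GameObject("Wall");
        wall.transform.SetParent(transform, false);

        var filter = wall.AddComponent<MeshFilter>();
        var renderer = wall.AddComponent<MeshRenderer>();

        filter.mesh = BuildWallMesh();
        renderer.material = wallMaterial;
    }

    Mesh BuildWallMesh()
    {
        // Placeholder geometry: a single 1x1 quad standing in the XY plane.
        var mesh = new Mesh();
        mesh.vertices = new[]
        {
            new Vector3(0, 0, 0), new Vector3(1, 0, 0),
            new Vector3(1, 1, 0), new Vector3(0, 1, 0)
        };
        mesh.triangles = new[] { 0, 2, 1, 0, 3, 2 };
        mesh.RecalculateNormals();
        return mesh;
    }
}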

How do you transform a Skinned Mesh during mesh loading and processing?

I created my own skinned mesh loader. It's working fine, but my problem is that I don't know how to transform (scale and rotate) the skinned mesh so that the transformations are "baked" onto the vertices. If it were just geometry, transforming the vertices would be a piece of cake, but now that skinning info is involved, if I apply a scale, for example, my mesh gets all stretched. I know I need to transform my skinning data too, but which parts? All the bind pose matrices? The inverse bind pose matrices? I can't seem to understand how to go about this.
My implementation is in C# & OpenTK and I am specifically loading Skinned Collada files exported from Blender 2.6.
Thanks in advance.
I don't know C# or OpenTK, but I'll try to help on the theoretical side. Vertices are transformed by a weighted combination of global transform matrices. To form a global transform, you concatenate the local transform of each joint; to create a local transform, you concatenate the local translation, rotation and scale. The weights come from the joint influences on each vertex. So I think you need to get the joint-local rotation/translation/scale of your bind pose, manipulate those local matrices, and combine them into global matrices. After that, apply the weights to the global transforms and then transform the vertices.
The following link may be similar to your question.
COLLADA: Inverse bind pose in the wrong space?
I made this Collada file player, but it uses C++.
http://www.youtube.com/watch?v=bXBfVl-msYw
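To make the "which matrices do I touch" part concrete, here is a minimal sketch of one common way to bake a rigid transform (say, a Blender Z-up to Y-up rotation plus a uniform scale) into already-loaded skinning data. It uses System.Numerics rather than OpenTK, the data layout is a made-up placeholder, and it assumes the row-vector convention (v * M) that System.Numerics uses.

using System.Numerics;

public class SkinnedMeshData
{
    public Vector3[] Vertices;
    public Matrix4x4[] InverseBindPoses;  // one per joint
    public Matrix4x4 RootJointLocal;      // local transform of the skeleton's root joint
}

public static class SkinBaker
{
    public static void Bake(SkinnedMeshData mesh, Matrix4x4 bake)
    {
        if (!Matrix4x4.Invert(bake, out Matrix4x4 bakeInverse))
            throw new System.ArgumentException("Bake transform must be invertible.");

        // 1. Bake the transform into the bind-pose vertices.
        for (int i = 0; i < mesh.Vertices.Length; i++)
            mesh.Vertices[i] = Vector3.Transform(mesh.Vertices[i], bake);

        // 2. Compensate in every inverse bind pose so the skin matrices still take
        //    the transformed vertices back into joint space: IBP' = bake^-1 * IBP.
        for (int j = 0; j < mesh.InverseBindPoses.Length; j++)
            mesh.InverseBindPoses[j] = bakeInverse * mesh.InverseBindPoses[j];

        // 3. Append the transform to the root joint's local transform so every
        //    joint's global transform (and thus the animated result) ends up
        //    transformed as well: rootLocal' = rootLocal * bake.
        mesh.RootJointLocal = mesh.RootJointLocal * bake;
    }
}

With row vectors the skinned position is v * IBP_j * G_j, so after baking it becomes (v * bake) * (bake^-1 * IBP_j) * (G_j * bake) = (v * IBP_j * G_j) * bake, i.e. the original skinned result transformed by the bake matrix, without any stretching.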
