I am building a free climbing system for a 3D game, like in Genshin Impact, but I can't get my player to climb up over the edge.
I use a double raycast to detect whether there is an edge; if there is, the climb animation is triggered and I use MatchTarget, as shown here:
private bool edgeDetect()
{
    RaycastHit firstHit;
    Vector3 position = transform.position + detectionOffcet;

    // First ray: forward from the player to find the wall.
    if (Physics.Raycast(position, transform.forward, out firstHit, 2f))
    {
        // Second ray: from above the wall hit, straight down, to find the top of the ledge.
        if (Physics.Raycast(firstHit.point + (transform.forward * playerRadius) + (Vector3.up * 0.6f * playerHeight), Vector3.down, out var secondHit, playerHeight))
        {
            Targetposition = secondHit.point + edgeOffcet;
            m_Animator.SetTrigger("edge");

            Vector3 climbEnd = secondHit.point;
            m_Animator.MatchTarget(climbEnd, Quaternion.LookRotation(-firstHit.normal), AvatarTarget.Body, new MatchTargetWeightMask(Vector3.one, 0), 0, 0.09f);
            return true;
        }
    }
    return false;
}
But this is not working properly: the player always falls and never reaches the edge, he just gets stuck. Also, if the wall is diagonal, the detection is wrong. How can I fix this?
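Two hedged observations rather than a confirmed fix. First, MatchTarget is only evaluated while the Animator is actually playing the target state (not during the transition into it), so calling it in the same frame as SetTrigger usually does nothing; it normally has to be called on later frames, once the climb state is active. Second, for diagonal walls the second ray can be based on the wall normal instead of transform.forward. The sketch below reuses the names from the snippet above and is an illustration only:

// Sketch only: derive the "into the wall" direction from the first hit's normal,
// so the downward ray still lands on top of the ledge when the wall is at an angle.
Vector3 intoWall = Vector3.ProjectOnPlane(-firstHit.normal, Vector3.up).normalized;
Vector3 secondRayOrigin = firstHit.point
                          + intoWall * playerRadius           // step just past the wall face
                          + Vector3.up * 0.6f * playerHeight; // start above the expected ledge top

if (Physics.Raycast(secondRayOrigin, Vector3.down, out var secondHit, playerHeight))
{
    // secondHit.point is now on top of the ledge, independent of the player's exact facing.
}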
I've written code for an enemy that walks left and right and turns around at ledges, but for some reason the enemy just disappears into thin air. It can still hit me when I walk to where it is located.
Here is the code I've made for the enemy AI:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class PatrollingEnemy : MonoBehaviour
{
    public float speed;
    private bool movingRight = true;
    public Transform groundDetection; // detects the ground if it is there...

    // Update is called once per frame
    void Update()
    {
        transform.Translate(Vector2.right * speed * Time.deltaTime);

        RaycastHit2D groundInfo = Physics2D.Raycast(groundDetection.position, Vector2.down, 2f);
        if (groundInfo.collider == false)
        {
            if (movingRight == true)
            {
                transform.eulerAngles = new Vector2(0, -200);
                movingRight = false;
            }
            else
            {
                transform.eulerAngles = new Vector2(0, 0);
                movingRight = true;
            }
        }
    }
}
Here is the link if you want to see how it works with this code: Link for Bug
It occurs because the z position of the enemy's transform decreases over time until it is less than the camera's z position, and the enemy moves outside the camera's view.
The main problem is that you set transform.eulerAngles = new Vector2(0, -200); when the enemy moves in the left direction. When you combine this with transform.Translate(Vector2.right * speed * Time.deltaTime);, the enemy moves along its own right vector, now rotated by -200 degrees, so after some time it leaves the camera view (the camera's z position is probably the default of -10).
You need to set the local y-axis rotation to 180 instead:
...
if (groundInfo.collider == false)
{
    if (movingRight == true)
    {
        transform.eulerAngles = new Vector2(0, 180); // change -200 to 180
        movingRight = false;
    }
    ...
}
...
ANOTHER WAY: if you want the enemy to face the left side, you can give localScale.x a negative value instead of using rotation.
Here is a simple example:
// Face right when moving RIGHT (make localScale.x positive)
transform.localScale = new Vector3(Mathf.Abs(transform.localScale.x), transform.localScale.y, transform.localScale.z);

// Face left when moving LEFT (make localScale.x negative)
transform.localScale = new Vector3(-Mathf.Abs(transform.localScale.x), transform.localScale.y, transform.localScale.z);

// Note: flipping localScale only mirrors the sprite; transform.right is unchanged,
// so the movement direction still has to be reversed separately (e.g. negate the speed).
I am currently making a Katamari/Billy Hatcher-style game where the player has to roll spheres around. When the game starts the player has normal platformer controls until it approaches a sphere; if the player presses the "attach" button, the player becomes a child of the sphere. The issue I am having is that whenever this happens, the player rotates with the sphere. I tried freezing the player's Rigidbody so it stops rotating, but that just stops the sphere's rotation instead. Is there any way to stop the rotation of the player while keeping the sphere rotating?
Here are my scripts for the process:
Rigidbody hitRB;
public Vector3 offset = new Vector3(0, 0, 1);
public LayerMask pickupMask;
bool isAttached;

private void TouchingGum()
{
    RaycastHit hit = new RaycastHit();

    foreach (GameObject gumball in gumBalls)
    {
        if (Input.GetButtonDown("Attach") && Physics.Raycast(transform.position, transform.forward, out hit, attachRequireDistance, pickupMask))
        {
            isAttached = true;
            Debug.Log(true);
        }
        else
        {
            isAttached = false;
        }
    }

    if (isAttached)
    {
        hitRB = hit.collider.gameObject.GetComponent<Rigidbody>();
        Vector3 heldOffset = transform.right * offset.x + transform.up * offset.y + transform.forward * offset.z;
        hitRB.isKinematic = true;
        hitRB.MovePosition(player.transform.position + heldOffset);
    }
    else if (!isAttached && hitRB != null)
    {
        hitRB.isKinematic = false;
    }
}
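As an aside, and only a sketch with an illustrative component name (not part of the question or the answer below): if you do keep the parent/child setup, one common workaround is to pin the child's world rotation every frame so it does not inherit the sphere's spin.

using UnityEngine;

// Sketch: attach to the player while it is parented to the rolling sphere.
public class KeepWorldRotation : MonoBehaviour
{
    private Quaternion fixedRotation;

    void Start()
    {
        fixedRotation = transform.rotation; // remember the starting world rotation
    }

    void LateUpdate()
    {
        transform.rotation = fixedRotation; // undo whatever rotation the parent applied this frame
    }
}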
If you can, don't use parent/child relationships in these situations; I feel there is always a better way. One way of accomplishing this is by taking the player's transform and adding an offset along its forward direction:
Vector3 offset = new Vector3(0, 0, 1);
public LayerMask pickupMask; // set in the Inspector to the layers that can be picked up
public bool held;
Rigidbody hitRB;

void Update()
{
    RaycastHit hit;
    if (Input.GetKey(KeyCode.Mouse0) && Physics.Raycast(transform.position, transform.forward, out hit, 2, pickupMask))
    {
        held = true;
        hitRB = hit.collider.gameObject.GetComponent<Rigidbody>();
    }
    else
    {
        held = false;
    }

    if (held)
    {
        Vector3 heldOffset = (transform.right * offset.x) + (transform.up * offset.y) + (transform.forward * offset.z);
        // if it still glitches, replace the previous line with:
        // Vector3 heldOffset = transform.forward * offset.z;
        hitRB.isKinematic = true;
        hitRB.transform.position = transform.position + heldOffset;
    }
    else if (hitRB != null)
    {
        hitRB.isKinematic = false;
    }
}
This script uses a raycast and input to detect whether the player is holding the left mouse button while looking at an object within 2 units that is on a certain layer mask. It then sets that object's position to the player's position plus the offset. In other words, it grabs the object you are looking at while holding left click and holds it in front of you (or wherever the offset points).
This question is about Unity3D.
I want to create navigation similar to Google Earth, where you click and drag on a sphere and the camera orbits accordingly. It is important that the point that was grabbed stays under the mouse while dragging. The navigation should also work when I zoom close to the sphere. I do not want to rotate the sphere itself; exactly like Google Earth does it.
My attempt: when I start to drag, I project the mouse position onto the sphere. On the next frame I do the same and calculate the angle between the drag start and drag end positions.
private void RotateCamera(Vector3 dragStart, Vector3 dragEnd)
{
    // calc the rotation of the drag
    float angle = Vector3.Angle(dragStart, dragEnd);

    // rotate the camera around the sphere
    Camera.main.transform.RotateAround(_sphere, Vector3.up, angle);
}
I thought of using Unity's RotateAround method to rotate the camera by the calculated angle. Unfortunately I do not have the rotation axis (using Vector3.up in the example is obviously wrong).
Does somebody know how I can calculate this axis to pass to the method? Am I on the right track to implement the Google Earth navigation?
Thank you!
UPDATE
I am very close with a new solution. I project the drag vectors onto a down plane and a right plane to get the angles, then rotate the camera around up and left. This works well until I reach the poles of the sphere: there the camera rotates a lot around its own axis.
private void RotateCamera(Vector3 dragStart, Vector3 dragEnd)
{
    Vector3 plane = Vector3.down;
    var a = Vector3.ProjectOnPlane(dragStart, plane);
    var b = Vector3.ProjectOnPlane(dragEnd, plane);
    float up = Vector3.SignedAngle(a, b, plane);

    plane = Vector3.right;
    a = Vector3.ProjectOnPlane(dragStart, plane);
    b = Vector3.ProjectOnPlane(dragEnd, plane);
    float left = Vector3.SignedAngle(a, b, plane);

    Camera.main.transform.RotateAround(_sphere, Vector3.up, up);
    Camera.main.transform.RotateAround(_sphere, Vector3.left, left);
}
Turns out it was easier than I expected. I thought about calculating the rotation axis and came to the conclusion that it must be the cross product of the drag start and end vectors. Take a look at the solution; the RotateCamera method is where the math magic happens :)
using UnityEngine;

public class GoogleEarthControls : MonoBehaviour
{
    private const int SphereRadius = 1;

    private Vector3? _mouseStartPos;
    private Vector3? _currentMousePos;

    void Start()
    {
        // init the camera to look at this object
        Vector3 cameraPos = new Vector3(
            transform.position.x,
            transform.position.y,
            transform.position.z - 2);

        Camera.main.transform.position = cameraPos;
        Camera.main.transform.LookAt(transform.position);
    }

    private void Update()
    {
        if (Input.GetMouseButtonDown(0)) _mouseStartPos = GetMouseHit();
        if (_mouseStartPos != null) HandleDrag();
        if (Input.GetMouseButtonUp(0)) HandleDrop();
    }

    private void HandleDrag()
    {
        _currentMousePos = GetMouseHit();
        RotateCamera((Vector3)_mouseStartPos, (Vector3)_currentMousePos);
    }

    private void HandleDrop()
    {
        _mouseStartPos = null;
        _currentMousePos = null;
    }

    private void RotateCamera(Vector3 dragStartPosition, Vector3 dragEndPosition)
    {
        // in case the sphere model is not a perfect sphere..
        dragEndPosition = dragEndPosition.normalized * SphereRadius;
        dragStartPosition = dragStartPosition.normalized * SphereRadius;

        // calc a perpendicular vector to rotate around..
        var cross = Vector3.Cross(dragEndPosition, dragStartPosition);

        // calc the angle for the rotation..
        var angle = Vector3.SignedAngle(dragEndPosition, dragStartPosition, cross);

        // rotate around the vector..
        Camera.main.transform.RotateAround(transform.position, cross, angle);
    }

    /**
     * Projects the mouse position onto the sphere and returns the intersection point.
     */
    private static Vector3? GetMouseHit()
    {
        // make sure there is a sphere mesh with a collider centered at this game object,
        // with a radius of SphereRadius
        RaycastHit hit;
        if (Physics.Raycast(Camera.main.ScreenPointToRay(Input.mousePosition), out hit))
        {
            return hit.point;
        }
        return null;
    }
}
Basic rotation from a mouse drag, building on what you have:
Transform camTransform = Camera.main.transform;

if (Input.GetMouseButton(0))
{
    camTransform.RotateAround(
        currentLookTargetTransform.position, // the point being orbited (the sphere)
        -camTransform.right * Input.GetAxis("Mouse Y") + camTransform.up * Input.GetAxis("Mouse X"),
        120 * Time.deltaTime);
}
You multiply the camera-relative direction by the mouse delta to get the rotation axis. You can then add your clamp limits on top of this; the point here was just to rotate relative to the camera.
I am trying to make a simple game where I shoot a projectile when the user touches the screen. I first spawn 8 projectiles (sprites), and when the user touches the screen I want the top projectile to fly in the direction of the touch. I got this working; however, every time I shoot, the projectile goes in the wrong direction. Here is an image that illustrates the issue.
(The image is a still frame; in the game the object keeps flying until it leaves the screen and is then destroyed.)
Here is the code snippet that handles this:
GameplayController.cs
if (Input.GetMouseButtonDown(0))
{
    Vector3 position = new Vector3(Input.mousePosition.x, Input.mousePosition.y, 0);
    position = Camera.main.ScreenToWorldPoint(position);

    GameObject target;
    target = new GameObject();
    target.transform.position = position;

    Arrow comp = currentArrows[0].GetComponent<Arrow>();
    comp.setTarget(target.transform);
    comp.GetComponent<Arrow>().arrowSpeed = 12;
    comp.GetComponent<Arrow>().shoot = true;

    currentArrows.RemoveAt(0);
    Destroy(target);
}
I know I am getting mouse input here rather than phone touch input; that's fine for me, I will convert it to touch input later.
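For reference, a minimal sketch of the touch equivalent (single touch assumed; this is my illustration, not part of the project):

// Sketch of the touch-input version of the check above.
if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
{
    Vector2 touchPos = Input.GetTouch(0).position; // screen space, like Input.mousePosition
    Vector3 position = Camera.main.ScreenToWorldPoint(new Vector3(touchPos.x, touchPos.y, 0));
    // ...then the same arrow setup as in the mouse branch above
}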
Arrow.cs
public bool shoot = false;
public float arrowSpeed = 0.0f;
public Vector3 myDir;
public float speed = 30.0f;

private Transform target;

// Use this for initialization
void Start () {
}

// Update is called once per frame
void Update () {
    if (shoot)
    {
        transform.position += transform.right * arrowSpeed * Time.deltaTime;
    }
}

public void setTarget(Transform targetTransform)
{
    this.target = targetTransform;

    Vector3 vectorToTarget = target.position - transform.position;
    float angle = Mathf.Atan2(vectorToTarget.y, vectorToTarget.x) * Mathf.Rad2Deg;
    Quaternion q = Quaternion.AngleAxis(angle, Vector3.forward);
    transform.rotation = Quaternion.Slerp(transform.rotation, q, Time.deltaTime * speed);
}

private void OnBecameInvisible()
{
    print("Disappeared");
    Destroy(gameObject);
    Gameplay.instance.isShooting = false;
}
Vector3 position = new Vector3(Input.mousePosition.x, Input.mousePosition.y, 0);
I think your problem is that you're getting the screen coordinates of the click, not the world coordinates, which are two different things. In order to direct your projectile correctly, you need to convert the screen coordinates to world coordinates, as is done here.
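For illustration, a minimal sketch of that conversion (my example, not from the answer), assuming the usual 2D setup where the camera sits at a negative z position and the gameplay happens on the z = 0 plane:

// Sketch: convert the click to a world-space point on the z = 0 plane.
// The z passed to ScreenToWorldPoint must be the distance from the camera
// to the plane you care about, not 0.
Vector3 screenPos = Input.mousePosition;
screenPos.z = -Camera.main.transform.position.z; // e.g. 10 if the camera sits at z = -10
Vector3 worldPos = Camera.main.ScreenToWorldPoint(screenPos);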
The next thing is how you move the projectile:
transform.position += transform.right * arrowSpeed * Time.deltaTime;
You're moving the projectile along its right vector and then rotating it separately. Maybe you should try moving it with Vector3.Lerp, which will be easier, and rotating it with Transform.LookAt.
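As a rough sketch of that suggestion (names like targetPosition and SetTarget are mine, not from the question): the arrow could store the world-space target as a plain Vector3 when it is shot and move toward it each frame, so destroying the temporary target GameObject no longer matters. I use transform.right here rather than Transform.LookAt because a 2D sprite usually points along its local right axis; adjust for your sprite's orientation.

// Sketch only: arrow movement toward a stored world-space point.
public Vector3 targetPosition; // set from the converted touch/click position when shooting
public float arrowSpeed = 12f;
public bool shoot;

void Update()
{
    if (shoot)
    {
        // Constant-speed movement; Vector3.Lerp with a fixed factor would ease in instead.
        transform.position = Vector3.MoveTowards(transform.position, targetPosition, arrowSpeed * Time.deltaTime);
    }
}

public void SetTarget(Vector3 worldTarget)
{
    targetPosition = worldTarget; // copy the position instead of holding a Transform reference
    transform.right = (worldTarget - transform.position).normalized; // aim the sprite at the target once
}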
I want to move an instance of a GameObject along the outline of another GameObject. It's a 2D project in Unity.
My current code:
Vector3 mousePosition = m_camera.ScreenToWorldPoint(Input.mousePosition);
RaycastHit2D hit = Physics2D.Raycast(new Vector2(mousePosition.x, mousePosition.y), new Vector2(player.transform.position.x, player.transform.position.y));

if (hit.collider != null && hit.collider.gameObject.tag == "Player") {
    if (!pointerInstance) {
        pointerInstance = Instantiate(ghostPointer, new Vector3(hit.point.x, hit.point.y, -1.1f), Quaternion.identity);
    } else if (pointerInstance) {
        pointerInstance.gameObject.transform.position = new Vector3(hit.point.x, hit.point.y, -1.1f);
        pointerInstance.gameObject.transform.eulerAngles = new Vector3(0f, 0f, hit.normal.x);
    }
}
Unfortunately, the GameObject doesn't rotate towards the mouse, and the position on the left side of the player object is also sometimes off. I tried using Instantiate() with Quaternion.LookRotation(hit.normal), but no luck either.
Here is a rough sketch of what I want to achieve:
Any help is appreciated. Thanks!
It's better to use a mathematical approach instead of a physical one (raycasting), because with raycasting you have to cast rays repeatedly to find the hit point and rotate your object, which adds overhead to your game.
Attach this script to your instantiated object:
using UnityEngine;
using System.Collections;

public class Example : MonoBehaviour
{
    public Transform Player;

    void Update()
    {
        // Rotating around circle (circular movement depending on mouse position)
        Vector3 targetScreenPos = Camera.main.WorldToScreenPoint(Player.position);
        targetScreenPos.z = 0; // filtering target axis

        Vector3 targetToMouseDir = Input.mousePosition - targetScreenPos;
        Vector3 targetToMe = transform.position - Player.position;
        targetToMe.z = 0; // filtering targetToMe axis

        Vector3 newTargetToMe = Vector3.RotateTowards(targetToMe, targetToMouseDir, /* max radians to turn this frame */ 2, 0);
        transform.position = Player.position + /* distance from target center to stay at */ newTargetToMe.normalized;

        // Look at mouse position
        var objectPos = Camera.main.WorldToScreenPoint(transform.position);
        var dir = Input.mousePosition - objectPos;
        transform.rotation = Quaternion.Euler(0, 0, Mathf.Atan2(dir.y, dir.x) * Mathf.Rad2Deg);
    }
}
Useful explanations
Atan2:
atan2(y,x) gives you the angle between the x-axis and the vector (x,y), usually in radians and signed such that for positive y you get an angle between 0 and π, and for negative y the result is between −π and 0.
https://math.stackexchange.com/questions/67026/how-to-use-atan2
Returns the angle in radians whose Tan is y/x.
Return value is the angle between the x-axis and a 2D vector starting at zero and terminating at (x,y).
https://docs.unity3d.com/ScriptReference/Mathf.Atan2.html
Mathf.Rad2Deg:
Radians-to-degrees conversion constant (Read Only).
This is equal to 360 / (PI * 2).
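As a quick, self-contained illustration of these two (my example, not from the quoted docs):

// The direction (x = 1, y = 1) points 45 degrees above the x-axis.
float angleRad = Mathf.Atan2(1f, 1f);      // ~0.785 radians (pi / 4)
float angleDeg = angleRad * Mathf.Rad2Deg; // ~45 degrees
Debug.Log(angleDeg);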