I'm making a game in Unity where the main camera is controlled by the phone's orientation. The problem I have is that the gyroscope's data is very noisy and makes the camera rotate in a very jittery way. I tried on different phones to make sure the hardware is not the problem (Galaxy S6 and Sony Xperia T2).
I've tried the following but none seems to work:
-Slerp between the current rotation and the new attitude (too much jitter no matter what I multiply Time.deltaTime by)
//called at each update
transform.rotation = Quaternion.Slerp(transform.rotation,
    new Quaternion(-Input.gyro.attitude.x, -Input.gyro.attitude.y,
                   Input.gyro.attitude.z, Input.gyro.attitude.w),
    60 * Time.deltaTime);
-Average the last gyro samples (either an Euler-angle average or a quaternion average; both cases still show too much jitter no matter how many samples I track)
//called at each update
if (vectQueue.Count >= 5)
{
    vectQueue.Enqueue(Input.gyro.attitude.eulerAngles);
    vectQueue.Dequeue();
    avgr = Vector3.zero; // reset the accumulator, otherwise it grows every frame
    foreach (Vector3 vect in vectQueue)
    {
        avgr = avgr + vect;
    }
    avgr = avgr / vectQueue.Count;
    transform.rotation = Quaternion.Slerp(transform.rotation, Quaternion.Euler(avgr), Time.deltaTime * 100);
}
else
{
    vectQueue.Enqueue(Input.gyro.attitude.eulerAngles);
}
-Round the gyroscope data (best solution so far to prevent jitter, but obviously this isn't smooth at all)
-Apply a high/low-pass filter (doesn't seem to do anything)
public float alpha = 0.5f;
private Vector3 filtered;
private bool initialized;

//called at each update
Vector3 raw = Input.gyro.attitude.eulerAngles;
if (!initialized) // a Vector3 is a value type, so a null check can never fire
{
    filtered = raw;
    initialized = true;
}
else
{
    // standard exponential low-pass: blend the new sample into the running value
    // (note: lerping Euler angles misbehaves across the 0/360 wrap)
    filtered = alpha * raw + (1 - alpha) * filtered;
}
transform.rotation = Quaternion.Euler(filtered);
-Copy-paste code from other people, like this; the jitter is way too strong in all cases.
So now I'm at a point where I guess I'll try learning Kalman filters and implementing one in Unity, but obviously I'd prefer a black-box solution to save time.
Always apply noise cancellation when using analogue input. You can do so by calculating the difference between the gyro values in the current frame and the gyro values in the previous frame, and only rotating your camera when the difference is greater than some threshold (0.002 or 0.03 might be good).
This should solve your jittering problem. Hope you get it working.
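A minimal sketch of that dead-zone idea, with one assumption: here the per-frame difference is measured as an angle in degrees via Quaternion.Angle, whereas the figures above (0.002–0.03) suggest raw component deltas, so the threshold value needs tuning. The class and field names are placeholders, not from the question:

```csharp
using UnityEngine;

public class GyroDeadZone : MonoBehaviour
{
    public float threshold = 0.5f;   // degrees; tune for your device's noise floor
    private Quaternion lastAttitude = Quaternion.identity;

    void Update()
    {
        Quaternion attitude = Input.gyro.attitude;
        // Angle between the previous applied sample and the current one.
        float delta = Quaternion.Angle(lastAttitude, attitude);
        // Only apply the rotation when the change exceeds the dead zone,
        // so sensor noise below the threshold never reaches the camera.
        if (delta > threshold)
        {
            transform.rotation = new Quaternion(-attitude.x, -attitude.y,
                                                attitude.z, attitude.w);
            lastAttitude = attitude;
        }
    }
}
```

The trade-off of a pure dead zone is visible stepping on slow movements, which is why it is usually combined with a smoothing filter like the Slerp approach above.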
Found the problem: all of the approaches I mentioned earlier are valid under normal circumstances (use a combination for optimal effect). The thing that was causing the insane jitter was the position of the camera. It had very large numbers as coordinates, and I guess this was causing floating-point precision problems in the engine.
Hello. Although I think you have already found a solution, I had an idea about how to smooth it out using Quaternion.Lerp:
// rot is used to face forward
Quaternion rot = new Quaternion(0, 0, 1, 0);
transform.rotation = Quaternion.Lerp(transform.rotation * rot, gyro * rot, Time.deltaTime * 10);
I am trying to get a car to move by itself in Unity (C#). I can get the car to travel forwards by using:
transform.position += transform.forward * Time.deltaTime * speed;
However, I'm not sure how to get the car to turn. I would like it to behave in the following way:
Imagine a car enters a crossroads (a square of size 1x1 with negligible height) at point (0.5, 0, 0). I'd like it to exit the crossroads at point (0, 0, 0.5), simulating the turning of the vehicle in a circular pattern.
Does anybody know how to do this? I am getting problems with my mathematics.
As always, all help is appreciated :).
First of all you've got to detect the car's position:

if (car.transform.position == wantedPosition) {
    turning = true; // don't rotate in a while loop; it would block the frame forever
}
if (turning && Mathf.Abs(car.transform.eulerAngles.z - wantedRotation) > 0.1f) {
    car.transform.Rotate(new Vector3(0f, 0f, rotationSpeed * Time.deltaTime)); // rotates Z
}

wantedPosition = new Vector3(0.5f, 0f, 0f), as you wanted it
the axis (X/Y/Z) = I don't know which one of them you want to rotate around (probably Z)
wantedRotation = the rotation you would like to have at the end of the 'autopilot'
rotationSpeed = rotates at the speed you want... start low and keep testing until you reach a number you like
Sorry for not coding everything right now... I just gave you the overall logic. Hit me up if you have any doubts, or send me the whole code and maybe I'll adjust it for you.
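The circular exit described in the question can also be done directly with Transform.RotateAround, pivoting around the inside corner of the crossroads. A sketch under the question's geometry (enter at (0.5, 0, 0) heading +z, exit at (0, 0, 0.5) heading -x, so the pivot is the corner at the origin and the turn radius is 0.5); the class and field names are made up for the example:

```csharp
using UnityEngine;

public class QuarterTurn : MonoBehaviour
{
    public float speed = 2f;            // forward speed in units/second
    private Vector3 pivot = Vector3.zero; // inside corner of the crossroads
    private float radius = 0.5f;
    private float turned;               // degrees turned so far

    void Update()
    {
        if (turned < 90f)
        {
            // Arc length s = r * theta, so angular speed = speed / radius (rad/s).
            float degreesPerSecond = (speed / radius) * Mathf.Rad2Deg;
            float step = Mathf.Min(degreesPerSecond * Time.deltaTime, 90f - turned);
            // Negative angle: in Unity's left-handed system this carries
            // (0.5, 0, 0) around +Y toward (0, 0, 0.5).
            transform.RotateAround(pivot, Vector3.up, -step);
            turned += step;
        }
        else
        {
            // Turn finished; continue straight along the new heading.
            transform.position += transform.forward * speed * Time.deltaTime;
        }
    }
}
```

RotateAround moves the position and spins the heading in one call, which is exactly the "circular pattern" the question asks for; the forward speed along the arc stays constant because the angular step is derived from speed / radius.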
I'm creating an FPS game in Unity and chose not to use the included First Person Controller script, to reduce complexity. The camera object is constantly set to the player's position and can rotate. I have created a script containing the following code for movement:
float v = Input.GetAxis("Vertical");
float h = Input.GetAxis("Horizontal");
Vector2 rotation = new Vector2(
    Mathf.Cos(cam.transform.eulerAngles.y * deg2rad),
    Mathf.Sin(cam.transform.eulerAngles.y * deg2rad)
);
rb.velocity = new Vector3(
    (-rotation.x * h * speed) + (rotation.y * v * speed),
    rb.velocity.y,
    (-rotation.y * h * speed) + (-rotation.x * v * speed)
);
When I test the game, the movement is correct along the x-axis in both directions, but becomes wrong whenever the player's y rotation is anything other than aligned with the x-axis (for example, moving the player backwards will actually move them forwards and vice versa).
I'm open to alternatives to using trig functions for the movement; I have already tried transform.forward and transform.right, but they didn't work entirely.
First thing I'd say is that unless you're intending to learn trig and geometrical functions then you should not reinvent the wheel. As you've stated you want to create a FPS game so really you should leverage the scripts and prefabs that others have created to enable the creation of your game.
If you don't want to use the inbuilt FPS Controller script I'd recommend using a free Asset package named '3rdPerson+Fly'. It appears a bit complex at first however you'll learn about states and stackable behaviours/modes which is going to get you an outcome much faster than creating from scratch. You'll also get the flexibility and openness that comes with a non-inbuilt package. This way you can peek at the inner workings if desired or simply build on top of them. Don't fall for NIH (Not Invented Here) syndrome and stand on the shoulders of giants instead :)
Good luck!
The issue you're having is likely caused by the fact that sin and cos can't determine by themselves which "quadrant" they're in. For example, a 30 degree angle gives the same sine as a 150 degree angle.
This video is a fast and good explanation of the issue:
https://www.youtube.com/watch?v=736Wkg8uxA8
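As a sketch of the transform-based alternative the asker already mentioned (assuming the same cam, rb, speed, h and v variables from the question), flattening the camera's forward and right vectors onto the ground plane sidesteps the quadrant problem entirely, since no angles are ever extracted:

```csharp
// Project the camera's axes onto the horizontal plane so pitching the
// camera up or down doesn't change movement speed, then build the
// velocity from those axes directly instead of from sin/cos of an angle.
Vector3 forward = cam.transform.forward;
forward.y = 0f;
forward.Normalize();

Vector3 right = cam.transform.right;
right.y = 0f;
right.Normalize();

Vector3 horizontal = (forward * v + right * h) * speed;
rb.velocity = new Vector3(horizontal.x, rb.velocity.y, horizontal.z);
```

If transform.forward "didn't work entirely" before, the usual culprit is skipping the flatten-and-normalize step, which makes the player slow down while looking up or down.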
I am trying to rotate two Quaternions by the same amount (using the same Quaternion). I have an absoluteRotation and a relativeRotation (relative to its parent). I want to rotate the absoluteRotation towards another Transform's absoluteRotation, and then update the stored relative rotation as well as the stored absoluteRotation to match again:
void LateUpdate()
{
    Quaternion newAbsRotation = Quaternion.RotateTowards(absRotationChain[0], boneChain[0].rotation, 0.5f);
    Quaternion modX = Quaternion.Inverse(absRotationChain[0]) * newAbsRotation;

    if (newAbsRotation == (absRotationChain[0] * modX)) {
        Debug.Log("Equal!");
    } else {
        Debug.Log("Unequal!");
    }

    absRotationChain[0] = newAbsRotation;
    // absRotationChain[0] = (absRotationChain[0] * modX); // doesn't work
}
However, I am unable to create a relative Quaternion for the rotation between both absolute Quaternions. I've read in multiple sources that it is supposed to be q' = q1^(-1) * q2, and in fact the code above never prints "Unequal!". However, this changes when I replace
absRotationChain[0] = newAbsRotation
with
absRotationChain[0] = (absRotationChain[0] * modX)
which is supposed to be equal. Instead of just producing the same result, the quaternion quickly (within about half a second) decays to a (0, 0, 0, 0) quaternion. If I also replace the third parameter of RotateTowards (0.5f) with 11f, my quaternion becomes (NaN, NaN, NaN, NaN).
I've worked on this problem for hours now and can't find the reason for the issue, let alone a solution :(
I finally solved this problem!!
Obviously Unity has a precision problem with Quaternions.
First of all, the Quaternion == Quaternion comparison I made isn't exact, just an approximation (they even write it in the docs!). So I found out that the two were actually different at a very, very insignificant digit. I guess Unity doesn't properly normalize the Quaternions.
In the end, using
Quaternion.Euler((absRotationChain[0] * modX).eulerAngles)
proved to be a decent workaround -.-'
Edit: Oh look, knowing that it is a normalization issue brought me this link from a guy having the same problem: https://gamedev.stackexchange.com/questions/74437/unity-quaternion-rotate-unrotate-error
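Since the root cause is the repeated product drifting away from unit length, explicitly re-normalizing the accumulated quaternion is a more direct fix than the Euler round-trip. A sketch; the Normalized helper is written out by hand because older Unity versions expose no quaternion normalize:

```csharp
using UnityEngine;

static class QuaternionUtil
{
    // Rescale a quaternion to unit length so repeated multiplication
    // doesn't drift toward zero (or blow up to NaN).
    public static Quaternion Normalized(Quaternion q)
    {
        float mag = Mathf.Sqrt(q.x * q.x + q.y * q.y + q.z * q.z + q.w * q.w);
        return new Quaternion(q.x / mag, q.y / mag, q.z / mag, q.w / mag);
    }
}

// Usage, in place of the Euler workaround:
// absRotationChain[0] = QuaternionUtil.Normalized(absRotationChain[0] * modX);
```

This avoids the gimbal-related artifacts that converting through Euler angles can introduce near the poles.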
I have code I wrote for displaying a mirror: a plane whose texture is drawn from a RenderTarget2D each frame.
This all works fine, but I still think there is something wrong with the way the reflection behaves (as if the mirror isn't looking exactly where it's supposed to be looking).
There's a screenshot of the mirror that doesn't look too bad; the distortion mainly occurs when the player gets close to the mirror.
Here is my code for creating the mirror texture, notice that the mirror is rotated by 15 degrees on the X axis.
RenderTarget2D rt;
...
rt = new RenderTarget2D(device, (int)(graphics.PreferredBackBufferWidth * 1.5), (int)(graphics.PreferredBackBufferHeight * 1.5));
...
device.SetRenderTarget(rt);
device.Clear(Color.Black);
Vector3 camerafinalPosition = camera.position;
if (camera.isCrouched) camerafinalPosition.Y -= (camera.characterOffset.Y * 6 / 20);
Vector3 mirrorPos = new Vector3((room.boundingBoxes[8].Min.X + room.boundingBoxes[8].Max.X) / 2, (room.boundingBoxes[8].Min.Y + room.boundingBoxes[8].Max.Y) / 2, (room.boundingBoxes[8].Min.Z + room.boundingBoxes[8].Max.Z) / 2);
Vector3 cameraFinalTarget = new Vector3((2 * mirrorPos.X) - camera.position.X, (2 * mirrorPos.Y) - camerafinalPosition.Y, camera.position.Z);
cameraFinalTarget = Vector3.Transform(cameraFinalTarget - mirrorPos, Matrix.CreateRotationX(MathHelper.ToRadians(-15))) + mirrorPos;
Matrix mirrorLookAt = Matrix.CreateLookAt(mirrorPos, cameraFinalTarget, Vector3.Up);
room.DrawRoom(mirrorLookAt, camera.projection, camera.position, camera.characterOffset, camera.isCrouched);
device.SetRenderTarget(null);
And then the mirror is being drawn using the rt texture.
I suppose something isn't completely right with the reflection math or the way I create the LookAt matrix. Thanks for the help.
I didn't use XNA, but I did some Managed C# DirectX a long time ago, so I don't remember much; still, are you sure mirrorLookAt should point at cameraFinalTarget? Matrix.CreateLookAt builds a matrix out of from/to/up vectors, and 'to' in your example is the point where the mirror aims. You need to calculate a vector from the camera position to the mirror position and then reflect it, and I don't see that in your code.
Unless your room.DrawRoom method calculates another mirrorLookAt matrix, I'm pretty sure your mirror target vector is the problem.
edit: Your reflection vector would be
Vector3 vectorToMirror = mirrorPos - camera.position;
Vector3 mirrorReflectionVector = vectorToMirror - 2 * Vector3.Dot(vectorToMirror, mirrorNormal) * mirrorNormal;
Also I don't remember whether mirrorReflectionVector should be inverted (whether it points toward the mirror or away from it). Just check both ways and you'll see. Then you create your mirrorLookAt from (note that CreateLookAt wants a target point, so the reflection vector is added onto the mirror's position):
Matrix mirrorLookAt = Matrix.CreateLookAt(mirrorPos, mirrorPos + mirrorReflectionVector, Vector3.Up);
Though I don't know where the normal of your mirror is. Also, I've noticed one line I can't really understand:
if (camera.isCrouched) camerafinalPosition.Y -= (camera.characterOffset.Y * 6 / 20);
What's the point of that? If your camera is crouched, shouldn't its Y value be lowered already? I don't know how you render your main camera, but look at the mirror's rendering position: it's way lower than your main eye. If you want to lower the camera, just write yourself a Crouch() method or something similar that lowers the Y value a little. Later on you call your DrawRoom method and pass camera.position, yet it's not "lowered" by the crouch value; it's just the "pure" camera.position. That may be the reason it's not rendering properly. Let me know if that helped you at all.
For a project my team and I have been trying to track a wiimote in a 3D space using the built in accelerometer and the WiiMotion Plus gyroscope.
We’ve been able to track the rotation and position using an ODE solver (found at http://www.alglib.net/), but we’ve run into a problem with removing the gravity component from the accelerometer.
We looked at Accelerometer gravity components, which had the following formula (implemented in C# / XNA):
private Vector3 RemoveGravityFactor(Vector3 accel)
{
    float g = -1f;
    float pitchAngle = Rotation.Z;
    float rollAngle = Rotation.Y;
    float yawAngle = Rotation.X;

    float x = (float)(g * Math.Sin(pitchAngle));
    float y = (float)(-g * Math.Cos(pitchAngle) * Math.Sin(rollAngle));
    float z = (float)(-g * Math.Cos(pitchAngle) * Math.Cos(rollAngle));

    Vector3 offset = new Vector3(x, y, z);
    return accel - offset;
}
But it doesn’t work at all. For reference, the acceleration comes straight from the accelerometer, and the rotation is measured in radians after being run through the ODE.
Also, we're having trouble understanding how this formula works. Since our tracking takes all dimensions into account, why is yaw not involved?
Thanks in advance for any advice or help that is offered.
EDIT:
After discussing it with my teammates and boss, we've found that this formula would actually work if we were using X, Y, and Z correctly. We've hit another stumbling block, though.
The problem is that the Wiimote library we're using returns relative rotational values based on the gyroscope's movement. In other words, if the buttons are facing up, rotating the wiimote left and right is yaw; and if the buttons are facing toward you, yaw reads the same, when it SHOULD be the rotation of the entire wiimote.
We've found that Euler angles may be our answer, but we're unsure how to use them appropriately. If there is any input on this new development or any other suggestions please give them.
I'd bet that your accelerometer was not calibrated in zero gravity, so removing the effect of gravity will be difficult, at the very least.
First, I'd suggest not storing the rotation as individual components (gimbal lock); a matrix would work better. Calibrate by holding the device still and measuring (it will read 1g downward). Then, for each rotation, multiply the rotation matrix by it. Then you can tell which way is up and subtract a vector of 1g pointing down from the vector representing the acceleration. I know that doesn't make a lot of sense, but I'm in a bit of a rush; add comments if you have questions.
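A sketch of that idea in the question's XNA types; the class, field, and method names here are placeholders, and the per-sample gyro deltas are assumed to already be in radians:

```csharp
using Microsoft.Xna.Framework;

class WiimoteTracker
{
    // Maintained orientation of the wiimote, updated every gyro sample by
    // multiplying in the incremental rotation for that sample.
    Matrix orientation = Matrix.Identity;

    public void IntegrateGyro(float pitchDelta, float yawDelta, float rollDelta)
    {
        // Incremental rotation for this sample, in radians.
        Matrix step = Matrix.CreateFromYawPitchRoll(yawDelta, pitchDelta, rollDelta);
        orientation = step * orientation;
    }

    public Vector3 RemoveGravity(Vector3 accel)
    {
        // Gravity is 1g straight down in the world frame. Rotate it into the
        // wiimote's local frame (transpose = inverse for a rotation matrix),
        // then subtract it from the raw accelerometer reading.
        Vector3 gravityWorld = new Vector3(0f, -1f, 0f);
        Vector3 gravityLocal = Vector3.Transform(gravityWorld, Matrix.Transpose(orientation));
        return accel - gravityLocal;
    }
}
```

Because the full orientation matrix carries yaw implicitly, this also answers why the pitch/roll-only formula above felt incomplete: gravity itself is unchanged by yaw, but any error in the maintained orientation (including yaw drift) leaks into the subtraction.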