GameObject:
I have a GameObject "Sphere" with the following properties:
Starting scale of 1.5 (x, y, z).
A script that makes sure that the scale is between 0 and 150.
What do I have:
I have implemented a function that lets the user scale the GameObject using the HTC Vive controllers (we are working in virtual reality).
This function checks the distance between the controllers (usually a value between -1 and 1) to decide whether to scale the object up or down.
With this value between -1 and 1, I scale the GameObject by the value multiplied by a sensitivity (editable in the Unity Editor).
What do I want:
This works fairly well; however, I want the sensitivity to grow with the object's size in a way that isn't hard-coded. When the GameObject is very small, scaling should be slow; when it is fairly big, scaling should go quickly.
What have I tried:
I take the value (between -1 and 1) and multiply it by the sensitivity.
Then I multiply by the current scale divided by the maximum scale.
However, this causes zooming in to be faster than zooming out.
The code I am using looks like this:
float currentControllerDistance = Vector3.Distance(LeftHand.transform.position, RightHand.transform.position);
float currentZoomAmount = currentControllerDistance - ControllersStartPostionDifference; // Value is between -1 and 1.
currentZoomAmount = currentZoomAmount * ScalingSensitivity; // Multiply by the sensitivity set in the Unity Editor.
float currentPercentage = ObjectToScale.transform.localScale.x / ObjectMaximumScale.x; // Current scale as a fraction of the maximum scale.
currentZoomAmount = currentZoomAmount * currentPercentage; // Weight the zoom amount by that fraction.
ObjectToScale.transform.localScale = new Vector3(ObjectCurrentScale.x + currentZoomAmount, ObjectCurrentScale.y + currentZoomAmount, ObjectCurrentScale.z + currentZoomAmount);
Does someone have any idea how to do this kind of scaling?
Thanks in advance.
If I understood the question correctly, you're looking for a way to specify the rate of change of your scaling so that it changes faster when closer to the maximum scale, which sounds like a job for an easing function.
If your project already uses a tweening library like DOTween, this is easily done with that library's built-in easing functions. If not, you can try a simple cubic ease-in, one of the simpler easing curves:
This is simply y = x^3, so you can use ObjectMaximumScale.x * currentPercentage * currentPercentage * currentPercentage to get a value that goes from 0 to ObjectMaximumScale.x as currentPercentage goes from 0 to 1.
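Dropped into the code from the question, that could look roughly like the sketch below (the cubed percentage is the one change; everything else reuses the question's own variables):
float currentControllerDistance = Vector3.Distance(LeftHand.transform.position, RightHand.transform.position);
float currentZoomAmount = (currentControllerDistance - ControllersStartPostionDifference) * ScalingSensitivity;

// Ease with y = x^3: near zero scale the factor is tiny, near the maximum scale it
// approaches 1, so small objects scale slowly and large objects scale quickly.
float currentPercentage = ObjectToScale.transform.localScale.x / ObjectMaximumScale.x;
float eased = currentPercentage * currentPercentage * currentPercentage;
currentZoomAmount *= eased;

ObjectToScale.transform.localScale += new Vector3(currentZoomAmount, currentZoomAmount, currentZoomAmount);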
I'm making a Theremin-like app in Unity (C#).
I have a horizontal X axis on which the user can click (with a mouse, or with a finger on a smartphone).
The position on this X axis determines the frequency that will be played.
The user specifies the frequency range of the board (the X axis), say from 261.63 Hz (note C4) to 523.25 Hz (note C5).
I calculate x_position_ratio, a number between 0 and 1 indicating where the user clicked on the X axis (0 being the far left, note C4 in this example; 1 the far right, note C5).
From this, I calculate the frequency to play with:
float freqRange = maxFreq - minFreq;
float frequency = (x_position_ratio * freqRange) + minFreq;
And then play the frequency. It works just fine.
If I draw the notes on the board (X axis), you can see that the higher the frequency, the bigger the gap between two adjacent notes.
// Drawing just note A4 to demonstrate the code
float a4 = 440.0f; // frequency of note A4
float x_position = (a4 - minFreq) / freqRange;
x_position indicates the position of the note on the X axis, between 0 and 1.
Question:
I would like to make the jump between two notes the same everywhere (make it linear instead of exponential, if you see what I mean). I found the equation on Wikipedia (Piano_key_frequencies), but it is stated in terms of key numbers; I want it to work for any frequency, and I cannot figure out how to implement it in the two code examples I posted.
I figured it out. I tried plotting it logarithmically to at least approximate the result.
I was inspired by this answer: Plotting logarithmic graph.
It turns out this approach worked.
To draw notes on the x-axis I used this:
minFreq = Mathf.Log10(minFreq);
maxFreq = Mathf.Log10(maxFreq);
float freqRange = maxFreq - minFreq;
x_position = (Mathf.Log10(frequencyOfNoteToPlot) - minFreq) / freqRange;
and to calculate the frequency, I solved the x_position equation for the frequency and ended up with this piece of code:
frequency = Mathf.Pow(10, x_position * freqRange + minFreq);
One thing I still don't get is why it doesn't matter which base of logarithm I use. Is it because I'm always working with a ratio (a value between 0 and 1)?
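As for the base: it does indeed cancel out, because log_b(f) = ln(f) / ln(b), so the ln(b) factor appears in both the numerator and the denominator of the ratio. A quick check (a sketch using the raw minFreq/maxFreq, before they are overwritten with their logs, and some test frequency f):
float r10 = (Mathf.Log10(f) - Mathf.Log10(minFreq)) / (Mathf.Log10(maxFreq) - Mathf.Log10(minFreq));
float rE = (Mathf.Log(f) - Mathf.Log(minFreq)) / (Mathf.Log(maxFreq) - Mathf.Log(minFreq)); // natural log
// r10 and rE come out equal (up to floating-point error), so any base gives the same x_position.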
I'd like to find out how far the player is pushing an Xbox controller analogue
stick. I don't need to know the angle it is being pushed, just how far. I'd like it to return a value of 0 to 1. Here are some examples!
If the player was pushing the stick fully down it would return 1.
If the stick was in the center it would return 0.
If they were pushing the stick to its full extent but at a 45 degree angle it would still return 1.
If they were pushing the stick only halfway and at a 135 degree angle it would return 0.5.
Code I have tried so far
new Vector2(Input.GetAxisRaw("Horizontal"),Input.GetAxisRaw("Vertical")).magnitude
and I tried this code, despite not really understanding it :-/
float xAxis = (Input.GetAxis ("Player" + padNumber + "Horizontal"));
float yAxis = (Input.GetAxis ("Player" + padNumber + "Vertical"));
thumbStickDistance = Mathf.Sqrt(Mathf.Abs((xAxis*xAxis)+(yAxis*yAxis)));
However, both of these return different values depending on the angle the stick is pushed at.
You have the answer, but you might think you don't because you are expecting to see different values. The magnitude is what you want.
First of all, your formula for thumbStickDistance is exactly what the magnitude property for a Vector2 does. Second, the difference between GetAxis and GetAxisRaw is that GetAxis applies some smoothing. This is best seen if you use the arrow keys. GetAxis will rise and fall, whereas GetAxisRaw will jump to -1, 0 or 1 immediately. Which you use for the xbox controller doesn't matter much.
Now, both the horizontal and vertical axes are in the range [-1, 1]. If the Xbox controller joystick had a square movement area, you could have both axes at 1 and the maximum distance you could measure would be sqrt(2), or about 1.414. If the controller had a perfectly circular movement area, the maximum distance you could measure would be 1, no matter which direction you pushed it in. As it turns out, although the Xbox controller is closer to circular than square, it is neither: the maximum distance I was able to measure was about 1.16, which means I reached a point just outside the circle around the centre.
So my advice would be to use the magnitude and, if needed, Mathf.Clamp01 to keep the value from exceeding 1. The alternative would be to calibrate the controller, which could be complicated, and it would need to be done each time you switch controllers, as there are bound to be differences even between two otherwise identical Xbox controllers.
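Putting that together (a minimal sketch using the same axis names as in the question):
float x = Input.GetAxisRaw("Horizontal");
float y = Input.GetAxisRaw("Vertical");
// 0 at the centre, 1 at full deflection; the clamp catches the stick's slightly
// non-circular movement area so the value never exceeds 1.
float thumbStickDistance = Mathf.Clamp01(new Vector2(x, y).magnitude);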
I'm developing a 3D spaceshooter in XNA as a school project (basically Asteroids in 3D with power-ups), and have been working to implement roll, pitch, and yaw with respect to the ship's local axes. (I should emphasize: the rotation is not with respect to the absolute/world x, y, and z axes.) Sadly, I've been struggling with this for the last few weeks. Google and my neolithic monkey brain have failed me; maybe you folks can help!
Here's my setup:
Via keyboard input, I have the following variables ready to go:
yawRadians, which stores the desired yaw away from the ship's initial position
pitchRadians, which stores the desired pitch away from the ship's initial position
rollRadians, which stores the desired roll away from the ship's initial position
The ship also maintains its own Front, Back, Right, Left, Top and Bottom unit vectors, which are used both for the rotations and also for propulsion. (Different keys will propel the ship toward the Front, Back, etc. This part is working great.)
Ultimately, I generate the rotation matrix mShipRotation, representing all of the ship's rotations, which is passed to the ship's draw method.
The problem I have is with the rotations themselves. Different solutions I've tried have had differing results. Here's what I've gone with so far:
Method 1 – Yaw, Pitch, and Roll relative to the absolute/world x, y, and z axes
At first, I naively tried using the following in my ship's Update method:
qYawPitchRoll = Quaternion.CreateFromYawPitchRoll(yawRadians, pitchRadians, rollRadians);
vFront = Vector3.Transform(vOriginalFront, qYawPitchRoll);
vBack = -1 * vFront;
vRight = Vector3.Transform(vOriginalRight, qYawPitchRoll);
vLeft = -1 * vRight;
vTop = Vector3.Transform(vOriginalTop, qYawPitchRoll);
vBottom = -1 * vTop;
mShipRotation = Matrix.CreateFromQuaternion(qYawPitchRoll);
(vOriginalFront, vOriginalRight, and vOriginalTop just store the ship's initial orientation.)
The above actually works without any errors, except that the rotations are always with respect to the x, y, and z axes, and not with respect to the ship's Front/Back/Right/Left/Top/Bottom vectors. This results in the ship not always yawing and pitching as expected. (Specifically, yawing degenerates to rolling if you have pitched up so the ship is pointing to the top. This makes sense, as yawing in this solution is just rotating about the world up axis.)
I heard about the Quaternion.CreateFromAxisAngle method, which sounded perfect. I could just combine three quaternion rotations, one around each of the ship's local axes. What could go wrong?
Method 2 – Quaternion.CreateFromAxisAngle
Here's the second code snippet I used in my ship's Update method:
qPitch = Quaternion.CreateFromAxisAngle(vRight, pitchRadians);
qYaw = Quaternion.CreateFromAxisAngle(vTop, yawRadians);
qRoll = Quaternion.CreateFromAxisAngle(vFront, rollRadians);
qPitchYawAndRoll = Quaternion.Concatenate(Quaternion.Concatenate(qPitch, qYaw), qRoll);
vFront = Vector3.Normalize(Vector3.Transform(vOriginalFront, qPitchYawAndRoll));
vBack = -1 * vFront;
vRight = Vector3.Normalize(Vector3.Transform(vOriginalRight, qPitchYawAndRoll));
vLeft = -1 * vRight;
vTop = Vector3.Normalize(Vector3.Transform(vOriginalTop, qPitchYawAndRoll));
vBottom = -1 * vTop;
mShipRotation = Matrix.CreateFromQuaternion(qPitchYawAndRoll);
The above works perfectly if I only do one rotation at a time (yaw, pitch, or roll), but if I combine more than one rotation simultaneously, the ship begins to wildly spin and point in many different directions, getting more and more warped until it disappears entirely.
I've tried variants of the above where I first apply the Pitch to all the vectors, then the Yaw, then the Roll, but no luck.
I also tried it using matrices directly, despite concerns about gimbal lock:
Method 3: Matrices
mShipRotation = Matrix.Identity;
mShipRotation *= Matrix.CreateFromAxisAngle(vRight, pitchRadians);
mShipRotation *= Matrix.CreateFromAxisAngle(vFront, rollRadians);
mShipRotation *= Matrix.CreateFromAxisAngle(vTop, yawRadians);
vFront = Vector3.Normalize(Vector3.Transform(vOriginalFront, mShipRotation));
vBack = -1 * vFront;
vRight = Vector3.Normalize(Vector3.Transform(vOriginalRight, mShipRotation));
vLeft = -1 * vRight;
vTop = Vector3.Normalize(Vector3.Transform(vOriginalTop, mShipRotation));
vBottom = -1 * vTop;
No luck; I got the same behavior. One rotation at a time is okay, but rotating about multiple axes resulted in the same bizarre spinning behavior.
After some brilliant debugging (read as: blindly outputting variables to the console), I noticed that the Front/Right/Top vectors were slowly, over time, becoming less orthogonal to one another. I added Normalization to vectors basically every step of the way, and also tried computing new vectors based on cross products, to try to ensure that they always remained perpendicular to one another, but even then they were not perfectly orthogonal. I'm guessing this is due to floating point math not being perfectly precise.
Note that I regenerate the mShipRotation matrix in every Update call, so it cannot be accumulating drift or inaccuracies directly. I think that applying multiple quaternion rotations may be accumulating error (as I can do one rotation just fine), but my attempts to fix it have not worked.
In short:
I can pitch/roll/yaw relative to the world axes x, y, and z just fine. It's just not what the player would expect, as the rolling/pitching/yawing is relative to the world rather than to the ship.
I can roll, pitch, or yaw around the ship's local axes (Front/Back/Top/Bottom/Left/Right) just fine, but only one at a time. Any combination of them will cause the ship to spiral and deform rapidly.
I've tried Quaternions and Matrices. I've tried suggestions I've found in various forums, but ultimately do not wind up with a working solution. Often people recommend using Quaternion.CreateFromYawPitchRoll, not really realizing that the intent is to have a ship rotate about its own (constantly changing) axes, and not the (fixed) world axes.
Any ideas? Given a situation where you are given the roll, pitch, and yaw about a ship's front, right, and top vectors, how would you go about creating the rotation matrix?
You seem to be applying your overall angles (yawRadians, pitchRadians, rollRadians) to your local axes in your methods 2 and 3. Those values are married to the world axes and have no meaning in local space. The root of your problem is wanting to hang onto the three angles.
In local space, use an angular amount that is the amount you want to rotate between frames. If you only pitched up 0.002f radians since the last frame, that would be what you would use when you rotate around the vRight axis.
This will mess with your overall angle values (yawRadians, pitchRadians, and rollRadians) and render them useless, but most folks who stick with 3D programming quickly drop the angle-based approach to storing orientation anyway.
Simply rotate your matrix or quaternion little by little each frame around your local axis and store the orientation in that structure (the quat or matrix) instead of the 3 angles.
There are no worries about gimbal lock when you rotate a matrix about local axes like this; you would need 90-degree rotations between frames to bring that into the picture.
If you want to avoid error accumulation, use a quat to store the orientation and normalize it each frame. Then the matrix you send to the effect will be rebuilt each frame from the quat and will be orthonormal. Even if you didn't use a quat and stored your orientation in a matrix, it would take hours or days to accumulate enough error to be visually noticeable.
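A minimal sketch of that idea in XNA terms (assumed names: qOrientation is a persistent Quaternion field initialized to Quaternion.Identity, and pitchDelta/yawDelta/rollDelta are this frame's small input angles rather than the accumulated totals):
// This frame's small rotation about the ship's own (model-space) axes.
Quaternion qDelta = Quaternion.Concatenate(
    Quaternion.Concatenate(
        Quaternion.CreateFromAxisAngle(vOriginalRight, pitchDelta),
        Quaternion.CreateFromAxisAngle(vOriginalTop, yawDelta)),
    Quaternion.CreateFromAxisAngle(vOriginalFront, rollDelta));

// Apply the local delta first, then the accumulated orientation, and re-normalize
// so floating-point drift never builds up.
qOrientation = Quaternion.Normalize(Quaternion.Concatenate(qDelta, qOrientation));

mShipRotation = Matrix.CreateFromQuaternion(qOrientation);
vFront = Vector3.Normalize(Vector3.Transform(vOriginalFront, qOrientation));
vRight = Vector3.Normalize(Vector3.Transform(vOriginalRight, qOrientation));
vTop = Vector3.Normalize(Vector3.Transform(vOriginalTop, qOrientation));
vBack = -1 * vFront;
vLeft = -1 * vRight;
vBottom = -1 * vTop;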
This blog might help: http://stevehazen.wordpress.com/2010/02/15/matrix-basics-how-to-step-away-from-storing-an-orientation-as-3-angles/
I think this might be what you're looking for:
http://forums.create.msdn.com/forums/t/33807.aspx
I'm pretty sure that CreateFromAxisAngle is the way to go.
I am building a Kinect SDK WPF application and using the Kinect to move a "cursor"/hand object.
The problem I am having is that at 30 frames a second the cursor jumps around a bit erratically because of the precision of the Kinect (i.e. while holding your hand still, the object moves within a 5px area).
I am planning on writing an algorithm that doesn't simply move the X/Y of my "cursor" sprite to the exact position on the screen, but behaves more like "move the hand towards this X/Y coordinate", so that the movement is smoother.
Can someone point me to a good one that someone else has written, so I can avoid reinventing the wheel?
I understand that this is probably pretty common, but as I am more of a business developer I am not sure of the name for such a feature, so apologies in advance if it's a n00b question.
When I worked with the Kinect, I just used some simple math (which I believe is called linear interpolation) to move to a point some distance between the cursor's current location and its target location. Get the location of the cursor, get the location the user's hand is at (translated to screen coordinates), then move the cursor to some point between those.
float currentX = ..., currentY = ..., targetX = ..., targetY = ...;
float diffX = targetX - currentX;
float diffY = targetY - currentY;
float delta = 0.5f; // 0 = no movement, 1 = move directly to target point.
currentX = currentX + delta * diffX;
currentY = currentY + delta * diffY;
You'll still get jittering, depending on the delta, but it will be much smoother and generally in a smaller area.
On a related note, have you taken a look at the Kinect's skeleton smoothing parameters? You can actually let the SDK handle some of the filtering.
Consider your input values (those jumping positions) as a signal with both low and high frequency parts. The low frequencies represent the rough position/movement while the high frequency parts contain the fast jumping within smaller distances.
So what you need to look for is a low-pass filter. That filters out the high-frequency parts and keeps the rough (but as accurate as the Kinect can get) position, provided you set it up with the right parameter. This parameter is the cutoff frequency of the filter. You have to play around a bit and you will see.
An implementation example for time-discrete values can be found here (originally from Wikipedia):
static final float ALPHA = 0.15f;

protected float[] lowPass(float[] input, float[] output) {
    if (output == null) return input;
    for (int i = 0; i < input.length; i++) {
        output[i] = output[i] + ALPHA * (input[i] - output[i]);
    }
    return output;
}
You can put the last values of both the X and Y components of your position vectors into this function to smooth them out (input[0] for X and input[1] for Y, output[0] and output[1] are results of the previous function call).
Like I already said, you have to find a good balance for the smoothing factor ALPHA (0 ≤ ALPHA ≤ 1):
Too big, and the signal will not get smoothed enough; the effect won't be sufficient.
Too small, and the signal will be smoothed too much; the cursor will lag behind the user's movement with too much inertia.
(If you look at the formula newout = out + alpha * (in - out), you see that with an alpha of 0 you just keep the old out value, so the value never changes, while with an alpha of 1 you get newout = out + in - out = in, which means you don't smooth anything and always take the newest value.)
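Adapted to the cursor case in C#, a sketch might look like this (assumed names: smoothedX/smoothedY are fields that persist between frames, rawX/rawY are the latest hand coordinates mapped to screen space):
const float Alpha = 0.15f; // smoothing factor, tune between 0 and 1

// Run once per Kinect frame; smoothedX/smoothedY carry over from the previous frame.
smoothedX = smoothedX + Alpha * (rawX - smoothedX);
smoothedY = smoothedY + Alpha * (rawY - smoothedY);
// Draw the cursor at (smoothedX, smoothedY) instead of (rawX, rawY).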
One very simple idea for solving this problem would be to display the cursor at a location that's the average of some past number of positions. For example, suppose that you track the last five locations of the hand and then display the cursor at that position. Then if the user's hand is relatively still, the jerkiness from frame to frame should be reasonably low, because the last five frames will have had the hand in roughly the same position and the noise should cancel out. If the user then moves the cursor across the screen, the cursor will animate as it moves from its old position to the new position, since as you factor in the last five positions of the hand the average position will slowly interpolate between its old and new positions.
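A sketch of that idea (hypothetical names; Point is the WPF System.Windows.Point and Queue<T> comes from System.Collections.Generic):
// Keep the last few hand positions and place the cursor at their average.
private const int HistoryLength = 5;
private readonly Queue<Point> history = new Queue<Point>();

private Point Smooth(Point newPosition)
{
    history.Enqueue(newPosition);
    if (history.Count > HistoryLength)
        history.Dequeue();

    double sumX = 0, sumY = 0;
    foreach (Point p in history)
    {
        sumX += p.X;
        sumY += p.Y;
    }
    return new Point(sumX / history.Count, sumY / history.Count);
}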
This approach is very easily tweaked. You could transform the data points so that old points are weighted more or less than new points, and could adjust the length of the history you keep.
Hope this helps!
With reference to this programming game I am currently building.
I wrote the method below to move (translate) a canvas a specific distance, in the direction of its current angle:
private void MoveBot(double pix, MoveDirection dir)
{
    if (dir == MoveDirection.Forward)
    {
        Animator_Body_X.To = Math.Sin(HeadingRadians) * pix;
        Animator_Body_Y.To = ((Math.Cos(HeadingRadians) * pix) * -1);
    }
    else
    {
        Animator_Body_X.To = ((Math.Sin(HeadingRadians) * pix) * -1);
        Animator_Body_Y.To = Math.Cos(HeadingRadians) * pix;
    }

    Animator_Body_X.To += Translate_Body.X;
    Animator_Body_Y.To += Translate_Body.Y;

    Animator_Body_X.From = Translate_Body.X;
    Translate_Body.BeginAnimation(TranslateTransform.XProperty, Animator_Body_X);

    Animator_Body_Y.From = Translate_Body.Y;
    Translate_Body.BeginAnimation(TranslateTransform.YProperty, Animator_Body_Y);

    TriggerCallback();
}
One of the parameters it accepts is the number of pixels that should be covered by the translation.
In the code above, Animator_Body_X and Animator_Body_Y are of type DoubleAnimation and are applied to the robot's TranslateTransform object, Translate_Body.
The problem I am facing is that the robot (which is a Canvas) moves at a different speed depending on the input distance: the longer the distance, the faster the robot moves! To put this in perspective, if the input distance is 20 the robot moves fairly slowly, but if the input distance is 800 it literally shoots off the screen.
I need to make this speed constant, regardless of the input distance.
I think I need to tweak some of the Animator_Body_X and Animator_Body_Y properties according to the input distance, but I don't know exactly what to tweak (I suspect some math is needed as well).
Here is a list of the DoubleAnimation properties that you may want to look at to figure this out.
Is there a reason you're using DoubleAnimation? DoubleAnimation is designed to take a value from A to B over a specific time period using linear interpolation, with acceleration/deceleration at the start/end of that period if required (which is why it's "faster" for longer distances: it has further to go in the same time). By the looks of things, what you are trying to do is move something a fixed distance each "frame" depending on the direction it is facing? That doesn't seem like a good fit to me.
You could calculate the duration of the animation from the distance, so that longer distances get longer durations and the item always moves at the same "speed". To me it makes more sense to just move the item yourself, though: calculate the object's velocity based on your angle, then each "frame" manually move the item as far as it needs to go based on that velocity. With this method you could also easily apply friction etc. to the velocity if required.
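A rough sketch of that manual approach (hypothetical names: SpeedPixelsPerSecond and MoveStep; hook MoveStep to something that ticks every frame, e.g. CompositionTarget.Rendering, and pass in the elapsed seconds):
private const double SpeedPixelsPerSecond = 100; // whatever constant speed you want

private void MoveStep(double elapsedSeconds)
{
    // Velocity derived from the robot's current heading, same convention as MoveBot.
    double vx = Math.Sin(HeadingRadians) * SpeedPixelsPerSecond;
    double vy = -Math.Cos(HeadingRadians) * SpeedPixelsPerSecond;

    Translate_Body.X += vx * elapsedSeconds;
    Translate_Body.Y += vy * elapsedSeconds;
}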
The math you have to do is: velocity*time=distance
So, to keep the speed constant you have to change the animation's duration:
double pixelsPerSecond = 5;
animation.Duration = TimeSpan.FromSeconds(distance/pixelsPerSecond);
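In terms of the question's MoveBot method, that would mean setting the duration from the pix parameter before starting the animations, something like (pixelsPerSecond is a hypothetical constant speed):
double pixelsPerSecond = 100;
Animator_Body_X.Duration = TimeSpan.FromSeconds(pix / pixelsPerSecond);
Animator_Body_Y.Duration = Animator_Body_X.Duration;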
BTW, I don't think animations are the best solution for moving your robots.