I am working on an infinite runner game for Android using Unity. The player runs automatically and has to avoid obstacles by touching the screen, which changes the gravity. But sometimes, when he hits the corner/side of a platform, the player gets stuck. I already applied a "noFriction" material to the player, but it still happens sometimes.
(The small BoxCollider2D under the player just checks whether the player is touching the ground.)
Here are some pictures:
You could add a Physics Material 2D that has its Friction and Bounciness set to 0. That will make you slide off platforms if you just hit them from the side.
You should also make sure to set the Rigidbody2D's Collision Detection to Continuous.
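If you prefer wiring this up from code, here is a minimal sketch, assuming the Rigidbody2D and the main Collider2D both sit on the player object (the class name is just illustrative):

    using UnityEngine;

    // Sketch: create a frictionless, non-bouncy material and enable continuous
    // collision detection on the player at startup. Attach to the player object.
    public class FrictionlessSetup : MonoBehaviour
    {
        void Awake()
        {
            var rb = GetComponent<Rigidbody2D>();
            rb.collisionDetectionMode = CollisionDetectionMode2D.Continuous;

            // Equivalent of assigning a "noFriction" Physics Material 2D in the Inspector.
            var mat = new PhysicsMaterial2D("NoFriction") { friction = 0f, bounciness = 0f };
            GetComponent<Collider2D>().sharedMaterial = mat;
        }
    }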
My 2D game is not lagging, but for some reason the entire game world (except the player) jitters when the player/camera moves. I tried parenting the camera to the player, and I tried using a script to move the camera to the player, but it didn't help. It is worse when the framerate is lower or when there are small frame drops. I use velocity to move the player. Using FixedUpdate (for the player and the camera) didn't help either; it just makes my player not jump every time I press the jump button. I tried searching but didn't find a solution.
Honestly this is a really difficult and tedious problem to troubleshoot.
I've had issues previously with moving a game object by its transform when it had a rigidbody.
I've had issues where a rigidbody parent had a child with its own rigidbody. Never do this: each hierarchy should have one and only one rigidbody, and it should be at the root level. A rigidbody will "inherit", i.e. consider, the colliders at all levels at or below it in the hierarchy.
I've had issues with a rigidbody that didn't have a collider.
I've had issues where the camera was attached to the rigidbody.
I've had issues where the interpolation settings were not set (a one-line fix for that is sketched right after this list).
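On the interpolation point, it is just a dropdown on the Rigidbody2D in the Inspector, but for completeness here is the same thing from code (a sketch; attach it to whatever object owns the rigidbody):

    using UnityEngine;

    // Sketch: enable interpolation so the rendered position is smoothed
    // between fixed physics steps; this is often what fixes "everything
    // except the player jitters" symptoms.
    public class EnableInterpolation : MonoBehaviour
    {
        void Awake()
        {
            GetComponent<Rigidbody2D>().interpolation = RigidbodyInterpolation2D.Interpolate;
        }
    }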
I'd really need to browse through the project and look at the setup to give any kind of specific feedback. Otherwise I'll say that rigidbodies and rigidbody motion are such a pain in the ass that I would recommend NOT using them unless you absolutely have to.
You can do collision checks as triggers without rigidbodies. You can code your own movement and gravity system pretty easily for 2D. The only thing you really need physics for is if you want to solve physics-based collisions, and there I would again say don't use it if you can help it. You can also programmatically add a rigidbody, like for a combat knockout, and add the knockout force and direction, but all the rest of the combat is using scripted motions.
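To show the shape of what I mean by scripted movement and gravity, here is a bare-bones sketch; the field names and values are made up, and the ground check is left for you to implement (e.g. with a trigger or a Physics2D.Raycast):

    using UnityEngine;

    // Sketch: move a 2D character without relying on the physics solver.
    // Gravity and velocity are integrated by hand each frame.
    public class ScriptedMover2D : MonoBehaviour
    {
        public float gravity = -20f;   // illustrative value
        public float jumpSpeed = 8f;   // illustrative value
        Vector2 velocity;
        bool grounded;                 // set this from your own ground check

        void Update()
        {
            if (grounded && Input.GetButtonDown("Jump"))
                velocity.y = jumpSpeed;

            velocity.y += gravity * Time.deltaTime;
            transform.Translate(velocity * Time.deltaTime);
        }
    }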
I just purchased the A* Pathfinding asset from the Unity Asset Store, and I have a few questions about getting it set up. My current problem is that the AI Destination Setter only follows the player's "0th" position: it goes to wherever my player starts and does not update to the player's current position. (See image below.)
I have a game object prefab called "Essentials Loader" that I drag into every scene. It holds things like my Player prefab, Game Manager, UI Overlay, Audio Manager, and other things that are needed in every scene. I set it up this way for two reasons: 1. It prevents me from forgetting to drag anything important into a new scene. 2. If I completely redo the player prefab, I only need to swap it out in the Essentials Loader and not in every single scene.
The code in this Essentials Loader checks for accidental duplicates and deletes them to prevent unwanted issues between scenes.
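For context, the duplicate check is just a standard singleton-style guard, roughly along these lines (a simplified sketch, not my exact code):

    using UnityEngine;

    // Sketch: keep exactly one Essentials Loader alive across scenes and
    // destroy any accidental duplicate that was dragged into a scene.
    public class EssentialsLoader : MonoBehaviour
    {
        static EssentialsLoader instance;

        void Awake()
        {
            if (instance != null && instance != this)
            {
                Destroy(gameObject);   // a copy already exists, remove this one
                return;
            }
            instance = this;
            DontDestroyOnLoad(gameObject);
        }
    }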
Temporary solution: I have taken the player prefab out of the Essentials Loader and dragged it into the scene. Now, when I assign the "AI Destination Setter" Target transform to the player from the scene (rather than from the Project window), it works just fine and follows the player correctly.
My question is: is there a way to avoid doing it this way? I do plan on completely changing the player prefab a little while down the road, and I would hate to have to make that change in every single scene. I would like to keep the player prefab in the Essentials Loader and still have the enemy follow me.
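To illustrate the kind of setup I am hoping for, something like the sketch below would let the enemy find the player at runtime instead of needing a hand-assigned scene reference. It assumes the player object is tagged "Player" and uses the AIDestinationSetter component from the A* package; I have not actually tried this yet:

    using UnityEngine;
    using Pathfinding;   // A* Pathfinding Project

    // Sketch: grab the player at runtime so the prefab inside the
    // Essentials Loader still works without per-scene wiring.
    public class FollowPlayerAtRuntime : MonoBehaviour
    {
        void Start()
        {
            var player = GameObject.FindWithTag("Player");
            if (player != null)
                GetComponent<AIDestinationSetter>().target = player.transform;
        }
    }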
Instantiated objects are stuck at their initial position, even though this was working normally before and nothing in the code changed.
I am making a spaceship-based game, and I was able to instantiate prefabs normally until I started adding layers so that instantiated objects of different kinds do not collide with each other. But now when I run the game, my instantiated objects get stuck at the starting location of their launch.
You probably messed up your Layers a little bit.
To fix this problem go to the Physics 2D Manager (Edit > Project Settings > Physics 2D).
At the bottom you can see the layer collision matrix. (The fancy looking thing with checkboxes)
Because I don't know which layers you have I'm using "Player" as the layer for your player spaceship and "PlayerBullets" as the layer for the bullets your spaceship shoots.
You can now disable collision between the player layer and the player bullet layer.
If your enemies can also shoot, I would create a new Layer called "Bullets". The player layer should be able to collide with enemy bullets, but not with its own bullets.
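If you prefer doing this from code instead of ticking boxes in the matrix, the same thing can be expressed with Physics2D.IgnoreLayerCollision. A small sketch, assuming the layer names above actually exist in your project:

    using UnityEngine;

    // Sketch: disable collisions between the player and its own bullets at startup.
    // The layer names are assumptions; use whatever layers you created.
    public class LayerCollisionSetup : MonoBehaviour
    {
        void Awake()
        {
            int player = LayerMask.NameToLayer("Player");
            int playerBullets = LayerMask.NameToLayer("PlayerBullets");
            Physics2D.IgnoreLayerCollision(player, playerBullets, true);
        }
    }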
I've been plugging away at a capital ship real-time strategy game, and I'm getting ready to start implementing a ground-to-space invasion system -- but I can't seem to wrap my head around how ground unit pathfinding would work if a quadtree LOD system is being used.
Most ground-to-space games I've played seem to generate mesh colliders on the fly as you get closer to the ground and the quadtree subdivides; with that come all the NPCs that pop into existence for the first time and use those colliders. I'd like my units to have a persistent presence as the player zooms in and out from the planet, but I feel that dynamic terrain and mesh colliders would mess up unit pathfinding, or the units might glitch in and out of the terrain, for example.
Does anyone know if this issue has been tackled in a game before? My gut tells me that you'd need a persistent mesh collider, which would mean the planet couldn't have much variation (to keep the collider poly count down).
Thanks for your time!
I'm working on a real-time multiplayer game in Unity. The connection between two players is established with Google Play Game Services.
A simple example of an action is shooting from a slingshot: the player pulls the rubber with his finger and makes the shot. The only important initial data is the position of the rubber before the player releases it, so its coordinates are sent to the other device. After that, the projectile gets a velocity (Rigidbody2D.velocity = new Vector2(...)) and flies.
The problem is that, with the same initial data and the same code being executed, the projectile can hit the target on one device and miss on the other.
Unity's physics engine is not deterministic and may simulate differently on different devices. Still, I used it in a simple manner for a physics-based 2D game and it worked out pretty well. Here are the steps I followed:
1. Before each move, I made sure every physics object was in the same place across the networked devices.
2. I added force to the player object on the local player, then sent the force position and direction over the network to the other devices, which then simulated the move accordingly.
3. After the move was complete (in other words, once all the rigidbodies had stopped), I double-checked whether the positions and rotations were the same on all devices. If not, I aligned them to match the local player who made the move.
I repeated the above steps. The 3rd step is just to be 100% sure that all objects are in sync on all devices before I take the next move.
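A rough sketch of how steps 2 and 3 can look in code. SendToPeers and OnPeerMessage are hypothetical stand-ins for whatever your networking layer (Google Play Game Services in your case) actually provides:

    using UnityEngine;

    // Sketch of the move/sync flow described above.
    public class SlingshotSync : MonoBehaviour
    {
        public Rigidbody2D projectile;

        // Step 2: apply the shot locally, then send the same data to the other device.
        public void Fire(Vector2 launchVelocity)
        {
            projectile.velocity = launchVelocity;
            SendToPeers("fire", launchVelocity);
        }

        // Step 3: once everything has stopped, the player who made the move
        // broadcasts the final positions so both devices end up identical.
        public void BroadcastFinalState()
        {
            SendToPeers("sync", (Vector2)projectile.transform.position);
        }

        void OnPeerMessage(string kind, Vector2 data)
        {
            if (kind == "fire")
                projectile.velocity = data;               // simulate the remote shot
            else if (kind == "sync")
                projectile.transform.position = data;     // hard correction
        }

        void SendToPeers(string kind, Vector2 data) { /* your transport goes here */ }
    }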