I am planning to make a city-building game in the vein of SimCity 2000 and Zeus in Unity. While looking at Unity's Terrain component and its experimental terrain API, I was under the impression that painting textures over Terrain at runtime would be trivial, since it is easily done in the Editor, but it turns out it is not.
What is the approach to painting roads, or any other texture, over Terrain at runtime? Can it be achieved using Unity's Terrain and its API? How did those old games implement this?
I am aware of assets in the Unity Asset Store which can be used to achieve this, but I am looking to learn the underlying concept.
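For reference, Unity's Terrain does expose its splatmap (the per-texel texture weights the Editor paint tools edit) at runtime through `TerrainData.GetAlphamaps`/`SetAlphamaps`. A minimal sketch of painting a road texture into a patch of terrain, assuming the Terrain has at least two TerrainLayers assigned (the layer index and method name `PaintRoad` are illustrative, and bounds clamping is omitted):

```csharp
using UnityEngine;

// Minimal sketch: paint terrain layer 1 (e.g. a road texture) into a
// square patch of the splatmap at runtime. Assumes the Terrain has at
// least two TerrainLayers; indices and names are illustrative.
public class RuntimeTerrainPainter : MonoBehaviour
{
    public Terrain terrain;
    public int roadLayerIndex = 1;

    // worldPos: centre of the patch to paint; size: patch size in alphamap texels
    public void PaintRoad(Vector3 worldPos, int size)
    {
        TerrainData data = terrain.terrainData;

        // Convert the world position into alphamap coordinates.
        Vector3 local = worldPos - terrain.transform.position;
        int mapX = (int)(local.x / data.size.x * data.alphamapWidth) - size / 2;
        int mapZ = (int)(local.z / data.size.z * data.alphamapHeight) - size / 2;

        // Read the existing splat weights, overwrite them, write them back.
        // The weights across all layers should sum to 1 at each texel.
        float[,,] maps = data.GetAlphamaps(mapX, mapZ, size, size);
        for (int z = 0; z < size; z++)
            for (int x = 0; x < size; x++)
                for (int layer = 0; layer < data.alphamapLayers; layer++)
                    maps[z, x, layer] = (layer == roadLayerIndex) ? 1f : 0f;

        data.SetAlphamaps(mapX, mapZ, maps);
    }
}
```

This is essentially what the Editor paint tools do; the older games worked differently, typically compositing road tiles directly into the ground texture or drawing them as separate tile sprites on top of it.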
I am creating a game that uses 2D sprites on 3D terrain. I am using the Universal Render Pipeline with the 2D Renderer so I can use 2D lighting. The 2D objects are placed on 3D terrain and angled to create an illusion of 3D space out of 2D objects, similar to "Don't Starve Together".
To use 3D terrain with the 2D Renderer, you need to convert every single material and texture, which is easily done. For the terrain to work, you need to create a new material and attach a URP shader to it (Universal Render Pipeline > Terrain > Lit). And here comes the problem: the terrain suddenly becomes completely invisible and I am not able to paint on it. According to tutorials this should work perfectly fine, and in the videos it does.
After two days of searching I found nothing, so I had to come up with my own ideas. I have one, but it is not working. My question is whether you have seen a similar problem before and know how to solve it: either the material issue itself, or how to get my idea working.
Idea:
My idea is to use two cameras at once: one using the 2D Renderer for the 2D objects, and a second one using the standard URP renderer for the terrain. The terrain is now on its own layer; the camera with the 2D Renderer ignores the terrain layer, and the one with the URP renderer ignores everything except the terrain layer. But for some reason it still is not working.
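For what it's worth, the layer split in the two-camera idea is normally wired up through `Camera.cullingMask`. A minimal sketch, where the layer name "Terrain" is an assumption and each camera's renderer (2D vs standard URP) is assumed to be assigned in its inspector:

```csharp
using UnityEngine;

// Sketch of the two-camera layer split described above. The layer name
// "Terrain" is an assumption; which URP renderer each camera uses (the
// 2D Renderer vs the standard one) is set in the Camera inspector.
public class CameraLayerSplit : MonoBehaviour
{
    public Camera spriteCamera;  // set to use the 2D Renderer
    public Camera terrainCamera; // set to use the standard URP renderer

    void Start()
    {
        int terrainMask = LayerMask.GetMask("Terrain");

        // The sprite camera renders everything except the terrain layer;
        // the terrain camera renders only the terrain layer.
        spriteCamera.cullingMask = ~terrainMask;
        terrainCamera.cullingMask = terrainMask;
    }
}
```

Note that in URP the two images are combined with camera stacking (one Base camera plus an Overlay camera) rather than the built-in pipeline's depth/clear-flags trick, and whether the 2D Renderer supports stacking depends on the URP version, which could be why the setup "still is not working".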
Any help would be appreciated. I would have provided pictures but the servers are broken again.
I am trying to build a VR application with portals into other worlds. These portals do not have to be flat, however; they can be bent. Here https://www.youtube.com/watch?v=EpoKwEMtEWc is a video from an early version.
Right now the portals are 2D planes with camera textures which stretch and move according to the user's position. For VR use, and to ease performance issues, I would like to define which area of the screen gets rendered by which camera: sort of like a viewport in regular Unity, but not rectangular, and for VR with Windows Mixed Reality.
Is this somehow possible? Possibly even without Unity Pro?
Novice Unity user here. I want to create a pixel-art 2D RPG for fun in Unity. I ran into an issue with my pixel art when dragging it into the Scene window: the Game window distorts the pixel sizes to be either twice as wide or twice as narrow in the x and y directions. I have no clue how to fix this and have looked all over. I would appreciate any feedback, but I would prefer a Discord call so that I can share my screen and get efficient troubleshooting advice. Thank you!
There are quite a few tutorials and even helper packages to deal with Pixel Art games in Unity:
Achieve Crisp Pixel Art with Unity 2018.2
Pixel Perfect Package
Making A Flappy Bird Style Game
By default, Unity will import your images as Textures and resize them to the nearest power of two, which should explain the distortion you observed.
A bit more advanced: if you want all of your images to import as Sprites, you can save a Sprite import configuration as a Preset and then assign it to the TextureImporter under Project Settings > Preset Manager. See Reusing property settings and Preset Manager in the Unity manual.
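As an alternative to Presets, the same defaults can be enforced in code with an `AssetPostprocessor` (a different mechanism, editor-only). A sketch, where the `Assets/Sprites` folder filter is just an example:

```csharp
using UnityEngine;
using UnityEditor;

// Editor-only script (place it in an "Editor" folder). Forces images
// under Assets/Sprites to import as point-filtered, uncompressed
// sprites so pixel art is neither resized nor blurred. The folder
// path is an example; adjust it to your project layout.
public class PixelArtImporter : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        if (!assetPath.StartsWith("Assets/Sprites")) return;

        TextureImporter importer = (TextureImporter)assetImporter;
        importer.textureType = TextureImporterType.Sprite;
        importer.filterMode = FilterMode.Point;        // crisp pixels, no bilinear blur
        importer.textureCompression = TextureImporterCompression.Uncompressed;
        importer.npotScale = TextureImporterNPOTScale.None; // no power-of-2 resize
    }
}
```

This runs automatically on every (re)import, so it also covers images teammates add later without anyone having to remember to apply the Preset.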
hth.
I'm currently facing a problem with WPF 3D using C#. To put it simply, I need to animate some simple mechanical parts by moving only two of them (one at a time, or both together). Here is a simple drawing depicting the situation:
So by vertically moving (translating) the P1 and/or P2 parts, the whole assembly needs to move accordingly.
I guess it may be possible to do this by computing a lot of angles and applying numerous transformations, but that is not my goal.
Instead, I would imagine attaching the parts together by means of pivot points.
What is the preferred way to do this to preview it using WPF 3D?
WPF 3D, Ogre, Mogre, OpenTK, etc. are display libraries. They have nothing to do with mechanical constraint calculations, but they pair well with physics engines.
WPF 3D is the subset of WPF dedicated to 3D drawing. If you only need 2D, plain WPF is enough.
As your project looks 2D, you might want to have a look at Farseer Physics, which is a port of Box2D. The feature you need is called joints. Both libraries target 2D game development, but they can be used for simple kinematic animations, and Farseer Physics works very well with WPF.
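A rough sketch of the joint approach in Farseer: pin two rigid links together with a revolute joint, then drive one body and let the solver move the rest. The factory overloads differ between Farseer versions, so treat the exact signatures, anchor coordinates, and names below as illustrative rather than drop-in:

```csharp
using FarseerPhysics.Dynamics;
using FarseerPhysics.Factories;
using Microsoft.Xna.Framework;

// Illustrative sketch: two links attached by a revolute joint (pivot).
// Translating one link makes the other follow through the constraint,
// with no manual angle computation. Signatures vary by Farseer version.
public static class LinkageDemo
{
    public static World BuildLinkage()
    {
        var world = new World(new Vector2(0f, -9.82f)); // gravity

        // Two rectangular links (width, height, density).
        Body linkA = BodyFactory.CreateRectangle(world, 2f, 0.2f, 1f);
        Body linkB = BodyFactory.CreateRectangle(world, 2f, 0.2f, 1f);
        linkA.BodyType = BodyType.Dynamic;
        linkB.BodyType = BodyType.Dynamic;
        linkB.Position = new Vector2(2f, 0f);

        // Pivot the end of linkA to the start of linkB: the joint keeps
        // them attached while letting them rotate around the anchor.
        JointFactory.CreateRevoluteJoint(world, linkA, linkB, new Vector2(-1f, 0f));

        return world;
    }

    // Advance the simulation one fixed step; call this on a timer and
    // copy each Body's Position/Rotation into your WPF transforms.
    public static void Tick(World world) => world.Step(1f / 60f);
}
```

The WPF side then just reads `Position` and `Rotation` from each `Body` every tick and applies them as `TranslateTransform`/`RotateTransform` on the drawn parts.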
It's a simple problem for any 2D kinematics package.
http://books.google.com/books?id=IGtIWmM2GWIC&pg=PR12&lpg=PR12&dq=c%23+kinematics&source=bl&ots=eCJZLq_i6R&sig=wC42cNOdtw4VX9ElTk4IBDAYtzc&hl=en&sa=X&ei=3YkXU4u1EeHu2wXum4GYDA&ved=0CFsQ6AEwBQ#v=onepage&q=c%23%20kinematics&f=false
I have a 2D plane that is generated through script, and I need it to have a material on it that will work as a light.
I am making a side-scrolling game with a generated 2D torch, which is the plane. The environment colour and the skybox are pitch black, and my 2D torch needs to light up the 3D background. So far I have gotten almost desirable effects with a particle additive material, but the contrast is weak and it will not help if the environment is black.
Will I need to develop my own shader? I have never done this before and I'd prefer not to. Or is there a simple material solution that I can use?
I am using Unity 3.5.5 Free.
A particle system won't light anything up. Your only out-of-the-box solution is a pixel point light.
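A minimal sketch of attaching such a light to the generated torch plane; the range, intensity, and colour values are placeholders to tune:

```csharp
using UnityEngine;

// Sketch: attach a pixel point light to the generated torch plane so it
// actually illuminates the surrounding 3D geometry. Range, intensity,
// and colour values are placeholders.
public class TorchLight : MonoBehaviour
{
    void Start()
    {
        Light torch = gameObject.AddComponent<Light>();
        torch.type = LightType.Point;
        torch.color = new Color(1f, 0.6f, 0.2f); // warm orange
        torch.range = 8f;
        torch.intensity = 1.5f;

        // Force per-pixel rendering so the light is not downgraded to a
        // vertex light by the quality settings' pixel-light limit.
        torch.renderMode = LightRenderMode.ForcePixel;
    }
}
```

The additive particle material can stay on the plane purely as the visible flame; the point light is what brightens the black 3D background around it.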