Making sense of Unity PPU and compression

Unity and Pixels per unit

No, I don’t have the answer – yet – but I will. I’m still new to Unity and trying to get my head round this one aspect; the rest is simple, in comparison. PPU is not straightforward, at all. I wouldn’t say it’s complicated, but it is. Ask the experts and you’ll get the answer “it depends.” Anything with a preset default whose correct value depends on variables needs fixing, I reckon.

For now (having written this article), I think the best solution is to stick with the default (100) and scale images down to fit the needs, as below. The bard (in a box) is set to a scale of 176×176, which corresponds to the average human height (1.75m, or 5’9″). The little green box is obviously one unit in size (100px, 1m x 1m). As for the two miners, one is the default (it’s imported at 2048; it looks better at the native 4096, but that’s a massive file size), while the smaller, game-sized miner is scaled down to 0.275, or a little over a quarter.

The miner and some other assets are from GameDeveloperStudio (who I particularly like), and the 2D Jungle Side-Scrolling Platformer Pack, the latter of which is free from the asset store.


Mulling it over

On face value, it is simple: 1 unit (one square) in the scene is 1 metre in the game world. So a human will be around 1.75 units tall, while a giraffe would be 2 units tall.

The default setting for PPU is 100, so, for a (now) standard HD 1920×1080 screen setting you have a visible working area of 19.2 x 10.8 units, which is 19.2m x 10.8m, or (rounding) roughly 63 foot x 35 foot.
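The arithmetic can be sketched in a few lines (plain Python, just to illustrate the sums – there’s no Unity API involved here):

```python
# Visible world area for a given screen resolution and pixels-per-unit.
def visible_world_units(screen_px, ppu=100):
    """How many world units (metres, by convention) fit in screen_px pixels."""
    return screen_px / ppu

print(visible_world_units(1920))  # 19.2 units wide
print(visible_world_units(1080))  # 10.8 units tall
```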

Makes perfect sense, thus far.

But in Unity you add in an image or texture to your sprites – and it all goes to pot. Proverbially and literally, because maths. In game design, it seems, images are not a ‘resolution’ or a ‘height and width’ (as is the case in say web design or printing), but either POT or NPOT.

So, 2^n: e.g. 2^1, 2^2, 2^3, 2^4, … 2^11
2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048, etc.

But it’s also not that simple, ‘cos different algorithms can have different, well, algorithms! In the case of DXT5, while appearing to nod towards POT, and working with NPOT, it also needs the dimensions to be perfectly divisible by 4. So, 2^1 = 2 would presumably fail, ‘cos 2/4 = 0.5, a fraction; 20 is not a power of 2, but 20/4 = 5, so it’s accepted.

In simpler terms, if I am reading this right, a png of size 256 x 256 is great, 512 x 64 pixels is also fine, but an image of size 511 x 64 or, worse, 255 x 255 is going to cause issues. But it’s not even that simple because the game engine wants to change the file, and doesn’t always know how to change it correctly, so it fudges and gets in a tizz.
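That rule is easy to sanity-check yourself. A throwaway sketch (my own helper, not anything Unity exposes):

```python
# DXT5 wants both dimensions to be exact multiples of 4.
# (Every power of two from 4 upwards qualifies automatically.)
def dxt5_friendly(width, height):
    return width % 4 == 0 and height % 4 == 0

print(dxt5_friendly(256, 256))  # True
print(dxt5_friendly(512, 64))   # True
print(dxt5_friendly(511, 64))   # False
print(dxt5_friendly(255, 255))  # False
print(dxt5_friendly(20, 20))    # True - not POT, but 20/4 = 5
print(dxt5_friendly(2, 2))      # False - 2/4 is a fraction
```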

You think, aha, I got this, I’ll import it and set the… Nope.
Try to get software to fudge it and, as Katsbits points out (link below), you get glitches. It might look right – until it doesn’t, you see, ‘cos “it depends”. Rather a problem if you aren’t artistically inclined and can’t make your own graphics. But I know a man, so I’ll get him to make them. Only my artistic skills (or lack thereof) are not the problem, ‘cos I’m using commercial graphics perfectly designed for this purpose. And they are giving me issues. The pictures are not the problem; my ignorance, while a factor, is not the problem. The problem is that PPU is a fudge, and it is buggy.

There’s a great explanation of this here, especially in relation to 3D: Katsbits: Make better textures, the ‘Power of two’ rule & proper image dimensions

 

The guidance appears to be “just pick a side and stick with it”. So, import everything at the one PPU, typically 100, as it makes rescaling easier if needed later. e.g. answers.unity: Doubt with pixels per unit

What you should have in mind is to get a match between the game world and the size of your textures. IMHO it’s much more convenient to have a fixed ppu for a particular target resolution.

BUT, the guy doesn’t actually answer the question he was asked, but fudges it by saying, ‘so make your own graphics, problem solved’. Well, no.

I am following a number of courses and tutorials on Unity, some by academics (e.g. Patrick Felicia, Ph.D: Learn to Create Games), others by experienced games developers, such as Jason Weimann; they all seem to skirt the problem! Part of it, I think, is that they have worked it out, and just do it, without thinking about it, a bit like riding a bicycle. But “just get on and pedal” does not tell you how to ride a bicycle, not without a few nasty falls.

For the moment I’ve stopped programming the game I was working on until I solve this conundrum. See, you watch their carefully scripted, pre-prepared lessons and it looks so easy. You copy what they did, using the files they suggest, and it all works. So you start thinking, “OK, I’m getting this, I’ll try…”

But when you import other stuff into Unity and start to play yourself, things change; scale can go out of whack, ‘cos PPU. I don’t think they do it on purpose, but I’ve yet to find anyone or anything giving a clear answer to such a simple question (e.g. “How do I add ‘this’ picture?”). Especially given the entire purpose of a game engine is to let you add ‘this’ image and do ‘something’ with it.

So, I started a new one to play around with, thinking I was getting there. PPU is a height thing; the width (and z-axis) get applied after, sort of. So I added four sprites: a man, a hedgehog, a worm and a snail. No beggering about – new import, 100 PPU, 1:1 scale.

And this happened:

The screen filling I’m not too bothered about – you can change that with the camera – I was more interested in import scaling. I have no idea why the 224px-high worm towers over the 270px-high hedgehog. The snail (319px), miner (291px) and ‘hog (270px) I can understand.

Playing around, finding a workaround (more on that in a minute), and mulling it over in my head:

Another suggestion for designers:
Design with POT and DXT5 in mind. I could be very wrong here (still researching it), but game engines, and in particular Unity’s ‘pixels per unit’, are right fussy over texture and sprite-sheet sizes, in a “make the problem go away” sort of way. It’s all to do with the power of 2 and being divisible by four.

It’s not you, it’s me
‘Me’ in this case being Unity itself, or at least recent iterations of it. At first I thought it was the images I was using – maybe they were at fault – but no, they are correct. The indented lines below were me mentally having a conversation with artists.

This is an observation and suggestion, not a critique; the issue appears to lie in white-space surrounds as much as anything else.
Here’s an example: Snail (2048×1272), fine (‘cos 2048/4 = 512, and 1272/4 = 318),
Miner (2048×1512), also fine (‘cos 1512/4 = 378),
but the hedgehog is 2048×1625 (1625/4 = 406.25) and the worm is 2048×1201 (1201/4 = 300.25).

But if the hedgehog and worm were just 1 pixel narrower, no error warnings, ‘cos (1624/4 = 406), (1200/4 = 300).
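Running the four sheets through the same divisible-by-four sum confirms which ones trip the warning:

```python
# The four sprite sheets above, checked against the multiple-of-4 rule.
sheets = {
    "snail":    (2048, 1272),
    "miner":    (2048, 1512),
    "hedgehog": (2048, 1625),
    "worm":     (2048, 1201),
}
for name, (w, h) in sheets.items():
    ok = (w % 4 == 0) and (h % 4 == 0)
    print(f"{name}: {w}x{h} -> {'fine' if ok else 'DXT5 will complain'}")
```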

You can also get around this by dropping the Max Size on import (say from 2048 to 1024), which might resize it more acceptably.

However, as you’ll see, the images are all fine; Unity is just breaking them! It might be fair to argue that it’s my fault because I’m a noob and I should know that the default settings don’t always work, but it still bounces back to Unity: if the defaults are flawed they should not be the defaults; if the defaults are buggy, they need debugging.

Before (with hedgehog): “Only textures with width/height being multiples of 4 can be compressed to DXT5 format”

After (with a worm):

As you can see, in the case of the worm, importing at a lower resolution dropped it to 1024×600, which it was happy with. I suspect that if I put the hedgehog sheet into Paintshop or Adobe and shaved off 1 pixel it MIGHT be happy with it then.

BUT the problem here is this: the native png size for the hedgehog (a moment before dropping it into Unity) is 2984×2368. Unity here is defaulting to scaling it down to 2048, and is then complaining that the file is the wrong size – after it rescaled it incorrectly. It was perfectly fine as (2984/4 = 746) x (2368/4 = 592).

If you import the 2984×2368 file with a Max Size of 4096 in Unity, it’s happy.

But set it to 256, 512, etc., and it just does not like it. It complains every time.

Unless you tell it NOT to use automatic (not to guess), but instead to use (say) 2048 with RGBA 32 bit. It is still at 2048×1625, which it complained about before, but now the error warning is gone:

Still struggling, if I’m honest, but this helps, for Unity3DTips:

Unity Texture Compression Options
Max Size
This will set the maximum resolution allowed for this texture, e.g if your texture is 3000×3000 setting this to 2048 will set the texture resolution to 2048×2048, if the original texture size is smaller than the selection e.g 2000×2000 then setting this to 2048 will not affect the texture resolution, keeping it at native file resolution.
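As a sketch, the Max Size behaviour described there amounts to the following (the exact rounding rule is my assumption, but it matches what Unity reported for the worm sheet):

```python
# Max Size only scales textures DOWN to the cap; smaller textures keep
# their native resolution. The rounding below is my guess.
def apply_max_size(width, height, max_size):
    longest = max(width, height)
    if longest <= max_size:
        return width, height  # native size kept
    if width >= height:
        return max_size, round(height * max_size / longest)
    return round(width * max_size / longest), max_size

print(apply_max_size(3000, 3000, 2048))  # (2048, 2048)
print(apply_max_size(2000, 2000, 2048))  # (2000, 2000) - unchanged
print(apply_max_size(2048, 1201, 1024))  # (1024, 600) - the worm sheet
```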

 

Basically, what I’d more or less worked out by now: you pick ‘a’ PPU scale (100, 70, whatever) and mess with either the raw files or the import settings, upscaling or downscaling them as required via resampling, as opposed to – or perhaps less aggressively than – editing the sprite’s scale settings. It seems to me to cause more fuzziness, but you can end up with a smaller end game, so that’s a plus.

These are also on that Unity tips page. Curiously – I find – they are flagged as ‘advanced’ topics, where I would consider (in web development terms) adding a picture to a page as basic as it gets.

 

Format
The texture compression format which the texture will be compressed with. You should try stick to a specific format across your entire project to ensure you don’t run into compatibility issues. On Android when a compression format isn’t natively supported it’ll switch to software decompression which is MUCH slower. Usually for Android you’ll want to stick with ‘ETC2’ texture compression formats, using ‘RGB Compressed ETC2 4 bits’ for opaque textures and ‘RGB + 1-bit Alpha Compressed ETC2 4 bits’ for transparent textures. However if the textures appear glitchy or transparency stops working then use ‘RGBA Compressed ETC2 8 bits’ which results in a slightly higher texture filesize but should fix any issues with the texture.

See, basic stuff. To add images I need to set (in my case) RGBA 32 bit for them all, and not trust Unity and its hit-and-miss automation. Or, simply, I can consider dropping big 4096 files down to 512 if there’s not too much loss. Whereas hopping between 50 and 500 for the ‘pixels per unit’ on the images and sprite sheets etc. would not end well.


Unity manual

There is a particularly long and informative guide in the Unity manual as well: Texture compression formats for platform-specific overrides. For instance, Windows, Linux, macOS, PS4, Xbox One and WebGL all support RGBA Compressed DXT5; iOS, Android, and tvOS want different formats. Then it goes into options and caveats.

Further up the breadcrumbs is Importing Textures, which notes that:
“In a 3D Project, Unity imports image and movie files in the Assets folder as Textures. In a 2D Project, Unity imports image and movie files in the Assets folder as Sprites.”

Adding:

Texture dimension sizes

Ideally, Texture dimension sizes should be powers of two on each side (that is, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048 pixels (px), and so on). The Textures do not have to be square; that is the width can be different from height.

It is possible to use NPOT (non-power of two) Texture sizes with Unity. However, NPOT Texture sizes generally take slightly more memory and might be slower for the GPU to sample, so it’s better for performance to use power of two sizes whenever you can.

If the platform or GPU does not support NPOT Texture sizes, Unity scales and pads the Texture up to the next power of two size. This process uses more memory and makes loading slower (especially on older mobile devices). In general, you should only use NPOT sizes for GUI purposes.

You can scale up NPOT Texture Assets at import time using the Non Power of 2 option in the Advanced section of the Texture Importer.
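The power-of-two rule quoted above has a classic one-line check (a general bit trick, nothing Unity-specific):

```python
# A power of two has exactly one bit set, so n & (n - 1) clears it to zero.
def is_pot(n):
    return n > 0 and (n & (n - 1)) == 0

print(is_pot(2048))  # True
print(is_pot(1625))  # False - hence the hedgehog sheet is NPOT
print([n for n in range(1, 300) if is_pot(n)])  # [1, 2, 4, 8, 16, 32, 64, 128, 256]
```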

So, basically, at least according to the manual, when a guide or course tells you to import “an” image, and glosses over all the potential problems of blindly, ignorantly following such bland instructions, well, someone is wrong.

Similarly, under ‘Texture Import Settings‘ there is advice on the advanced settings. For instance:

Non Power of 2 If the Texture has a non-power of two (NPOT) dimension size, this defines a scaling behavior at import time. See documentation on Importing Textures for more information on non-power of two sizes. This is set to None by default.

None
Texture dimension size stays the same.

To nearest
Scale the Texture to the nearest power-of-two dimension size at import time. For example, a 257×511 px Texture is scaled to 256×512 px.
Note that PVRTC formats require Textures to be square (that is width equal to height), so the final dimension size is upscaled to 512×512 px.

To larger
Scale the Texture to the power-of-two dimension size of the largest dimension size value at import time. For example, a 257×511 px Texture is scaled to 512×512 px.

To smaller
Scale the Texture to the power-of-two dimension size of the smallest dimension size value at import time. For example, a 257×511 px Texture is scaled to 256×256 px.

They consider this ‘advanced’. To my thinking, if NPOT images can cause problems, the default should surely be to import and convert to POT. However, I can find no way to alter the NPOT scaling behaviour, so I have to wonder if the manual isn’t keeping up with interface and game-engine changes!

Guess I have to add the entire Unity manual to my ever-growing reading list.


Further reading

Though less related to error messages there is a great description of PPU setting on StackExchange by DMGregory, under: Unity – Pixels, Units, Sprites and Pixels per unit – Setting a relationship between sprites and screen size

Generally you want to keep your Pixels per Unit setting constant for a set of assets you use together. It’s the conversion factor between the texture resolution and the coordinate space of your gameplay logic.

For example, if you’re making a tile-based game with 32×32 tiles, a PPU of 32 would make good sense. Then it’s easy to write say a dash move that travels 3 tiles: it moves 3 units in world space. If you later go and make an “HD” version of your game with 64×64 tiles, you can replace the assets with PPU 64 versions, and none of your gameplay logic has to change, because the logical coordinate system is still the same: 1 unit = 1 tile.

To adapt to different display sizes, the main thing you want to change is your camera.

(It goes on for a bit and makes interesting reading)
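DMGregory’s tile arithmetic, written out (plain numbers, not Unity calls):

```python
# PPU is just the conversion factor between texture pixels and world units.
def pixels_to_units(pixels, ppu):
    return pixels / ppu

# 32 px tiles at PPU 32: one tile is exactly one world unit.
dash_units = 3 * pixels_to_units(32, 32)
print(dash_units)  # 3.0 - a three-tile dash moves three units

# The "HD" remaster with 64 px tiles at PPU 64: same logic, same distances.
print(3 * pixels_to_units(64, 64))  # 3.0
```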

Another important point – also skipped in every tutorial I’ve watched – is the relationship between PPU and game physics. Also on StackExchange, someone asked, “What is the pixels to units property in Unity sprites used for?” (He was setting it to 1.)

100 pixels per unit would mean a sprite that’s 100 pixels would equal 1 unit in the scene. It’s simply a scale to say how many pixels equal one unit. This can affect things like physics. A lower pixels to units setting would require more force to move one unit than a higher pixels to units setting.

Yes, there may be times where you’ll want to manipulate the pixels per unit. If you have a tile sheet of 16×16 tiles, you may want to consider setting the pixels per unit to 16 so that you can easily snap tiles together in a scene, for example.

~ Cooper


(eg. setting it (Pixels per Unit) to 1 with a sprite that is 200 pixels tall will make the sprite 200 meter tall in the physics engine).

~ bummzack


Answering the question in your title, units in Unity are arbitrary. It’s just a number; the most common interpretation is that 1 unit means 1 meter, but that’s just the most commonly agreed upon value. You could decide 1 unit = 1 foot, or 1 unit = 1 inch, etc.

As for why you would use the default setting of 100 pixels, it’s because the physics system doesn’t like values that are too large. If you set 1 unit = 1 pixel, then the physics system would be moving objects hundreds of units per frame, and the physics calculations tend to break down in that situation. By setting 1 unit = 100 pixels, then physics will be moving objects more like a couple units per frame.

~ jhocking
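bummzack’s example above boils down to one division (my own sketch of it):

```python
# World height of a sprite is its pixel height divided by its PPU.
def world_height_m(sprite_px, ppu):
    return sprite_px / ppu

print(world_height_m(200, 1))    # 200.0 - a 200 px sprite at PPU 1 is 200 m tall
print(world_height_m(200, 100))  # 2.0   - at the default PPU 100, a sane 2 m
```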

 

Comment:

Last one is quite a thing to get your head around, eh.

When they say “it depends” – but neglect to say what it depends on – that’s not right, I feel. Hence going looking for my own answers.

PPU affects physics, which in turn affects gameplay, and power and memory requirements. I’d say it was important to understand such concepts BEFORE you start developing, not months down the line, when everything has gone pear-shaped. It’s easier to design correctly at the start than to fudge stuff later. The important thing to take away here is planning, and consistency.

I was searching for advice on resolution (pixel density) and found the following. It didn’t answer my question, but it makes a few good points:
Best Image Formats, Sizes, and Resolutions for Unity Sprites?

Still looking for the ppi/dpi answer, but a good starting place, which mostly explains it in terms of printing, is on Shutterstock: PPI vs. DPI: Demystifying the World of Online and Print Resolution.

 


Resolutions

Does (raster) pixel density matter? What about the rest?

8-bit RGB?
16-bit RGBA?

Pixels per inch? Pixels per centimetre?
300 ppi = 118.11 pixel/cm – the standard for print
150 ppi = 59.055 pixel/cm – arguably the standard for web pages
72 ppi = 28.346 pixels/cm – often given as the standard for web pages.

But games? I decided to run a few tests myself to see, and the first thing I’ve noticed, at least according to Paintshop Pro (2020), is that it makes absolutely zero difference to memory requirements, so you may as well go with 300 ppi. (Image file size is a different matter.)

Make up two png images in Paintshop Pro, one a 16-bit RGB created at 1,000 ppi, the other 8-bit, created at 72 ppi. Take a look:

The file sizes are 9.65KB and 10KB, so – at least for a simple image like this – the difference is negligible. Seems to me you might as well make them at 300 pixels per inch.

There is the argument that HD is always 1920×1080 and the pixels just sort of crunch up or spread out depending on your screen size. Well, not exactly, because screen size and screen resolution are not the same, because of pixel density.

Here, let Eizo explain it better: Confused about HiDPI and Retina display? Understanding pixel density in the age of 4K


And on Youtube

I’ll probably come back to this tomorrow, if I remember ( :/ ), but I’m starting with a Google search on Youtube for Pixels Per Unit + Unity for some other explanations.

I’ve not watched these myself, yet, but the top results include:

This first one, with 78k views, is very good. I have subscribed to his channel on youtube: Chris’ tutorials

 

Next, the music is annoying as heck (I think), but it’s only a minute long and makes an interesting point.
It currently has 2.8k views:

 

This next one relates to getting your game to fit the boundaries (shape) of your phone, tablet, etc., rather than PPU, but it’s still really useful. His channel (Press Start) is good for new learners.

24k views:

 

Packt publish some decent textbooks; this one clearly relates to one of their Unity books or courses.

2.8k views:

 


 

Revisited

I get like this sometimes: can’t move on until I grasp a fundamental, or even obscure, puzzle. Been researching this for a few days now and have firmly come to the conclusion that this issue is as old as Unity and they still have not fixed it. Why is that? I found a thread on the Unity forums that was almost a decade old, and the last post, from 2020, incredulously asked “Still?”.

NASA put a man on the moon in 1969.

Let’s put this into perspective:

The first PC, the IBM XT, only came out in 1982.

The first smart phone, also from IBM, only came out in 1992, (the Simon Personal Communicator).

The smartphone in your pocket (if only 4Gb) has a million times more memory and is 100,000 times faster than the guidance system that put man on the moon.

Yet they can’t sort a scale drop down option for a multi-million dollar game engine?

It’s not that it’s too hard for me to code around (my course covered calibrating speed and jump force for the PlayerMovementController script on the first day); it’s the inherent inflexibility of the system.

Honestly, I looked at scores of forums and videos and manuals and they all said the same things:

In Unity, 1 unit = 1m, but this is arbitrary – but don’t change it ‘cos you’ll break things.

Only it’s not arbitrary, because: maths. I haven’t got a complete list of the things implicitly and explicitly tied to this unit, which underpins the ‘gravity = 1’ setting, which is where it gets complicated.

For instance:

Setting the Unity editor scale

By default Unity uses a scale where 1 grid unit equals 1 meter (many people say that each grid unit is “arbitrary,” but many parts of the Unity physics engine assume each unit is a meter). There is no single variable that can be modified to change this relationship.

Many of the physics-related components have values for gravity, force, etc, that if all modified consistently can simulate having a grid of different dimensions. Personally I find it’s best to adjust the scale of your objects to fit Unity’s standard, rather than trying to force Unity to work at the scale of your imported models.
~ Sikori17

 

This makes no sense to me, as it would imply the engine’s entire grid system was developed for 1m-tall gnomes. Want to make a game with insects? Better make the insects three foot tall, chum. Want to make a spaceship game? Make the spaceships three foot high too!

This is answered reasonably well in a thread on the Unity forums (Meter or centimeters). The best answers (from 2015) included those from a guy called Runevision:

“There is no ‘default unit’ in Unity. Just choose what you want units to represent and stick to that. If you have gravity in your game, you need to set it up to match the unit representation you have decided on (if you want it to be realistic).

The gravity per default is set to be like earth gravity as measured in meters per second, but if you use centimeters for your units, you can just multiply the gravity by 100.

“Whatever scaling issues happen when you import things into Unity is a by-product of the import process only. It does not mean that Unity itself has a specific default unit.”

“Those settings you mention are not only dependent on whether you use meters or something else, but also on the scale of your subject matter in the first place.

For example, you probably want to adjust those settings differently depending on whether you are creating a game about ants or about huge space ships, even though you might use meters as units in both cases.

And elsewhere:

Meters matter for physics/rigidbodies. Objects in Unity fall at 9.8 units/second-squared. If you aren’t using 1 unit == about 1 meter, falling will look weird (but you can adjust physics settings to make any scale work).

The only thing that it does affect is force and velocity calculations by the physics engine. But since you control this all you have to do is take it into account. The only hardcoded place where you need to adjust the force is under gravity settings, because all the other physics settings are done through code by you yourself.

And elsewhere:

Not just physics simulation. But lighting also works on real-world scale. Depth of field.

 

From 2011: Unity unit scale

The default scale is in meters, which affects terrain, audio doppler effects, gravity, etc.

In general though, the Unity scale is whatever you want it to be. In the project I’m working on, 1 unit = 1 km. You can adjust scale-related settings by editing your project settings.

 
So, basically,
if you mess with the default 1 unit is 1m you are affecting, amongst other factors, depth of field, light, sound, motion, force, velocity, and gravity.
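Runevision’s centimetre example above, as numbers (a sketch of the arithmetic; in the editor you would change the gravity value under the physics project settings):

```python
# If 1 unit = 1 m is abandoned, gravity must scale by units-per-metre.
def gravity_for_scale(units_per_metre, g_mps2=9.81):
    return g_mps2 * units_per_metre

print(gravity_for_scale(1))      # 9.81 units/s^2   - default, 1 unit = 1 m
print(gravity_for_scale(100))    # ~981 units/s^2   - 1 unit = 1 cm
print(gravity_for_scale(0.001))  # ~0.01 units/s^2  - 1 unit = 1 km
```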

And all that is without the added headache of export settings (e.g. a 3D model from Maya or Blender) versus the Unity import settings, which apparently can mess up by a factor of 100.
e.g. (3D) Art Asset best practice guide:

“Be aware that different systems use different units – for example, the system unit default for Max is inches while in Maya it is centimeters.”

“Unity has different scaling for importing FBX vs. importing native 3D modeling software files. Make sure you check the FBX import scale settings. For example, if you want to achieve Scale Factor = 1 and Object Transform Scale = 1, use one of the proprietary file formats and set the Convert Units option. If in doubt, export a meter cube with your Scene to match in Unity. Animation frame rate defaults can be different in different packages. Because of this, it is a good idea to set this consistently across your pipeline (for example, at 30fps).”

 

Have to say, it’s no wonder it’s confusing to new users like myself. Surely Unity should be able to sort out something as simple as scale setup options (on a new platform). The ‘default’ is 1 unit is 1m, fine, but if you are developing a different game, you should be able to say:

Create [2D][3D] platform with world physics unit scale set to [drop down]

macro (1 unit = 1cm)
small (1 unit = 10cm)
standard (1 unit = 1m)
large (1 unit = 10m)
very large (1 unit = 100m)
(etc)

It might not solve the physics for outer-space-based games, but you have everything else from insects to battleships covered. Sure, you might well have to tweak stuff, but out of the box you have sorted many of the headaches. Which is easier: the engine having a set of defaults (instead of only one), or practically every game created with the engine having to fudge around the limitations of the default?
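To be clear, no such drop-down exists – the sketch below is hypothetical – but it shows how little there is to it: each preset is just a units-per-metre factor, from which gravity (and friends) would follow.

```python
# Hypothetical scale presets (this option does NOT exist in Unity):
# each is just a units-per-metre factor.
PRESETS = {
    "macro":      100.0,  # 1 unit = 1 cm
    "small":      10.0,   # 1 unit = 10 cm
    "standard":   1.0,    # 1 unit = 1 m (Unity's default)
    "large":      0.1,    # 1 unit = 10 m
    "very large": 0.01,   # 1 unit = 100 m
}

def preset_gravity(name, g_mps2=9.81):
    """Gravity in units/s^2 for the chosen preset."""
    return g_mps2 * PRESETS[name]

print(preset_gravity("standard"))  # 9.81
print(preset_gravity("macro"))     # ~981
```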

Get your act together, Unity Technologies, this problem should be simple to sort, it’s not rocket science!

