Unity ships with a UI solution I often refer to as “OnGUI,” to make it clear I’m talking about Unity’s built-in GUI system in particular and not other GUI solutions built for Unity. OnGUI is the special method name that Unity classes such as MonoBehaviour use to hook into the GUI system. OnGUI is great for quickly prototyping menus, for building menus not meant to ship (such as cheat menus), and for building editor utilities. The big mark against using OnGUI for shipping menus is that it is a huge performance hog. On mobile devices it is especially expensive; even an empty method named OnGUI can drop your framerate by half. The other issue with OnGUI is that customizing it with your own art and fonts has a bit of a learning curve. If you stick with the default graphics for buttons and other UI elements, though, it can be extremely quick to get a menu up and running.
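To show why OnGUI is so quick for prototypes, here is a minimal sketch of a cheat menu; the class name, layout values, and button actions are all placeholders:

```csharp
using UnityEngine;

// A minimal OnGUI prototype menu. Fine for cheat menus and editor
// tooling, but OnGUI carries a per-frame cost you don't want to ship.
public class CheatMenu : MonoBehaviour
{
    void OnGUI()
    {
        // GUILayout positions elements for you, which is a big part
        // of why OnGUI prototyping is so fast.
        GUILayout.BeginArea(new Rect(10, 10, 200, 150));
        if (GUILayout.Button("Give 100 Gold"))
        {
            Debug.Log("Cheat: gold added");
        }
        if (GUILayout.Button("Unlock All Levels"))
        {
            Debug.Log("Cheat: levels unlocked");
        }
        GUILayout.EndArea();
    }
}
```

Note there is no positioning math and no custom art setup at all, which is exactly the trade-off described above: fast to stand up with default graphics, slow to render and awkward to reskin.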
So if OnGUI is not recommended for shipping games, what are your choices? This is where Unity’s extensibility and the built-in Asset Store are extremely helpful. When you hit a problem with Unity, you can write your own custom libraries and editor scripts to fix it; if you don’t want to, or can’t, you can look for a solution on the Asset Store. The two big UI solutions I have experience with from the Asset Store are nGUI http://www.tasharen.com/?page_id=140 and EZGUI http://www.anbsoft.com/middleware/ezgui/. The most popular UI solution for Unity right now is nGUI; in fact it was so successful that Unity hired its developer. I have not spent a huge amount of time with nGUI, so I do not have many tips on its inner workings and usability. EZGUI, we found, did not scale well to larger projects and had some manual steps that would have worked much better automated, especially around atlasing. This post is mostly going to cover atlasing for UI.
The first step when evaluating a UI solution is to look at how texture atlasing works with it. If you are not familiar with the concept: you build a texture atlas by creating a new texture file with many contributing textures packed into it, and you store the UV coordinates of each packed texture so you can display it out of the atlas. There are two big performance benefits. First, it reduces draw calls by allowing dynamic and static batching where appropriate, and at the very least it minimizes state changes between rendering UI elements. Second, it reduces the memory cost of UI: Unity’s handling of non-power-of-two textures is very memory expensive, because Unity creates a second copy of the texture in memory scaled up to a power of two. With a lot of UI at non-power-of-two dimensions, you might find yourself losing a lot of memory.
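The stored-UV idea can be sketched like this, assuming a hypothetical table of UV rects produced by your atlas packer (the entries below are placeholder values, not real coordinates):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: drawing UI elements out of a single atlas texture.
// The uvRects table would be generated by an atlas packer; the
// entries here are illustrative placeholders.
public class AtlasedUI : MonoBehaviour
{
    public Texture2D atlas; // one packed texture shared by the whole menu

    // Normalized (0..1) UV rects locating each source texture in the atlas.
    Dictionary<string, Rect> uvRects = new Dictionary<string, Rect>
    {
        { "button_idle",  new Rect(0.00f, 0.0f, 0.25f, 0.25f) },
        { "button_hover", new Rect(0.25f, 0.0f, 0.25f, 0.25f) },
    };

    void OnGUI()
    {
        // Every element samples the same texture, so the renderer can
        // batch them instead of changing state per element.
        GUI.DrawTextureWithTexCoords(
            new Rect(10, 10, 128, 64), atlas, uvRects["button_idle"]);
    }
}
```

The key point is that every UI element references the same texture object and differs only in its UV rect, which is what makes batching possible.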
So you know why you want atlasing in your UI solution, but what should you look for in how it is handled? Atlas creation and management needs to be easy. You don’t want to spend a long time training content creators on the steps to mark textures for inclusion in an atlas and to generate that atlas. You also want iteration on atlases and UI to be simple. My recommended approach is atlasing as a build step for mobile platforms, with an option to build and test in the editor. People building UI elements such as buttons and sprites want to iterate quickly and see their work, and will often do this in the editor. They will also want to go back and change things sometimes, and changing only the source art is easier and makes much more sense than having to remember to change both the source art and the atlas art. The machine you run the Unity editor on will generally run the game much faster than your target hardware, so the performance hit of un-atlased UI does not matter much for iterative development. I like a build-time solution because the time spent generating these textures on a PC making Unity builds is generally not going to add much to your build time, and it lets you avoid dependency and updating issues. A first-pass solution can simply regenerate the atlases on every build; if build times do become a problem, you can implement dependency checking later so atlases are only rebuilt when their sources change.
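A first-pass build step along these lines might look like the sketch below. It uses Unity’s real Texture2D.PackTextures API, but AtlasBuilder, the menu path, the output path, and FindSourceTextures are all hypothetical names, and atlas/UV serialization is left out:

```csharp
using System.IO;
using UnityEngine;
using UnityEditor;

// Sketch of a build-step atlas generator. FindSourceTextures() is a
// placeholder for however your project gathers textures marked for
// atlasing; wiring this into your build script is left to the reader.
public static class AtlasBuilder
{
    [MenuItem("Tools/Build UI Atlases")]
    public static void BuildAtlases()
    {
        Texture2D[] sources = FindSourceTextures(); // assumption: your own lookup
        var atlas = new Texture2D(2048, 2048);

        // PackTextures returns the normalized UV rect of each source
        // texture inside the atlas, in the same order as the input.
        Rect[] uvRects = atlas.PackTextures(sources, 2, 2048);

        File.WriteAllBytes("Assets/UI/atlas.png", atlas.EncodeToPNG());
        // A real version would also persist uvRects next to the atlas
        // so UI elements can be remapped to it at load time.
        AssetDatabase.Refresh();
    }

    static Texture2D[] FindSourceTextures()
    {
        // Placeholder: scan the project for textures tagged for atlasing.
        return new Texture2D[0];
    }
}
```

Because it regenerates everything unconditionally, this matches the "first pass" approach described above; dependency checking can be layered on once build times justify it.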
If you are rolling your own UI solution, identifying what needs to be atlased and updating UI elements with the new atlas texture and UV coordinates is a complicated task. One challenge I encountered when helping build solutions for this was matching how teams decided to build UI: some people preferred to construct menus and UI elements from prefabs, while other teams would rather build their UI into scenes loaded additively. Instead of trying to automatically detect what goes in which atlas, you may want to build a hinting system into your UI and give content creators a way to identify which atlas each UI element should go into.
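A hinting system can be as small as a component that content creators drop onto a UI element; the build step then groups elements by the atlas they name. This is a sketch under that assumption, and AtlasHint and its field are illustrative names, not from any shipping library:

```csharp
using UnityEngine;

// Sketch of an atlas hint: rather than auto-detecting atlas membership,
// a content creator attaches this to a UI element and names the atlas
// it should pack into. The build step collects these and groups
// textures by atlasName before packing.
public class AtlasHint : MonoBehaviour
{
    [Tooltip("Which generated atlas this element's texture packs into")]
    public string atlasName = "frontend";
}
```

This works the same whether the element lives in a prefab or in an additively loaded scene, which sidesteps the team-workflow differences mentioned above.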