Unity3D 2013. 8. 20. 03:07

Sharing a good article for understanding the concept of Layers in Unity.

http://www.devkorea.co.kr/reference/Documentation/Components/Layers.html

posted by choiwonwoo
Unity3D 2013. 8. 20. 02:58

Think back to the left-handed coordinate system you learned in high-school physics.

It is easiest to understand if you take the Z axis as the distance between yourself and the plane you are looking at.
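This convention can be sketched in Unity terms. A minimal example (the class name is hypothetical; the axis vectors are Unity's documented left-handed conventions):

```csharp
using UnityEngine;

// Unity uses a left-handed coordinate system:
// +X points right, +Y points up, and +Z points "into the screen",
// away from the viewer, which is why Z reads naturally as the
// distance between you and the plane you are looking at.
public class AxesDemo : MonoBehaviour
{
    void Start()
    {
        Debug.Log(Vector3.right);   // world right:   (1, 0, 0)
        Debug.Log(Vector3.up);      // world up:      (0, 1, 0)
        Debug.Log(Vector3.forward); // world forward: (0, 0, 1), larger Z = farther away
    }
}
```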

 

 

posted by choiwonwoo
Unity3D 2013. 8. 20. 02:50

1) Right-click the Unity launcher icon.

2) Properties ==> Shortcut ==> append "-projectPath" to the Target field as below, then apply.

3) You can now open two or more editor windows at the same time.
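On the command line this amounts to launching the editor with Unity's -projectPath argument; for example (the install and project paths below are hypothetical):

```shell
# Each invocation opens a separate Unity editor instance on its own project.
"C:\Program Files\Unity\Editor\Unity.exe" -projectPath "C:\Projects\GameA"
"C:\Program Files\Unity\Editor\Unity.exe" -projectPath "C:\Projects\GameB"
```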

 

 

posted by choiwonwoo
Unity3D 2013. 7. 13. 19:32

After importing the 2D Toolkit package into Unity, creating an atlas, and then trying to create a sprite, the error below occurs.

After looking around for a solution, I went back and reinstalled from scratch.

First, installed the latest version of Unity.

Then checked the version of the 2D Toolkit package I currently own.

Tried creating the sprite again, but the same error occurred.

Is it a version problem?

http://unikronsoftware.com/2dtoolkit/forum/index.php?topic=1310.0

posted by choiwonwoo
Unity3D 2013. 6. 29. 23:34

Choosing GUI framework for your Unity3D Project: EZGUI vs NGUI, Part I

The following post might be helpful for those standing at the crossroads of which GUI framework to go with in a Unity project. Among the many more or less advanced third-party options, there are just two that can be seriously considered: EZGUI and NGUI. I'll clarify the major differences I've noticed in my experience working with both of them.

EZGUI and NGUI both provide great features for making in-game UI easily and efficiently. However, they use different implementation approaches. EZGUI comprises lots of instruments and controls with a huge number of settings, so you can tweak almost every parameter of any UI element. In contrast, NGUI provides lots of small components, and I like its minimalism and its short, clean, understandable code. Both EZGUI and NGUI target "one draw call for the UI", and both have come very close. Of course, one draw call for the UI isn't the main concern, but why not have such a great addition to your well-structured and optimized code?

The following is a comparison of both frameworks on the features I find important:

Pixel perfect

  • EZGUI: Control scale is adjusted automatically once at scene start. Note: it has issues losing the render-camera reference when instancing UI as a prefab.
  • NGUI: Control scale is adjusted automatically every time the resolution changes. It can also apply a half-pixel offset.

WYSIWYG

  • EZGUI: Generates ordinary GameObjects with geometry, so everything is visualized by Unity itself.
  • NGUI: Uses several ways of visualization, "geometry" and "gizmos", since all UI elements are part of a single mesh.

Access from code

  • EZGUI: Methods from a specific controller script are linked to control events. There can be issues with instancing objects and losing references. An alternative, using delegates, can be more convenient in some cases.
  • NGUI: Similar to EZGUI, but here you have helper components, such as UIButtonMessage, which send a specified message to a GameObject (or to itself, if the target is null) on a selected type of interaction. You can also access the last used control through static variables such as UICamera.lastHit or UICheckbox.current.

Ease of control creation

  • EZGUI: An empty GameObject is created and the necessary components are attached to it.
  • NGUI: Provides handy wizards for creating all kinds of controls.

Workflow speed

  • EZGUI: Smooth, but slow. Searching for scripts that aren't included in the common menus, adjusting tons of settings, fixing broken atlases and lost camera references (most likely I'm not the only one who has experienced these issues).
  • NGUI: Supersonic! Just a little slowdown when creating atlases for sprites at the beginning, and then pure enjoyment of the process!

Drag and drop

  • Both frameworks have this feature. Just a little note: any object in NGUI with a collider can be made draggable.

Atlas creation

  • EZGUI: The atlas has to be recreated every time you want to add or change an image in it. EZGUI can scan all objects, even in the project folder, find all of them using the same material and then regenerate the atlas. This process takes a lot of time, and you should be very careful not to break something.
  • NGUI: The atlas can be managed in two ways: either using the fast and handy Atlas Maker to add, delete or modify images in the atlas, or managing the sprites of an already created atlas via the atlas prefab's inspector.

Panel switching

  • EZGUI: Making menus with switching panels has never been easier, thanks to EZGUI's powerful abilities.
  • NGUI: Panels can be switched easily as well, but some additional scripting is required. Panels can be switched through animations and helper components, but I haven't found any direct way to enable one panel and disable another.

Additional stuff

  • EZGUI: Since EZGUI is based on Sprite Manager, its classes (e.g. Sprite) can be quite useful in 2D games for environment creation, backgrounds, etc.
  • NGUI: Sprites can be used as well, though with some restrictions: any control must have a parent such as a panel or a UIRoot.

And here is a comparison by the controls implemented in the frameworks:

  • Label
  • Sprite
  • Sliced sprite
  • Tiled sprite
  • Filled sprite
  • Simple button
  • Image button
  • Toggle button
  • Radio button
  • Checkbox
  • Progress bar
  • Slider
  • Input
  • DropDown list
  • Scrollable lists

I was really excited by NGUI's Sliced Sprite. When the objective is to create a resizable window that stays pixel-perfect at different sizes, has a frame around it and is filled with a pattern, that's exactly where a Sliced Sprite handles everything: just specify the areas of the texture to be used as the frame, the corners and the filling.

A Tiled Sprite can be implemented manually with EZGUI, but it won't be as easy. A Tiled Sprite always stays pixel-perfect and tiles its texture when scaled, which is very handy for creating backgrounds, for example.

NGUI extends Unity with a bunch of useful hotkeys that are really nice to have, e.g. Ctrl+Shift+N to add a new empty GameObject as a child of the selected one, a hotkey for toggling a GameObject's active state, and handy buttons for resetting a transform's position, rotation and scale.

Both frameworks come with detailed documentation describing every script, component, property and method. Additionally, NGUI ships with a lot of step-by-step tutorials, videos and written lessons for beginners.

EZGUI is based on Sprite Manager 2 (developed by Above and Beyond Software). SM2 provides features for creating 3D mesh sprites, customizing and changing their parameters at runtime, as well as creating texture atlases, so that all sprites in a scene are part of a single batch and are drawn in one draw call.

And here's my subjective comparison of the two frameworks, by:

  • Usability
  • Functionality
  • Flexibility
  • Reliability
  • Extensibility

…which means I like NGUI much more. However, I haven't yet described another very important difference between NGUI and EZGUI: the way you actually work with them. I'll demonstrate it in my next post, stay tuned.

Choosing GUI framework for your Unity3D Project: EZGUI vs NGUI, Part II

As a follow-up to «Choosing GUI framework for your Unity3D project: EZGUI vs NGUI, Part I», here is another post comparing the two frameworks by workflow. The workflows are quite different; once you see them in this small example of how to create a simple button, you will be able to decide more confidently which framework to choose for your project.

Creating simple button with EZGUI

First of all, let's create an orthographic camera on the UI layer that will render UI stuff only: GameObject -> Create Other -> Camera, and set it up as shown below:

As you can see, I've attached a UIManager component.

Create an empty GameObject called "Button" and attach the Button component to it ("Component/EZ GUI/Controls/Button"). The script needs to be set up as well: assign the UI camera to the Render Camera slot and check the Pixel Perfect checkbox (Auto Resize will be checked automatically). And here is one of the most annoying disadvantages of EZGUI: when instantiating a UI element as a prefab, there's no guarantee the Render Camera will be assigned correctly.

We will also need images for all of the button states: Normal, Over, Active and Disabled.

Next, you need to create an atlas for the button states. Just press this little gear and select "Build Atlas".

And here is the texture:

Note that an atlas can contain not only the states for a single button, but all the elements of your entire UI. However, I would highly recommend sorting them by screen: start-screen elements in one atlas, settings-screen elements in another. I like this approach because you won't fill memory with redundant atlases.

Now let's make it work. It's quite simple with EZGUI: just specify the GameObject that has a script with the method to be called when the button is pressed.

Here’s the script that we have to attach to a “controller” GameObject.
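The script itself appeared in the original post as a screenshot. A minimal sketch of such a controller might look like this (the class and method names are hypothetical; only the pattern of a public no-argument method invoked by the button comes from EZGUI's workflow):

```csharp
using UnityEngine;

// Hypothetical controller attached to a "controller" GameObject.
// EZGUI's Button invokes the method named in its "Method To Invoke"
// field on this GameObject, so the method takes no arguments
// and returns void.
public class ButtonController : MonoBehaviour
{
    // Name this exactly as entered in the button's "Method To Invoke" field.
    public void OnButtonPressed()
    {
        Debug.Log("Button was pressed");
    }
}
```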

Then configure the button: set the controller GameObject in the "Script With Method To Invoke" slot, enter the method name in the "Method To Invoke" field and select a "When To Invoke" option.

Now if you click "Play", you get a fully functional button. Here we go…

And here is just a little tip. Attach the EZ Screen Placement component to your button and it will be placed on the screen just the way you need: for example, you can stick the button to the top-right corner of the screen, or to some other object, and it will maintain a constant pixel offset in screen coordinates. Again, be very careful with Render Camera; it is the first thing to check if your elements end up in the wrong position or at the wrong scale. Also, this component stops working in editor mode when you have it on a prefab instance and click the "Apply" button; just click "Play" and then "Stop" to fix this.

Creating simple button with NGUI

Basically, EZGUI and NGUI work on a similar principle: generating meshes, generating UV coordinates and applying textures automatically. However, NGUI's "button" doesn't mean quite what we are used to. A "button" is anything in the scene that has a collider attached and is visible to a camera with the UICamera component, so this pseudo-button object can receive the events generated by UICamera. Here's the list of all the events.

When you see a UIButton* component (e.g. UIButtonColor), it does not mean it has to be attached to something button-like (you won't even find a plain UIButton component in the list of scripts). Instead, it can be attached to ANYTHING that has a collider, for example a UICheckbox, and your own script attached to that checkbox will receive all the events from the list. UICheckbox is just an example; even a sphere with a collider can receive events! (This can be seen in one of the examples provided by Tasharen Entertainment.)
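For illustration, a script like the following could sit on any collider-bearing object (the class name is hypothetical; the method names follow NGUI's convention of notifying the touched collider via messages such as OnClick, OnHover and OnPress):

```csharp
using UnityEngine;

// Hypothetical listener: NGUI's UICamera sends these messages to whatever
// collider the user interacts with, so any MonoBehaviour on that object
// can implement them.
public class ClickLogger : MonoBehaviour
{
    void OnHover(bool isOver)  { Debug.Log("Hover: " + isOver); }
    void OnPress(bool pressed) { Debug.Log("Pressed: " + pressed); }
    void OnClick()             { Debug.Log("Clicked " + name); }
}
```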

Now let's do the same job using NGUI, starting with some preparation. You need to create an atlas for the UI. In contrast to EZGUI, an NGUI atlas is not just a texture: it's a prefab that contains all the information about the sprites. It can be easily created using the Atlas Maker.

By clicking the "Create" button, the wizard creates a new material and a prefab that stores all the required information about the atlas and its sprites.

A great advantage of the NGUI atlas is the possibility of adding and deleting images without rebuilding it completely. As you can see below, all you need to do is select the textures you want to add or update. In this case, four new textures will be added to the atlas TestAtlas.

After adding images to the atlas, you can add, delete and modify them at any time, which I think is cool. The only thing I miss here is a way to force creation of square atlases, which is required for PVRTC compression on mobile devices.

The atlas prefab can be edited from its inspector, where you can add, delete and set up sprites, with a real-time preview of what you are doing.

So, once you're finished with the sprites, you can go ahead and create your UI.

Create a base for the future UI: NGUI -> Create New UI. The UI Tool opens up.
Just set the layer the UI will be rendered on and the camera you're going to use.

You will see the following hierarchy:

  • "UI Root" is responsible for scaling the entire UI so that it maintains its screen size when the resolution changes.
  • "Camera" renders the UI geometry (this geometry is handled by the UIDrawCall script). It also sends events to objects.
  • "Anchor" is used to place widgets in the correct positions and to add a half-pixel offset to the whole UI (you can read about half-pixel and half-texel offsets here). It can also be used to stretch sprites to fill the entire screen at different resolutions, e.g. for a tiling background.
  • "Panel" groups UI objects (widgets) together and shows some debug information about the widgets it contains. It also has a clipping ability, to be used in scrollable lists.

Next step: creating widgets. "NGUI -> Create a Widget" opens the Widget Tool. To create a widget of any type, specify the atlas the required sprites will be taken from, choose the widget template (in our case "Image Button", which works quite similarly to EZGUI's button), set the images for the three button states (normal, hover and pressed; I'm not sure why, but there's no disabled state), and press the "Add To" button.

That's it, you already have your button! Just hit Play and check it out!

And the last step: hooking the button up to scripts. Attach the "UI Button Message" component to the button. It allows calling a method in a script attached either to the button's GameObject or to any custom GameObject specified in the "Target" slot. If no target GameObject is set, the button's own GameObject is used as the target automatically.

You can see the UIButtonMessage component in the screenshot, with Function Name specified: the name of the method that will be called on a specific event (check the "Trigger" variable).
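A minimal receiver for that message might look like the following sketch (the class and method names are hypothetical; only the convention that UIButtonMessage calls the method named in Function Name on the target GameObject comes from NGUI):

```csharp
using UnityEngine;

// Hypothetical target script: UIButtonMessage sends the message named in
// its "Function Name" field (here "OnStartClicked") to the target
// GameObject when the selected trigger (e.g. OnClick) fires.
public class StartMenu : MonoBehaviour
{
    void OnStartClicked()
    {
        Debug.Log("Start button clicked, loading game scene...");
    }
}
```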

Then we press our button, and here it is:

Finally: EZGUI or NGUI?

Personally, I stick with NGUI, which attracts me with its minimalism, its short, clean and easy-to-understand code, and its stability and optimization. I really hope these posts were helpful. Please leave comments and ratings and suggest what to add or improve; I'll update this post later.

[source: http://blog.heyworks.com/choosing-gui-framework-for-your-unity3d-project-ezgui-vs-ngui-part-i/]


posted by choiwonwoo