Friday, November 7, 2008

BUI - Basic UI framework

Last weekend I began to develop my own user interface abstraction using Blender's Python API. The API offers basic access to Blender's drawing (Blender widgets) and event system. Based on this I built my own little system.

To drive development forward I used my "filter layers" concept. I find Blender's layer system a bit restrictive. Filter layers let the user define which objects belong to a given layer by defining a filter. In other words you can have all lamps on one layer, all cameras on another and so on.

My approach is quite simple. I abstracted the structure of the user interface using YAML. It should be possible to change this to support other formats, such as XML, without too much effort. Here's a snippet of what a user interface definition looks like at the moment:


ui_structure = '''
VerticalContainer:
  width: 400
  children:
    - HorizontalContainer:
        children:
          - Label:
              name: Filter layers v0.9
          - PushButton:
              name: X
              tooltip: Quit script
              event_handler: quit_script
              width: 20
    - EmptyContainer:
        height: 10
    - VerticalContainer:
        name: layers
        children:
          - UIStructure:
              name: layer_structure
    - HorizontalContainer:
        children:
          - PushButton:
              name: Add layer
              tooltip: Add new layer
              width: 100
'''


The treelike structure consists of containers and elements. Containers control the order in which the elements inside them are rendered: a HorizontalContainer lays its elements out horizontally, while a VerticalContainer, as you might expect, lays them out vertically. EmptyContainers can be used to add empty space to the layout. I originally used a specific padding property but decided to remove it, as EmptyContainers proved simpler and nicer to handle.
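The layout behaviour can be sketched roughly like this. The class and method names below are illustrative only, not BUI's actual API, and the real implementation draws through Blender's API instead of returning positions:

```python
class Element:
    """Base element; a real one would wrap a Blender widget."""
    def __init__(self, name="", width=100, height=20):
        self.name = name
        self.width = width
        self.height = height

    def render(self, x, y):
        # A real implementation would call Blender's drawing functions;
        # here we just record the computed position for illustration.
        return [(self.name, x, y)]

class HorizontalContainer(Element):
    def __init__(self, children=None, **kwargs):
        super().__init__(**kwargs)
        self.children = children or []

    def render(self, x, y):
        placed = []
        for child in self.children:
            placed += child.render(x, y)
            x += child.width  # advance to the right
        return placed

class VerticalContainer(HorizontalContainer):
    def render(self, x, y):
        placed = []
        for child in self.children:
            placed += child.render(x, y)
            y -= child.height  # advance downwards
        return placed

class EmptyContainer(Element):
    """Renders nothing; only takes up space in the layout."""
    def render(self, x, y):
        return []
```

For example, a HorizontalContainer holding a Label and a 20-pixel PushButton places the second element right after the first, while an EmptyContainer in a VerticalContainer simply pushes the following elements down.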

If you look closely, you can find a special element known as UIStructure. It represents a link to another tree. I implemented it this way because I needed to duplicate certain parts of the tree in the user interface code, for instance when adding a layer or a filter.

The whole structure is converted into object form when the application is run. I wrote simple traversal functions that can be used to find elements in it. The user interface tree can also be modified at runtime.
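A minimal sketch of what such a traversal might look like, using plain dicts in place of the real element objects (the structure below is a hypothetical, simplified version of the tree):

```python
# Hypothetical in-memory tree after parsing the YAML definition.
tree = {
    "type": "VerticalContainer",
    "children": [
        {"type": "Label", "name": "Filter layers v0.9", "children": []},
        {"type": "VerticalContainer", "name": "layers", "children": [
            {"type": "PushButton", "name": "Add layer", "children": []},
        ]},
    ],
}

def find_element(node, name):
    """Depth-first search for the first element with the given name."""
    if node.get("name") == name:
        return node
    for child in node.get("children", []):
        found = find_element(child, name)
        if found is not None:
            return found
    return None
```

Because the tree is just nested objects, runtime modification amounts to appending to or removing from a container's children list and letting the next redraw pick up the change.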

It is possible to attach an event handler to each element. By default the names of the event handlers are derived implicitly, following the Convention over Configuration principle. So if you have a PushButton named "Add layer", you can expect it to use an event handler known as add_layer. Of course this is not always desirable, so I also made it possible to define event handlers explicitly.
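The convention could be sketched like this; the helper names are hypothetical, not BUI's actual API:

```python
def default_handler_name(element_name):
    """Derive a handler name from an element name by convention:
    lowercase, spaces replaced with underscores."""
    return element_name.lower().replace(" ", "_")

def resolve_handler(element, handlers):
    """Prefer an explicitly defined event_handler; otherwise fall
    back to the name derived by convention."""
    key = element.get("event_handler") or default_handler_name(element["name"])
    return handlers.get(key)
```

So a PushButton named "Add layer" resolves to add_layer automatically, while the "X" button in the snippet above overrides the convention with its explicit quit_script handler.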

At one point during the development I noticed that I needed something more than just the user interface definition and events. I realised that certain constraints apply to the user interface all the time. For instance, layers have to be numbered starting from one and increasing by one up to nineteen. Or "Show filter" must be renamed to "Show filters" should a layer contain more than one filter, and vice versa.

To solve this issue I implemented a constraint system. Constraints are named as some_descriptive_name_here_constraint. They are evaluated each time Blender's script window is redrawn. I suppose some optimizations could be done, but I am not too worried about performance at this point as the constraints defined so far are pretty simple anyway.
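The two constraints mentioned above could be sketched as plain functions run on every redraw; the data layout and function signatures here are illustrative assumptions, not the script's real code:

```python
def layer_numbering_constraint(layers):
    """Layers are numbered from one, increasing by one, up to nineteen."""
    for number, layer in enumerate(layers[:19], start=1):
        layer["number"] = number

def show_filter_label_constraint(layer):
    """Use the plural label when a layer has more than one filter."""
    layer["toggle_label"] = (
        "Show filters" if len(layer["filters"]) > 1 else "Show filter"
    )

def evaluate_constraints(layers):
    # Called each time Blender's script window is redrawn.
    layer_numbering_constraint(layers)
    for layer in layers:
        show_filter_label_constraint(layer)
```

Running every constraint on every redraw is brute force, but with a handful of cheap checks like these the cost is negligible.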

At the moment I am in the process of generalizing this whole system and restructuring it into proper modules. The idea is that there is a common abstract part, and based on it, drawing and event handling implementations can be written for the desired platform, such as Blender or Pyglet. In other words, should you want to use the basic system, all you need to do is subclass the event manager, write the elements and hope it works. :)
