Mobile Perspectives

Blender to POD for Oolong



By paul ~ March 23rd, 2009. Filed under: Resources.

For those of you who, like me, are just getting started developing with OpenGL ES for the iPhone, one of the challenges is building some meshes to work with.  If you don’t have lots of money to spend on the real "pro" tools, you’ll want to check out Blender (http://www.blender.org).  Blender is a feature-rich, but somewhat hard to learn, open-source 3D modeling and animation environment.  It even has integrated physics simulation and scripting.  To get up to speed on Blender, I highly recommend the Wikibooks tutorial Blender 3D: Noob to Pro.  Once you face the Blender GUI for the first time, you’ll be very thankful for this grand effort at a tutorial!

Before I go on, I need to state that I’m a real noob myself, and my needs at this point only include the creation of some meshes that I’m going to import into my application and then manage entirely there.  I haven’t yet had a need or chance to try out boned meshes or any of the animation features.  That being said, here are a few guidelines for developing your meshes in Blender with the intention of importing them into an Oolong (PowerVR) application as POD files.

  • Go ahead and create your meshes in Blender, but recognize that rendering is going to be performed in your application, so don’t spend a lot of time trying to get it to look "just right" in the Blender renderer.
  • You can set up initial light and camera locations in Blender.  These are imported in the POD file.
  • Create and apply your materials in Blender.  Materials will be imported in the POD file (although I can’t find the Emission component yet).
  • Note that meshes with different materials on different faces will be split up into multiple meshes (with a single parent node).  For instance, a cube with a different material on each face will be split into 6 meshes, one for each face.  Each of these "child" meshes has a pointer to the parent node.  More on how to deal with this in Oolong later.
  • Right now, I’m applying textures to textured surfaces as place-holders only.  This ensures that the UV coordinates are included in the interleaved vertex data.  In Oolong, textures are imported from PVR files, and I haven’t yet figured out a path from Blender texture to PVR.  If you can help out with this, please chime in!  The result is that you will have to figure out the right texture settings through experimentation.

Once you have your meshes built, you’ll want to do the export/import dance to get to POD.  Right now, the best path appears to be from Blender to Collada 1.4 (via the Blender Export menu), then from Collada to POD via the PowerVR Collada2POD tool.  (Collada2POD is in the utilities folder of the PowerVR SDK distribution.)  Sorry, I couldn’t find a Mac OS X version of the SDK, so I have to use a Windows PC for this step.  Here are the settings I’ve come up with for the export and import.

Export to Collada 1.4 Settings

  • Set the geometry type to triangles.
  • Don’t set anything else.  I tried messing with "UV Image Materials" but didn’t find it useful.

That’s all.

Collada2POD Settings

  • Under geometry options, select the following (not all of these may be required; I didn’t experiment with dropping any):
    • Normals
    • Vertex Colors
    • Mapping Channels
    • Flip V
    • Interleave Vectors
  • PowerVR documentation recommends sorted, indexed triangle strips for performance, so I turned on:
    • Sort vertices with PVRTTriStrip
    • Indexed Triangle List
  • I haven’t selected anything special in the Vertex vector formats.
  • The resulting POD file can be automatically opened in PVR Shaman for examination by simply setting up the path on the Post-Export tab.

I used the Shadowing example as the template for how to load POD and PVR texture files.  Here are a few notes on what I’ve discovered:

  • A "SPODNode" is created for each instance of a mesh, the camera and light, and each parent node (for composite meshes).  The node has a link to its defining mesh, a transform, and a link to its SPODMaterial.
  • A "SPODMesh" is created for each mesh and contains the interleaved, indexed, vertex data.
  • You can draw the children of "composite" nodes ad hoc if you like, because the GetWorldMatrix() methods automatically multiply by the parent’s transform matrix.  However, if you’re doing physics, you’ll want to treat them as a unit (a single btCollisionShape for the composite node).
  • As mentioned before, the Emission component of materials doesn’t come through, so I’m having to handle that manually in my application.
  • Again, I haven’t figured out how to get textures directly from Blender to PVR, so I’m handling them separately: create your texture externally, use it in Blender, convert it to PVR for use in your application, and fiddle until things look right.  (If you’ve got a better way, please tell us noobs about it!)
  • April 24, 2009 — Discovered that if you have a material assigned to an object, and you have also assigned materials to faces, but the material assigned to the object is not used on any of the faces, the Collada exporter will crash.  Make sure that all materials assigned to an object in Blender are actually used.  Delete any unused materials before attempting to export.

Note that there has been some recent discussion of this same topic on the Oolong mailing list, so I really recommend getting hooked up to that.  In particular, there is someone there working on a direct Blender reader (into the ModelPOD structures), and mention of a Blender reader in the Bullet physics toolkit.  So, there may be more direct methods of using your Blender models in Oolong soon.

I’d really like to thank Wolfgang Engel, the PowerVR folks, the Bullet Physics team, and the rest of the Oolong contributors.  My early efforts are showing excellent performance on the device and I’m having a blast with the programming.  I don’t know where things are headed next for Oolong, but it can only get better!

11 Responses to Blender to POD for Oolong

  1. Paul

    Hey Paul! Thanks for sharing your notes and information as you experiment. Have you tried extracting the Collada file from a Google Sketchup export and then running that through the PowerVR tools? I’m wondering if you could get textures to export that way. Here’s how to extract the COLLADA from a Google Earth file: http://sketchupdate.blogspot.com/2009/05/champion-3d-web-using-collada-contest.html

  2. paul

    I haven’t tried that, but it sounds like an interesting path to pursue. I’ll look into it. Since I wrote this post, I have figured out how to use textures properly in Blender and am getting excellent results when exporting through Collada to PowerVR POD. I have a blog post on the topic in progress, but I have to finish an update to iPunt before I can finish the blog post. (Unless I get some spare hours where I’m just too burned out to do real work!)

  3. Paul

    Great! I’m looking forward to it. I’ve tinkered with Blender in the past… but now I’m going through some tutorials in earnest. Maybe by the time I can actually make the art I need, you’ll have your blog post published!

  4. greg

    hi Paul,

    I’m about to run through the oolong101, making some models and getting them imported etc to learn me some iphone/c++/oolong action. I was wondering about the semi-painful toolchain of creating geometry and then having to apply the textures by hand (in code), so I’ll be searching or waiting for your post about the easier pipeline from blender including texturing :) …otherwise I’m assuming that you were previously having to load and apply textures to each face in code (using the mesh nodes you documented above) – fine for small stuff like I’m about to try, but not great for more model data.

    I haven’t checked that part of the oolong code yet, but do the pod importers not have renderers? Is it the coder’s responsibility to set up and draw the faces etc. for models?

    as a non-mac (and not really a c++) programmer, thanks for your other xcode setup and sound tips as well – I’m hoping to get stuck in over the next few weeks to try and make sense of it all. Actually, I’m interested in getting some 2d stuff happening as well (and have noticed a few posts on the oolong list about it vs cocos2d etc) so it will be interesting to see how things come together.

    cheers,

    // greg

  5. paul

    I’ll try to get the texturing blog post finished tomorrow.

  6. greg

    hi Paul,

    good stuff – I just saw the oolong list arrive with your message… I’ll check your texture post out later :)

    I tried wrangling with blender a little yesterday – it’s always been a tricky beast to get started with and I’ve never really gone further than opening it up and doing very simple things. I did mess with the pipeline a little (made a cube, exported to collada, converted to pvr and viewed in shaman), but I haven’t gone as far as oolong stuff yet (much more reading and head scratching required!).

    cheers,

    // greg

  7. paul

    The Blender tutorial states right up front that the GUI is unique and hard to use initially but you’ll grow to love it. Well, I can’t say I love it yet, but at least I can use the tool. Sometimes you really have to hunt for the functionality you need. Thank goodness for the tooltips over the buttons or I’d never figure it out!

    Here’s the link to the texturing post for those who are following behind…
    http://www.BluMtnWerx.com/blog/2009/06/opengl-es-text…oolong-powervropengl-es-texture-mapping-for-iphone-oolong-powervr/

  8. greg

    heh, it is unique… but I’ve messed a little with max and maya and they’ve each got their ways of doing things – I only know enough to know nothing, though. I tried vertex colouring in Blender on Friday and it just gave me white cubes. Awesome :)

    I read some other posts on the list about quake model support, and asking if anyone (particularly you) had managed to get animation from Blender into the iPhone. I can’t say I’m anywhere close to that – the hello oolong world falling cubes compilation is where I’m at, and all this other performance and opengl and c++ memory management programming stuff is currently making my head hurt. I’ll keep following your posts – thanks for sharing the info.

    // greg

  9. Alex

    Hi, Pod format supports animated and skinned models.

    Does anyone know how to export a skinned model from blender?
    Unfortunately the Blender Collada exporter does not support skinning..

    For now the only way I’ve found to create a POD file with an animated/skinned model is to use the 3ds max plugin.. but consider that 3ds max is free only for 30 days and non-commercial use.
    I can’t sell my game without a regular licence..

    so.. any suggestions?

  10. Paul Z

    Hi Paul(s), I’m Paul..

    Any more news on this technique? Specifically, I’m worried about supporting animation like Alex above. Anyone get it to work?

  11. Erwin Coumans

    You could check out readblend or bParse, both can read .blend files directly. In particular bParse has easy access to any data within the Blender file. Check out http://code.google.com/p/gamekit/ or see Oolong Engine/Examples/Demos/readblend

    There is no example how to read the skinned/animation data yet, but it will be added.