BlenderPeople Development

Tracking the development of the BlenderPeople script suite.

Thursday, April 29, 2004

Woo hoo! My last fix for the IPO Curve handle problem made it into the Blender source, just before the CVS freeze for 2.33 (happening today, if everything is on track).

That means BlenderPeople will work out-of-the-box with Blender 2.33! Of course, I'll test it to make sure, and put up an official announcement on the BlenderPeople web page when the release is made.

Also, the vertex painting thing is working, but sometimes Actors want to slog right through restricted areas. I'm still trying to figure out why.

posted by Roland  # 4:52 AM

Tuesday, April 27, 2004

The code is all written for the vertex painting and pathfinding, but I don't have the heart to test/debug it tonight. Tomorrow will have to do.

posted by Roland  # 7:57 PM
Vertex Painting Ground Meshes

While I'm waiting for some feedback on the logic of my pinBone extension to the Blender source, I've decided to dig back into the BlenderPeople code itself.

I've just added support for vertex-painted Ground objects. Here's what it will do:

BlenderPeople checks the red channel of the vertex colors at an Actor's current location on the Mesh. The red channel is used to determine "travelability." Fully painted red (the default for that channel, since vertex colors default to white overall) is the most travelable; black means the area is impassable. So you could give your entire battlefield a mid-level red, give preferred paths a brighter red, and paint areas of the mesh corresponding to walls or pits or boulders black. Actors will attempt to route around dead areas.

It is NOT a full-on pathfinding solution a la A*. It does not look ahead very far. But that is kind of what BlenderPeople is about - generating good looking crowd/battle sims without a lot of statewise overhead. I've run some sims on paper, and I'm thinking it will work very well. Of course, the true test will be when I pop it into the code and watch a real animation. And thanks to LetterRip for pointing me to the article that gave me the inspiration for this system.
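
To give a feel for the kind of check I'm adding to the movement code, here's a rough sketch - not the actual BlenderPeople code, and red_at() is a made-up stand-in for the real vertex color lookup:

import math

# Sketch: test a handful of headings fanned around the direct route and keep the
# one whose sample point lands on the most travelable (reddest) ground.
# red_at(x, y) is an assumed helper returning the red channel (0.0-1.0) of the
# Ground mesh under that point; angles are in degrees.
def best_heading(x, y, target_angle, red_at, step=2.0):
    best, best_red = target_angle, -1.0
    for offset in (0, -15, 15, -30, 30):   # dead ahead is checked first, so it wins ties
        rad = (target_angle + offset) * math.pi / 180.0
        red = red_at(x + step * math.cos(rad), y + step * math.sin(rad))
        if red > best_red:
            best, best_red = target_angle + offset, red
    return best   # black (red == 0.0) samples lose to anything brighter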

Vertex color information is stored with vert location information in the database tree, so the passability of the mesh can be altered dynamically over the course of the simulation, if we so desire later on. Want to keep the modifications? Just don't rebuild your ground tree. Want to ditch them? Rebuild it. Easy enough.
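
The storage side is nothing fancy - something roughly like this (a sketch; the connection details, table and column names are made up and not necessarily what ends up in the release):

import MySQLdb

# Hypothetical layout: each Ground vert keeps its red channel next to its
# location, so repainting passability later is just an UPDATE, not a rebuild.
db = MySQLdb.connect(user="blenderpeople", passwd="secret", db="dbActors")
cur = db.cursor()
cur.execute("""CREATE TABLE IF NOT EXISTS GroundVerts
               (id INT PRIMARY KEY, x FLOAT, y FLOAT, z FLOAT, red FLOAT)""")
verts = []   # in the real script, filled with (x, y, z, red) tuples from the Ground mesh
i = 0
for (vx, vy, vz, red) in verts:
    cur.execute("INSERT INTO GroundVerts VALUES (%s, %s, %s, %s, %s)",
                (i, vx, vy, vz, red))
    i = i + 1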

I have the data structures built for it, and have the vert color analyses working fine. Now I just have to add a couple of checks into the pathfinding section of the movement code. It's a very simple algorithm and shouldn't take long. I'll make another post when it's in and working. A day or two later, I'll follow up with an animation, updated docs, and a 0.6 release.

Incidentally, when Blender 2.33 is out, I'll make a 0.7 release that is compatible with the integrated Python 2.3.3.

posted by Roland  # 9:33 AM

Tuesday, April 20, 2004

Still working on pinned bone. The NLA/Armature/Bones code is a bit obtuse. Many of the developers who have ventured into its hoary depths to make a bug fix or clean the code have called it strange, messy, and/or confusing. Of course, I'm sure there are some Blender geniuses out there who get the whole thing. I am not one of them.

I have the source compiling with preliminary support for pinned bones in place. The NLA Strip structure now has hooks for it, as does the function that my Python addNLAStrip module accesses. So, you can set a pinned bone via Python, and Blender remembers it. It just doesn't actually DO anything yet.

In the meantime, as I'm slogging through the source code, I'm thinking about the next generation of obstacle avoidance. You want a system that's dynamic - so that a big boulder falling into a narrow trail effectively blocks the trail, or bodies piling up get avoided. You also want something with a decent amount of precision for defining irregular barriers like castle walls, round turrets, parked cars, et cetera. I think I've decided on using vertex painting of the ground mesh for this. Black means that "None shall pass!", whereas white means smooth sailing. Shades in between would indicate various levels of "resistance" in the terrain. Thus, you could create favored pathways across a rolling plain, while still allowing movement off the path, if circumstances so warranted.

Building the texture reading parts will not be a problem. The real trick, though, will be the AI algorithm for pathfinding among the different possible routes. I've done a lot of reading on the A* algorithm, and it seems to be the current industry standard for pathfinding. The problem, though, is that it often gives Actors a sort of Godlike intelligence when selecting a path. I don't want that. I'm trying to come up with something that is based on what the Actor can actually see, in the direction they are pointing. What I'm thinking is to have each Actor calculate the "cost" in vertex colors of an approach to its target, weighted for different offsets. As a global preference, you would set the sampling: both the spread of angles it will sample away from a direct route, and the number of vertex color tests along each test vector. It just sounds computationally expensive, though. Grrr.
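
Here's a back-of-the-envelope sketch of what I mean (names and numbers are made up, and red_at() stands in for the vertex color lookup). The worry is right there in the loop: one decision costs vectors times samples color lookups, per Actor, per calculated frame:

import math

def route_cost(x, y, heading_deg, red_at, samples=4, step=2.0):
    # Sum (1 - red) at several points along the test vector; brighter red = cheaper ground.
    rad = heading_deg * math.pi / 180.0
    cost = 0.0
    for i in range(1, samples + 1):
        cost = cost + (1.0 - red_at(x + i * step * math.cos(rad),
                                    y + i * step * math.sin(rad)))
    return cost

def pick_heading(x, y, direct_deg, red_at, spread=45.0, vectors=5):
    # Test a fan of vectors from -spread to +spread degrees off the direct route.
    best, best_cost = direct_deg, None
    for i in range(vectors):
        offset = -spread + i * (2.0 * spread / (vectors - 1))
        c = route_cost(x, y, direct_deg + offset, red_at)
        if best_cost is None or c < best_cost:
            best, best_cost = direct_deg + offset, c
    return best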

If anyone has any suggestions, I'm all ears.

posted by Roland  # 6:02 PM

Friday, April 16, 2004

Pinned Bone NLA Feature Explained

Here's a repost (slightly modified: I removed my plea for help at the end, as I've decided to try to do it myself, and changed a couple of terms) of my comments on blender.org, regarding the "pinned bone" feature talked about in the previous post. BTW, this doesn't just affect BlenderPeople - it affects everyone who does character animation in Blender. Hopefully this will give you a better idea of what I'm talking about:

I don't know how many of you have actually tried to use the NLA system to do the things it was intended to do. It doesn't work.

Ideally, you should be able to create a list of actions for your armature, then layer and sequentialize them in the NLA window, and have your character smoothly animate between them. Here's the problem, and for simplicity's sake, I'm using only a static pose in my actions, but the effect with full motion is the same:

Pose 1:


Pose 2:


When put into the NLA, you get this (small animation, only 96k, please watch):
http://66.134.133.114/nlabad.avi

The problem is obvious. When animating characters, any difference in foot position leads to ice skating. Bad. The only solutions are to either keyframe the armature object itself to move in correspondence with the slide so that the anchor foot stays in the same place in world coordinates, or to have your anchor foot appear in the exact same location relative to the centerpoint of the armature in actions that will be stripped together.

The first solution is incredibly difficult to pull off and defeats the purpose of having NLA. What good is it to chain actions together in NLA, only to have to go in and keyframe things by hand at each and every transition?

Likewise, the second method also defeats the purpose of NLA. You should be able to animate within each action with freedom, not hitting prespecified starting and ending points.

This is why you do not see good, long character animation done with Blender. I've tried, and I'm no hack. The system just isn't useful at this point.

Solution: a new NLA Strip parameter called pinBone. It works like this...

Any NLA strip can have a pinBone set. pinBone refers to any bone in the armature that has been keyed in the Action for that strip. A reverse transform is generated from the difference between the orientation of the pinBone in the action strip and the current orientation of the bone in world space, and is used to transform all the bones of the armature (gradually, if blendin is used) into a position where the pinned bone matches, in world space, the keys in the NLA strip.
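
In matrix terms (leaving blendin out for clarity), the intended relationship is roughly this - a sketch of the idea, not the actual code:

correction = pinBone_world_current * inverse(pinBone_world_in_strip)
armature_world_new = correction * armature_world_old

In other words, whatever transform carries the strip's version of the pinned bone onto the bone's current world-space placement gets applied to the whole armature, scaled by the blendin factor so the correction fades in rather than snapping.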

Here's a sample of what this would look like, if the pinBone parameter of the second pose were set to the bone Foot.Right in the first example (128k):

http://66.134.133.114/nlagood.avi

Beautiful. This is a useful character animation tool.

posted by Roland  # 6:14 AM

Thursday, April 15, 2004

Development continues, so don't despair. As I said a few days ago, I added a module that allows the creation of NLA strips. Once I got into it, though, I recognized a critical weakness in the NLA system, and I'm working on a fix.

The problem is that when you have two actions following one another with the main armature in different positions relative to the object centerpoint, the character slides when the strips blend. The solution I've come up with is to create a new parameter for NLA strips: "PinnedBone". You would, for example, set it to the bone "foot.left", which would apply a translation to the armature at the object level so that the "foot.left" bone in the active strip lands in the same global position and orientation as the bone's current global position. This eliminates sliding and allows you to freely animate with the NLA without concern for object placement within individual actions.

I'm currently going through the code, and seeking help from other developers. Once I have this in place, development of the Python section of BlenderPeople will continue.

posted by Roland  # 4:08 PM
fweeb, if you're really interested, you can get the diff here:
http://66.134.133.114/patch.txt

It'll add a new method to the Armature section of Python. It also adds a new function to editnla.c in /src. You can access the new functionality like this:

Armature.AddNLA(armaturename, actionname, startframe, endframe, repeat)

armaturename and actionname are strings of 30 characters or less. startframe, endframe and repeat are integers. Providing bad armature or action names results in no activity. Bad startframe, endframe and repeat values are automatically adjusted, but that's it. I haven't done any heavy testing, so really weird input data might cause something strange to happen.
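
A minimal usage example, assuming you're running a build with the patch applied (the armature and action names are just placeholders from my test file):

import Blender

# Assumes an armature named "Soldier" and an action named "Walk" already exist
# in the .blend.  Adds an NLA strip spanning frames 1-25, repeated 4 times.
Blender.Armature.AddNLA("Soldier", "Walk", 1, 25, 4)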

Just so you know, this is probably not the final form of the module, so don't go coding War and Peace on top of it, if you know what I mean.

posted by Roland  # 3:50 PM
Okay - I am totally jazzed. I have on my computer a self-compiled build of Blender that has the following Python method:

Armature.AddNLA(ArmatureName, ActionName, StartFrame, EndFrame, Repeat)

I've just tested it with a sample Python script and it adds the appropriate NLA blocks. As E-tek noticed, I wasn't the most confident about being able to do this, but I've done it. It works. Python-generated NLA character animation, here we come. Woo freaking hoo!

posted by Roland  # 3:50 PM
Great news on the Python API access to Blender's NLA system. I just finished writing, compiling and testing a generalized module for NLA strips that allows other code to generate NLA strips from existing Actions, with user-specifiable parameters. There were a couple of places in the original code that made NLA strips, but nowhere that would just accept a bunch of values, including Armature name and Action name, and make NLA from them.

Now that that's done, I just have to write the Python wrapper for it (should be easier than what I just finished), and I can have BlenderPeople adding character animation.

This went a lot faster than I expected. Woo hoo (in Homer Simpson voice)!

posted by Roland  # 3:49 PM
BlenderPeople 0.5 is now released, including a whole new section of controls for working with Actor stats and Types.

One step backward is that I am now recommending against the use of Barrier objects. I'm going to try to introduce texture painting of the Ground mesh to signal area avoidance/denial. I just hate the Barrier code, and it's been causing crashes. Grrrr.

Anyway, get the new BlenderPeople 0.5 and updated documentation from my new website:

http://www.harkyman.com/bp.html

posted by Roland  # 3:49 PM
Okay, update time...
I've been training in Boston this week on some new stuff (unrelated), and I had some time to code. The interface is now done for selecting and modifying stats of individual actors, as well as adding and modifying actor types.

AI command is on hold at the moment. The grunt work is done, but I'm going to tackle NLA first. Whenever I'm hot for a project, and I try to do something else, it just doesn't get my best effort. Right now, I can feel that I'm hot to do the NLA module for Python, with BlenderPeople as its first test bed. So, after this weekend, I'll be starting that. This means that active development on BlenderPeople will hang until it's done. I might kick it in two weeks (HA!), or it might take several months. In any case, it won't be easy.

Next week, I'll put up a 0.5 release that has the GUI elements and ability to access/modify actor and type info.

In the mean time, I encourage you to check out the following animation:
http://www.kaydara.com/products/affiliateProducts/aiimplant_demo/crowdRunning5000.mp4

It's Kaydara's demo file of their crowd sim, called AI.Implant. Their MotionBuilder stuff for individual character anim seems to pretty much rule, but... well, am I crazy, or is this just not that good? I mean, yeah, it's characters, but the gross motion is unconvincing. And that's seen as a premier product in the industry. Hmmmm. I think my gross motion looks better. Of course, they might not be putting their best foot forward, but I certainly would be. Oh well.

Also, I've been watching demos and workflows of other character animation systems, and I will be making proposals to TPTB for some badly needed (but hopefully simple) enhancements to Blender's character anim toolkit.

Later.

posted by Roland  # 3:49 PM
Working on the 0.5/0.6 features, and I'm checking out for the night.

Done so far:

GUI tabs, with a new one for accessing stats directly in BlenderPeople. This means you select an Actor, hit the get Stats button, and Blender queries the database and shows you all the Actor's stats within the GUI. You can adjust them however you please, including manually setting new commanders and orders, then Set the changes back into the database. It's pretty cool.

When I need a break from coding the strategy section, I'll include, just under the Actor stats GUI, a similar tool for editing (and creating new) Actor Types. After doing the groundwork on Actor stats, this'll be easy.

No GUI yet for this next feature, but I have the toughest part of it done: Strategy Rules. Basically, you enter commands into the Strategy screen, using the proper syntax (to be provided in comprehensive docs). These rules are saved and evaluated after each round of motion calculation. Here's an example of a statement:

enemy average Health >= commandedby 145 average Health orders commandedby 145 Retreat 2

This means that if after a turn, the average of all enemy Health is greater than or equal to the average Health of all actors Commanded by Actor 145, then change the orders of all Actors commanded by 145 to Retreat at double speed.

I have the syntax fully worked out. The parser is coded, too, and it makes nice SQL statements out of the strategy syntax. All that's left to do is put in the code that actually checks and implements the statements (easy), then put a GUI on it.
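
For the curious, the condition half of that example rule boils down to SQL along these lines (a sketch only - the table and column names are made up, not necessarily what ships):

# cur is a MySQLdb cursor; my_team identifies the side doing the evaluating.
# Hypothetical schema: Actor(id, Team, Commander, Health, Orders, Speed, ...).
cur.execute("SELECT AVG(Health) FROM Actor WHERE Team != %s", (my_team,))
enemy_avg = cur.fetchone()[0]
cur.execute("SELECT AVG(Health) FROM Actor WHERE Commander = 145")
squad_avg = cur.fetchone()[0]
if enemy_avg >= squad_avg:
    # The "orders" half: everyone commanded by 145 switches to Retreat at double speed.
    cur.execute("UPDATE Actor SET Orders = 'Retreat', Speed = 2 WHERE Commander = 145")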

Also, thanks to everyone for the nice comments, words of encouragement, and offers of help.

I've taken a look at what I'll have to do to write an NLA Python module for the Blender source, and it's a little intimidating. I think I can do it, though I'll need some help. Fortunately, this is the best software community in the world for getting help!

Later!

posted by Roland  # 3:48 PM
MySQL is the engine that drives the machine. The reason for it is twofold:

1. Speeeeeed. When an Actor evaluates its surroundings, it does four (sometimes five) things: find the nearest enemy (by sight, or failing that, by sound), find the nearest ally, find the center of enemy concentration, find the center of allied concentration.

Can you imagine the time it would take Python to cycle through distance calculations between each and every Actor in the sim, at each frame? SQL systems are designed to query and sort lists of results. It's pretty much all they do. For each Actor, I ask it for the first result in a list of opponents/allies, ordered by Distance, and it spits it out. It's also great about returning lists that meet discrete criteria, like everyone trained to a specific Commander, or everyone below a certain level of health, or a certain distance from an object, etc.
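
As a concrete illustration (the schema here is made up, not necessarily BlenderPeople's actual tables), the "nearest enemy" lookup is essentially one ordered query per Actor instead of a Python loop over every other Actor:

# cur is a MySQLdb cursor; the Actor doing the looking is at (my_x, my_y).
# Hypothetical schema: Actor(id, Team, LocX, LocY, Health, ...).
cur.execute("""SELECT id, SQRT(POW(LocX - %s, 2) + POW(LocY - %s, 2)) AS Dist
               FROM Actor
               WHERE Team != %s AND Health > 0
               ORDER BY Dist
               LIMIT 1""", (my_x, my_y, my_team))
nearest_enemy = cur.fetchone()   # None if nobody on the other side is left standing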

I'm also thinking about scalability. If you have 40,000 Actors, this might slow things down. If a professional organization (or an enterprising individual) uses this some day, they could run MySQL on a clustered system without any modification. Zoom! It's as fast as you need it to be.

Task splitting. Once again, when you get into high numbers or very detailed terrains, things will most likely slow to a crawl even on the snappiest single system. You can have one machine running MySQL at full speed, and the sim running on another box, full speed there too. If you really need the speed, put Gigabit Ethernet between them.

What it boils down to is that upon thinking about the various things this kind of program would have to do, a whole lot of them were problems that good database software had licked a long time ago, and I felt no need to reinvent the wheel when others had done it MUCH better than I could.

2. Portability. If you want to run things over and over with similar settings, or run the sim for a while then put it away and return later, you need some way to save your state. I could have written exporters/importers and created a file format, but using MySQL, I didn't need to. If you want to take a break from running your sim, you just stop. Your state is waiting for you in the DB. You can examine it with any DB tool you care to. If you want to pull up and take the sim to work, you just grab the dbActors folder from the MySQL folder tree and drop it in place at work. Voila! As I mentioned above, you don't even need to do this - you can run the sim with a connection to the db over the Internet. In fact, this is what I did when I was first working on this.

But what's the justification for making people jump through hoops to use this? Wouldn't it be easier to have it run all with just plain Python out-of-the-box? Yes. But it would have been significantly slower. Also, to get good results out of BlenderPeople, you have to do some work. The odds are that if you (and that's a general "you", not "you" personally S68!) can't follow the install instructions, or are disinclined to because of the effort, you wouldn't be happy with the results you get from the scripts anyway. It takes some tweaking and playing with stats to get cool looking battle configurations.

Will that cut down on appeal for the casual user? Yes. Do I care? Not so much. If someone needs the tool, I suspect they'll use it regardless. This is one of those "I'm glad people enjoy it" kinds of things, but I'm really doing this just because I find it an interesting problem and fun to do.

I am a long-winded cuss.

posted by Roland  # 3:48 PM
UPDATE: I'm working on a multiple-frame GUI at the moment (thanks, S68, for the cool code) that will allow access to character and type stats, so you can adjust things directly from within Blender instead of using MySQL Control Center, and allow the use of AI conditional strategy for commanders.

It's a simple expression language - so simple I may just include a GUI-based expression builder. It'll start out as a "command line" style thing, though. In the end, it may be a waste of time, as it might be easier and faster to just watch the sim and change things by hand as you see fit. You'll probably get better results that way anyway, but you never know, so I'm going to give it a shot.

Also, don't get your hopes up for a full installer. Blender is cross-platform, as are all the components necessary to run Blender People. I'm not even going to try to create installers/etc. that will work on every platform and take into account every possible system configuration. I've tried to make the docs pretty good about installing the different packages.

Later, everyone!

posted by Roland  # 3:47 PM
Blender People 0.3 is ready for you to test. The link to the 1MB .zip file is here:
http://66.134.133.114/BlenderPeople.0.3.zip

There's a great Documentation .pdf included. For the love of mercy, please read it first. Requirements for running it are in the docs, but I'll reproduce them here, just so you know what you're getting into:

Blender 2.32
Python 2.2.3
mySQL 4.0
mySQLdb 0.9
A computer

If you can't even get Python working on your system, head over to here for help. I'll work with everyone to resolve Python-based problems generated by the Blender People system, but I unfortunately do not have time to help everyone get Python installed properly on their computers.

URLs for the different packages you need can be found in the docs. I recommend that you follow the installation order presented there. Linux, Mac, and other users - all of this should work for you, too. You'll have to go get the mySQLdb stuff yourself (I've given you a link), and to get good recording of animations, you'll have to roll your own binary - but I've given you instructions for that, too.

I WILL attempt to help you troubleshoot mySQL installation/setup problems, as it is the bread and butter of this system. So post your mySQL woes here, and I'll do my best.

Other things I'd love to see posted in this thread: cool animations you've made with the script; comments/criticisms of how the Actors react to different orders; documentation deficiencies; errors/script crashes that occur and what the console messages were that went with them.

If I get good crash/problems feedback, I'll be able to fix it up for the next milestone (also in the docs).

As of today, I'm taking a coding break from this project, for hopefully only a week, to work on some stuff that I'm actually getting $ for. Gotta pay the bills somehow. I'm on elYsiun all the time during the day (GMT-5) and I browse here quite frequently, so I'll try to be responsive to requests for help/problem reports.

Thanks to locash for making the GUI (get back to me man! check your pm's!).

Have fun!

posted by Roland  # 3:47 PM
You make some excellent points. I wouldn't be surprised if characters are running through each other in LOTR. There's so much to look at, and if you just spent 4 days rendering final production and your deadline is Tuesday and you notice some dude pass through another one - you're probably going to let it go.

That said, my Actors have a preferred distance from and proximity to allied Actors. They also have a physical "no-touch" radius. This is one of the things I'm working on improving right now. One of the things that has made me happy when I look at some of my larger sims is seeing things like waves and moving columns develop where I hadn't specifically set it up to do so.

As for stepping over dead bodies, I've been contemplating ways to do that. First, you could simply add a Barrier entry when someone bites it, denying access to that small area of space altogether. That might work if the bodies start stacking up, but I don't like it. Another solution would be to add the dead character's mesh to the ground reference mesh. Once it's part of the terrain, other Actors would be free to move over it as though it were a bump on the ground. Of course, this means regenerating the Ground search tree, which can take a while. I don't want to be doing that every time someone dies. Maybe a new check could be in order when the script sticks an Actor to the ground: it could check whether the Actor is within a small radius of the location of any dead Actor. If so, the Z-value would be modified slightly to approximate stepping on the body. It would ignore arms and legs - but if everything's moving fast enough, this might be close enough for most cases. Of course, you could always tweak something by hand if it was in slo-mo, or right in front of the camera.
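
Something along these lines is what I have in mind for that last idea (a sketch with guessed-at numbers and made-up names):

# Purely a sketch - the radius and bump values are guesses, and the list of
# dead Actor positions would really come out of the database.
BODY_RADIUS = 1.5   # how close to a fallen Actor counts as stepping on it
BODY_BUMP = 0.4     # rough torso thickness added to the ground Z

def adjust_z(x, y, ground_z, dead_positions):
    # dead_positions: list of (x, y) locations of dead Actors
    for (dx, dy) in dead_positions:
        ox, oy = x - dx, y - dy
        if ox * ox + oy * oy < BODY_RADIUS * BODY_RADIUS:
            return ground_z + BODY_BUMP
    return ground_z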

Anyway, thanks for the good thinking.

posted by Roland  # 3:46 PM
I wrote documentation today for the first "release" of this. I'm calling it Blender People until I think of something better.

Here's the docs in a 44k pdf:
http://66.134.133.114/BlenderPeople0.3.pdf

So if you're wondering what's going on with this project, you can now read all about it. I'm hoping to have a release "package" ready this week.

posted by Roland  # 3:46 PM
Quad tree searching of terrain is sooooo much faster.

http://66.134.133.114/battlegroundcomp3.avi
Divx 5.1, MP3 audio, 1.1 MB, 15 seconds.

Here's the stats for this video:

400 Actors. 4000-face ground mesh. 1 calculated turn every 12 frames. Each calculation turn: 12 seconds. That time drops as Actors die and no longer evaluate. Final frame calc times were down to 6 seconds apiece. Rendering (not like it's anything great) with blue hemilight and orange sunlight with ray shadows (which I freaking love).

I know this doesn't look much different to you guys, but it represents a massive (ha!) speed increase. Bugs fixed: actors no longer get stuck for no reason (maximum turn radius for some Actors was less than 0), and actors don't start randomly spinning nearly as often. They still do, but not as much as before.

One thing left to do before I stick in the GUI (and I've had much help, so thanks to inciner8 and locash for the superfast help - you guys just saved me a week's worth of work, at least). That one thing is that right now, when actors are trying to head up too much of a slope, they slow to a stop. I need to update things so that they alter their trajectory to run more parallel to the sloping area. Should be pretty simple.

Once I have the GUI put in, I'll make a package with my IPOCurve Blender build, a test .blend, and instructions for mysql and the mySQLdb Python hooks, so you all can play with it.

posted by Roland  # 3:45 PM
Work is continuing -

While I was refining the code for directional awareness, I removed the scriptlinks to the FollowTerrain script for simplicity's sake. Upon putting them back in, things slowed to a crawl. FollowTerrain slows things down much more than I realized - by a factor of 20 or so with only a moderately complex Ground mesh. If you want a better Ground mesh, which would be essential for good final renders and smooth hilltops, etc., you're going to have to bump up subsurf or subdivide the thing, and FollowTerrain slows down proportionally to the number of faces in the mesh.

FollowTerrain checked every face in sequence to see if the object was above it, then calculated the proper elevation when it found it. The main script of this whole thing calls FollowTerrain twice per Actor per calculated frame. A Ground Mesh with 3000 verts (not that many) with 800 actors had things running at about 1 actor every quarter second. Way tooooooo sloooooow for me.

Today I learned about quadtree and octree searches. So now, the script builds a quadtree of the ground mesh upon initialization and stores it in (you guessed it) a mySQL table. This lets you do lightning-fast queries based upon your quadrant criteria. That same 3000-vert mesh now only requires four rounds of very simple math, followed by a simple query and the examination of 8 or 9 faces. Much, much faster than searching all 3000 faces. I'm still getting deformed trees at the moment, but it's moving significantly faster. I'll probably spend the rest of the week working on this code, as it's new to me, and recursive stuff has always made me feel a little wobbly.
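
For anyone wondering what the "four rounds of very simple math" are, here's the gist - a sketch with made-up table and column names, not the actual code. Each face gets filed under a cell code built by halving the Ground's bounding box four times; the lookup recomputes that code for the Actor's position and pulls just the matching faces:

# Sketch only - GroundTree(QuadCode, FaceIndex) is a hypothetical table and
# cur is a MySQLdb cursor.
def quad_code(x, y, min_x, min_y, max_x, max_y, depth=4):
    code = ""
    for i in range(depth):
        mid_x = (min_x + max_x) / 2.0
        mid_y = (min_y + max_y) / 2.0
        if x < mid_x:
            max_x = mid_x; code = code + "0"
        else:
            min_x = mid_x; code = code + "1"
        if y < mid_y:
            max_y = mid_y; code = code + "0"
        else:
            min_y = mid_y; code = code + "1"
    return code   # e.g. "01101100" - two characters per level of the quadtree

code = quad_code(actor_x, actor_y, gmin_x, gmin_y, gmax_x, gmax_y)   # ground bounding box
cur.execute("SELECT FaceIndex FROM GroundTree WHERE QuadCode = %s", (code,))
faces = cur.fetchall()   # usually only a handful of faces left to test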

posted by Roland  # 3:44 PM
The scripts run in two passes. Pass one (the one I'm working on now that you are seeing) generates the gross character motions - translations and rotations across the stage. It also writes a log of what action each character is engaged in, including dying.

Once you're happy with the overall look of the scene, you move on to the next step. The second pass uses the action log, the IPO Curves and the NLA system to generate the character animation, including getting hit and biting the dust.

posted by Roland  # 3:43 PM
Here's the thing about the rigged warriors: there is no Python access to the NLA. I'm going to have to write it myself. This might take a couple of weeks. I'm going to contact Hos, who did some major work on the NLA and Actions code, to see if he'll give me a synopsis of how the code works. Once that's done, though, first tests ought to be done on hoverbots, getting the NLA stuff to work with auxiliary motion before tackling walkcycles.

I only have a couple more things to do on this stage of the project before I build a GUI. I've been prepping the code for that this morning. Mr_Rob, have you used mySQL before? What platform are you on? I know this was your idea from the beginning, and if you'd like I can let you run remote tests once there's a GUI, to get user issues with the mySQL integration hammered out.

New addition as of this morning: each Actor is randomly assigned a "turn", which is an integer slot within the resolution of the sim. If you're having the sim run once every twelve frames, then you have 12 "turn" slots. Keys are then generated with offsets corresponding to the turn slot the Actor occupies. This gives a much better overall look, as it appears that Actors are not all moving, changing orders, etc. en masse, but as individuals. The order in which Actors are now evaluated is based on their turn slots and relative speeds. Previously, it was based on creation order.
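
In code terms it comes down to something like this (a sketch with made-up names - the real values live in the database with the Actor's other stats):

import random

FRAMES_PER_TURN = 12   # sim resolution: one calculation pass every 12 frames

def assign_turn_slot():
    # Done once per Actor at creation time and stored with its other stats.
    return random.randint(0, FRAMES_PER_TURN - 1)

def key_frame(base_frame, turn_slot):
    # Keys land at staggered frames so Actors don't all step in lockstep.
    return base_frame + turn_slot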

posted by Roland  # 3:42 PM
The generated motion is stored in IPOs. Once it's generated, you can do whatever you want with it... add actors that are keyframed by hand, remove actors that do really stupid things, etc. So, you can test your lighting, find good camera angles and do anything once the motion is recorded. And as I've said motion generation is pretty fast. Don't like the way things played out this time? Duplicate your Blender file in case you change your mind, then run it again. If you'll be changing camera angles, you could use snippets from different sim results.

Once you have good motion, then the second stage of the script suite kicks in (which is not written yet). You replace your dummy objects with character rigs, which will follow the IPOs. It uses an "action log" that is generated in the motion stage (it's actually a mySql database, with a full log for each actor) to assign appropriate NLA information to character rigs, based on what they are doing and how fast they are doing it.

Currently, there is no NLA access through the Python API. I fixed the IPO recording bug in the cvs for this project, and if I have to, I'll write the NLA module myself. Hopefully someone else will get to it before me, though.

Cheers!

posted by Roland  # 3:41 PM
