BlenderPeople Development

Tracking the development of the BlenderPeople script suite.

Friday, March 24, 2006

Another demo video of the same setup. Improvements this time include a fix for the foot skating at the beginning of the cycle, some additional parameters and motion for better realism, and an improved rig.

3.6 MB Quicktime (again)

A question was asked about speed: this run generated the motion for 700 frames in less than a second.

(Here's the secret -- when you bake object-level motion to action-level motion, all the positions are already there. You just have to sample them properly. So, at frame x, you just take the right foot position that already exists for frame x+12, and the left foot position from frame x-12, and reset the keys.) Simple and fast!
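Here's a rough sketch of that resampling step in plain Python, with the baked positions standing in as an ordinary dictionary rather than real Ipo data -- the function name, the data layout, and the 12-frame offset are just illustrative, not the actual BlenderPeople code:

SKETCH
======
def resample_feet(baked, first, last, offset=12):
    """Re-key the feet by sampling the baked positions out of phase.

    baked maps a frame number to a (left_foot, right_foot) pair of baked
    positions; offset is the half-cycle offset (12 frames here)."""
    def clamp(frame):
        # stay inside the baked range at the start and end of the walk
        return max(first, min(last, frame))

    new_keys = {}
    for frame in range(first, last + 1):
        left, _ = baked[clamp(frame - offset)]   # left foot: position from 12 frames back
        _, right = baked[clamp(frame + offset)]  # right foot: position from 12 frames ahead
        new_keys[frame] = (left, right)
    return new_keys

Since every new key is just a lookup into positions that already exist, the whole pass is a single linear sweep over the frame range, which is why 700 frames come out in well under a second.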

posted by Roland  # 10:57 AM

Thursday, March 23, 2006

3.6 MB Quicktime Walking movie

This animation was generated by:

1. keyframing my rig around the terrain; in the final version, BlenderPeople does this automatically, but I did it by hand here for the sake of simplicity.

2. going into NLA and doing Bake Object to Action, a feature of my particular build of Blender.

3. Hand-tweaking the resulting bone Ipos, because there's a deficiency in the way that Blender sometimes interpolates quats. Hopefully this problem will be tackled by Someone in the near future.

4. Setting a stride length and step height.

5. Running the script.

That's it. Admittedly, the animation is not cinema (or even game engine) quality yet. That's due in part to my unoptimized rig and the lazy way the script handles arms. But, as you can see, the system I came up with handles off-axis walking, walking backwards, starting and stopping, etc. -- none of which Blender's current stride-based system can even attempt.

So, it's on to refinement...

(The first refinement is already done, but too late to make the video: the first footstep causes the planted-foot glide you can see in the video, and that is now fixed.)

posted by Roland  # 1:36 PM

Monday, March 20, 2006

The automatic walking system I outlined in the previous post works.

I coded it this morning. No videos yet. Maybe tomorrow.

The system still needs some tweaking, and I'm thinking about a couple of things to make it a little more realistic (shorter strides when moving very slowly, or when ascending an incline, stuff like that). But the basics are down, and it works.

posted by Roland  # 7:08 PM

Sunday, March 19, 2006

Automated Walking

I think I've solved the problem. I wasn't really working on it, but here's the story...

My wife (among other things) sings professionally. Last night, I was at her group's spring concert. They sing chamber music, baroque stuff, and the occasional original modern composition. Mostly, that's not my bag. I enjoy going to the concerts to see and hear her perform, and the group has a high degree of technical proficiency, but to me, that sort of music is for background only. It doesn't hold my attention as the primary attraction.

And so it was that I found my mind cranking on the autogenerated walking problem for BlenderPeople. I haven't had any time to work on it, between helping out the Orange project, buying a new house, and beginning move-in maintenance projects on it. But it's been on my mind. The current state of walking tech in Blender is insufficient for my needs, because Actors in BlenderPeople don't always walk the way they are facing, and don't always walk forwards. Stride Bone and Path walking only support continuous forward motion.

I'd read several papers on the subject, but most of them consisted of building octrees of all possible footsteps on a terrain (per Actor), then searching through those for the one nearest the optimum footstep.

What I had forgotten is that I often do my best creative work with non-English lyric classical music playing in the background. Which was the condition in which I found myself during the first half of last night's concert.

Here's how it will work:

1. Make a list of Actor locations (via the object matrix) at regular intervals in time (every 10 frames, or whatever gives acceptable results).

2. Use this list to create another list of Actor velocities at each location, as well as the key-to-key distance travelled so far. At this point you have a dictionary of evenly spaced frame numbers, each with a corresponding velocity and a corresponding total distance travelled.

3. Bake object-level motion into Action-level motion. To do this, you need the Action Baking patch I wrote that will HOPEFULLY become part of bf-blender after Orange is over.

4. We'll now create a new list containing sets of numbers that represent key positions for the designated foot and hand bones. This list is generated by sampling the key values of those bones in the following manner (there's a rough code sketch of this after the list):

KEY
===
S = stride length (user setting)
K(n) = frame number of the next footstep, based on S and interpolation of distance against frame from the previously built dictionary. In this notation, K(n+1) does not refer to frame n+1, but to the frame of the next footstep. Thus, frame K(n-.5) is halfway between K(n) and the previous stride frame, K(n-1).
LF=Left Foot
RF=Right Foot
LH=Left Hand
RH=Right Hand

On K(n), you add values to the list for LH and RF from frame K(n-.5), and for RH and LF from frame K(n+.5). Then you go to K(n+1) and reverse which pair gets which offset (LH and RF now take the half-stride-ahead sample; RH and LF the half-stride-behind one).

5. Add start and finish positions to the list.

6. Unlink location and rotation Ipos from the hands and feet.

7. Rebuild new Ipos for each using the location and rotation data from the list made in step 4.

8. Go through the final Ipos and raise the z value by the step height (user setting) on the foot that is being processed.

That should do it.
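Here's a rough, Blender-agnostic sketch of the bookkeeping in steps 1, 2, and 4. The real script would pull Actor positions from the object matrix and read the baked Action, so those parts are stand-ins here ('positions' is a plain list of samples and 'sample_pose' is a hypothetical callback); every name in it is illustrative rather than final:

SKETCH
======
def distance(a, b):
    # straight-line distance between two (x, y, z) locations
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def build_track(positions):
    """Steps 1-2: positions is a list of (frame, (x, y, z)) samples taken at
    regular intervals. Returns a list of (frame, velocity, total_distance)."""
    track = []
    total = 0.0
    for i, (frame, loc) in enumerate(positions):
        if i == 0:
            track.append((frame, 0.0, 0.0))
            continue
        prev_frame, prev_loc = positions[i - 1]
        step = distance(loc, prev_loc)
        total += step
        track.append((frame, step / float(frame - prev_frame), total))
    return track

def footstep_frames(track, stride):
    """Find K(n): the (possibly fractional) frame at which each multiple of
    the stride length S is reached, by interpolating between the samples."""
    steps = [track[0][0]]                       # first footstep at the first sample
    target = stride
    for (f0, _, d0), (f1, _, d1) in zip(track, track[1:]):
        while d1 >= target > d0:
            t = (target - d0) / (d1 - d0)       # how far into this interval
            steps.append(f0 + t * (f1 - f0))
            target += stride
    return steps

def limb_keys(steps, sample_pose):
    """Step 4: build the key list. sample_pose(bone, frame) is a hypothetical
    hook that returns the baked loc/rot of a bone at a (fractional) frame.
    At each footstep one limb pair samples half a stride behind, the other
    half a stride ahead, and the pairs swap roles on the next footstep."""
    keys = []
    for n in range(1, len(steps) - 1):              # start/finish handled in step 5
        behind = 0.5 * (steps[n - 1] + steps[n])    # K(n - .5)
        ahead = 0.5 * (steps[n] + steps[n + 1])     # K(n + .5)
        if n % 2:
            lagging, leading = ('LH', 'RF'), ('RH', 'LF')
        else:
            lagging, leading = ('RH', 'LF'), ('LH', 'RF')
        for bone in lagging:
            keys.append((steps[n], bone, sample_pose(bone, behind)))
        for bone in leading:
            keys.append((steps[n], bone, sample_pose(bone, ahead)))
    return keys

Steps 5 through 8 would then take the (frame, bone, pose) entries this produces, add the start and finish positions, rebuild fresh Ipos from them, and lift the z value of whichever foot is mid-swing by the step height.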

Obviously, this is not going to create "Hero" quality walking animation. But, for moving hundreds or thousands of Actors around at once, I think it will work. Good IK and constraint setups will generate secondary motion.

One other thing I've thought about adding to the Blender sources is an "Action Noise" panel for the Actions window. With an Action active, you set up the variance values (+/- time, +/- location, uniform/Gaussian dist.) in a panel, select the Ipo curves you want to affect, then hit the Noise button on the panel. Key values on those curves are randomly moved based on the panel settings. You would expose this functionality to Python as well. This would let you autogenerate things like walking, then throw a little noise into it to make it seem less automatic. It would have to be used judiciously, but the effect would be excellent.
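Just to pin the idea down, here is a minimal sketch of what that noise pass might do, working on plain (frame, value) key tuples instead of real Ipo curves, since neither the panel nor its Python hook exists yet -- all the parameter names are hypothetical:

SKETCH
======
import random

def noise_keys(keys, time_var=0.0, value_var=0.0, gaussian=True):
    """Jitter each key's frame and value by up to roughly +/- the given variances."""
    def jitter(amount):
        if amount == 0.0:
            return 0.0
        if gaussian:
            return random.gauss(0.0, amount / 2.0)   # most samples land within +/- amount
        return random.uniform(-amount, amount)

    noisy = [(frame + jitter(time_var), value + jitter(value_var))
             for frame, value in keys]
    noisy.sort()                # keep the keys in time order after jittering frames
    return noisy

Run over a crowd, each Actor would end up with slightly different timing and placement, which is exactly the "less automatic" feel I'm after.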

When will I get time to code this? I'm not sure. But at least the problem has a solution, and I know what to do now!

posted by Roland  # 11:19 AM
