I think I've solved the problem. I wasn't really working on it, but here's the story...
My wife (among other things) sings professionally. Last night, I was at her group's spring concert. They sing chamber music, baroque stuff, and the occasional original modern composition. Mostly, that's not my bag. I enjoy going to the concerts to see and hear her perform, and the group has a high degree of technical proficiency, but to me, that sort of music is for background only. It doesn't hold my attention as the primary attraction.
And so it was that I found my mind cranking on the autogenerated walking problem for BlenderPeople. I haven't had any time to work on it, between helping out the Orange project, buying a new house and beginning move-in maintenance projects on it. But it's been on my mind. The current state of walking tech in Blender is insufficient to my needs, because Actors in BlenderPeople don't always walk the way they are facing, and don't always walk forwards. Stride Bone and Path walking only support continuous forward motion.
I'd read several papers on the subject, but most of them consisted of building octrees of all possible footsteps on a terrain (per Actor), then searching through those for the one nearest the optimum footstep.
What I had forgotten is that I often do my best creative work with non-English lyric classical music playing in the background. Which was the condition in which I found myself during the first half of last night's concert.
Here's how it will work:
1. Make a list of Actor locations (via matrix) at regular intervals in time (every 10 frames, or whatever gives acceptable results)
2. Use this list to create another list of Actor velocities at each location, as well as the key-to-key distance travelled so far. At this point you have a dictionary of evenly spaced frame numbers, with a corresponding velocity, and a corresponding total distance travelled
3. Bake object-level motion into Action-level motion. To do this, you need the Action Baking patch I wrote that will HOPEFULLY become part of bf-blender after Orange is over.
4. We'll now create a new list, containing sets of numbers that represent key positions for the designated foot and hand bones. This list is generated by sampling the key values of those bones in the following manner:
S = stride length (user setting)
K(n) = frame number of the next footstep, based on S and interpolation of the distance and frame values from the previously built dictionary. In this notation, K(n+1) does not refer to frame n + 1, but to the next stride frame. Thus, frame K(n-.5) is halfway between K(n) and the previous stride frame (K(n-1)).
On K(n), you add values to the list for LH and RF (left hand, right foot) from frame K(n-.5), and for RH and LF from frame K(n+.5). Then, you go to K(n+1) and reverse which keys are set (LH, RF from K(n+.5); RH, LF from K(n-.5)).
5. Add start and finish positions to the list.
6. Unlink location and rotation Ipos from the hands and feet.
7. Rebuild new Ipos for each using the location and rotation data from the list made in step 4.
8. Go through the final Ipos and raise the z value by the step height (user setting) on whichever foot is mid-stride.
That should do it.
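Steps 1, 2, and the stride-frame part of step 4 can be sketched in plain Python. This is only a sketch under my assumptions: the function and variable names are hypothetical, and the real version would pull Actor locations out of Blender via the Python API instead of taking a prebuilt list of samples.

```python
import math

def build_motion_table(samples):
    """samples: list of (frame, (x, y, z)) pairs taken every N frames.
    Returns a list of (frame, velocity, total_distance) entries --
    the 'dictionary' of step 2."""
    table = []
    total = 0.0
    prev = None
    for frame, loc in samples:
        if prev is None:
            vel = 0.0
        else:
            pframe, ploc = prev
            dist = math.dist(loc, ploc)
            total += dist
            vel = dist / (frame - pframe)
        table.append((frame, vel, total))
        prev = (frame, loc)
    return table

def stride_frames(table, stride_length):
    """Interpolate the frame K(n) at which each successive multiple of
    the stride length S is reached, per step 4."""
    frames = []
    target = stride_length
    for (f0, _, d0), (f1, _, d1) in zip(table, table[1:]):
        while d1 > d0 and d0 <= target <= d1:
            t = (target - d0) / (d1 - d0)
            frames.append(f0 + t * (f1 - f0))
            target += stride_length
    return frames
```

For an Actor walking a straight line at one unit per frame, sampled every 10 frames, a stride length of 5 yields stride frames at 5, 10, 15, 20, which is what you'd expect. The fractional frames K(n-.5) and K(n+.5) fall out of the same interpolation.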
Obviously, this is not going to create "Hero" quality walking animation. But, for moving hundreds or thousands of Actors around at once, I think it will work. Good IK and constraint setups will generate secondary motion.
One other thing I've thought about adding to the Blender sources is an "Action Noise" panel for the Actions window. With an Action active, you set up the variance values (+/- time, +/- location, uniform/gaussian dist.) in a panel, select the Ipo curves you want to affect, then hit the Noise button on the panel. Key values on those curves are randomly moved based on the panel settings. You would expose this functionality to Python as well. This would let you autogenerate things like walking, then throw a little noise into it to make it seem less automatic. It would have to be used judiciously, but the effect would be excellent.
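The core of that Action Noise idea is small enough to sketch. Everything here is hypothetical: in Blender it would operate on the selected Ipo curve points, but for illustration a key is just a (frame, value) pair, and the gaussian spread is my own guess at how the variance setting would map to a standard deviation.

```python
import random

def noise_keys(keys, time_var, value_var, gaussian=False, seed=None):
    """Return a copy of keys with each (frame, value) pair randomly
    offset by up to +/- time_var frames and +/- value_var units.
    With gaussian=True, the variance setting is treated as roughly
    two standard deviations instead of a hard bound."""
    rng = random.Random(seed)

    def offset(var):
        if gaussian:
            return rng.gauss(0.0, var / 2.0)
        return rng.uniform(-var, var)

    return [(f + offset(time_var), v + offset(value_var))
            for f, v in keys]
```

A seed parameter is worth having in the real panel too: re-running the noise on a crowd of Actors with different seeds keeps them from all jittering identically.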
When will I get time to code this? I'm not sure. But at least the problem has a solution, and I know what to do now!