I worked on this movie in two stages. The first was supporting a real-time CGI puppeteering approach to creating performances, using MIDI-based gloves to drive a CGI double.
A shape modeling pipeline was set up in both XSI and Maya to manage versions of a large number of shapes. Each shape was stored in an XML format that encoded only the sparse data it required, and the shapes could then be automatically rigged up in Maya for the puppeteers.
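None of the original pipeline code is shown here, but the idea of storing only the sparse per-shape data can be sketched roughly. This is a minimal illustration, not the actual format: the element names, attributes, and tolerance are all my own invention.

```python
import xml.etree.ElementTree as ET

TOLERANCE = 1e-6  # ignore vertices that effectively did not move

def encode_sparse_shape(name, base_points, shape_points):
    """Serialize a blend shape as XML, storing only the vertices
    whose positions differ from the base mesh (the sparse data)."""
    shape = ET.Element("shape", name=name, count=str(len(base_points)))
    for i, (b, s) in enumerate(zip(base_points, shape_points)):
        delta = tuple(sc - bc for sc, bc in zip(s, b))
        if any(abs(d) > TOLERANCE for d in delta):
            ET.SubElement(shape, "pnt", id=str(i),
                          d=" ".join(repr(c) for c in delta))
    return ET.tostring(shape, encoding="unicode")

def apply_sparse_shape(xml_text, base_points):
    """Rebuild the full shape by applying the stored deltas to the base mesh."""
    points = [list(p) for p in base_points]
    for pnt in ET.fromstring(xml_text).iter("pnt"):
        i = int(pnt.get("id"))
        for axis, d in enumerate(pnt.get("d").split()):
            points[i][axis] += float(d)
    return [tuple(p) for p in points]
```

Because a facial shape typically moves only a small fraction of the mesh, storing deltas for just the moved vertices keeps each versioned shape file small and makes diffs between versions meaningful.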
The second stage was implementing a 2.5D solution for the performance of the puppet suits, with special attention paid to live feedback in XSI's compositing package: eye direction could be animated in 3D and feedback given within XSI in near real time for playback capture.
It was a fun project, though as far as I know none of it ended up in the film.