Monday, 18 May 2015

Blending nCloth caches using Blendshapes

With many thanks to David Knight, nCloth guru, I present his method for blending two nCloth caches on a per-vertex basis. You can have one half of an nCloth following one cache and the other half following a different cache.

1. Create two simulations of your cloth. Use a copy of the mesh for each simulation. If the meshes do not match exactly (same number of vertices, same vertex order) then this method of blending will not work.

In my example I have one wide simulation and one which is narrow.

2. Cache your simulations.

3. Make another copy of the mesh and label it 'blendMesh'.

4. Select the two nCloth meshes and finally shift-select blendMesh. Create a Blend Shape deformer (Create Deformers > Blend Shape)

5. In the Blend Shape attributes, set the weights for each input to 1.0

6. Assign weights per vertex. You could do this with the Paint Blend Weights Tool (in the Edit Deformers menu), but painting by hand will not give fine enough control, because the sum of the blend weights on each vertex must equal exactly 1.0. You can edit blend weights per vertex manually in the Component Editor, but it is also possible to use an image to set the weights.
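
Under the hood, the deformer is computing a per-vertex weighted mix of the two cache meshes, which is why the weights on each vertex must add up to exactly 1.0. Here is a minimal Python sketch of that arithmetic (the positions and weights are made-up illustration data, not values read from Maya):

```python
# Cached positions of three vertices in each simulation (illustration data).
wide   = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (4.0, 0.0, 0.0)]
narrow = [(0.0, 1.0, 0.0), (2.0, 1.0, 0.0), (4.0, 1.0, 0.0)]

# Per-vertex weight for the 'wide' target; the 'narrow' target gets the
# complement so that the two weights always sum to 1.0.
w = [1.0, 0.5, 0.0]

blended = [
    tuple(wi * a + (1.0 - wi) * b for a, b in zip(pa, pb))
    for wi, pa, pb in zip(w, wide, narrow)
]

# Each blended vertex lands between its two cached positions; because the
# weights sum to 1.0, the mesh never shrinks or overshoots.
print(blended)
```

The middle vertex ends up halfway between the two sims, while each end follows one cache exactly, which is the half-and-half behaviour described above.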

7. I have created some ramps in Photoshop and saved them as TIF files. First I created the blendMap_H ramp, then I inverted the image (Ctrl-I), which subtracts the value of each pixel from 1.0. That inverted image becomes blendMap_H_inverted. This ensures that when the two ramps are added together the result will equal 1.0 everywhere.

I followed the same procedure to create the vertical ramps. The version of the ramp you need will depend on the orientation of your simulations. It's useful to keep a library of ramp variations saved for reuse.
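
The inversion trick above can be sketched in Python; here a ramp is just a list of grayscale pixel values in the 0.0 to 1.0 range (illustration data, not the actual TIF):

```python
# A horizontal ramp: pixel values running from black (0.0) to white (1.0).
blendMap_H = [0.0, 0.25, 0.5, 0.75, 1.0]

# Inverting (Ctrl-I in Photoshop) subtracts each pixel value from the
# maximum, giving the complementary map.
blendMap_H_inverted = [1.0 - px for px in blendMap_H]

# The two maps now sum to exactly 1.0 at every pixel, so together the two
# blend targets account for the whole vertex position.
assert all(a + b == 1.0 for a, b in zip(blendMap_H, blendMap_H_inverted))
print(blendMap_H_inverted)
```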

8. Apply the blendMap ramp to the Blend Shape deformer. Choose one of the Targets on the Blend Shape node and then under the Attribute Maps section, press Import and browse to where the blendMap ramps are stored.

Once the blendMap is assigned to the first target, choose the second target and assign the inverted blendMap to it.

That's it. You should now have a mesh in which one end follows one cache and the other end follows a different cache.

Thursday, 7 May 2015

nCloth Matching Mesh Constraint

A high resolution nCloth can be very slow to simulate. One method is to generate a low resolution nCloth to produce the large-scale movement you require, then simulate a high resolution nCloth which follows the low resolution mesh at the large scale but displays small-scale details of its own.

Here is one way to set up this system.

  1. Create a low resolution nCloth and simulate the large-scale motion. I will call that low resolution nCloth mesh "cloth_L0"
  2. Cache cloth_L0
  3. Smooth cloth_L0 using Mesh > Smooth. Be careful not to add too many divisions, as very high subdivision levels will significantly slow the simulation. I usually choose 1 to start with and then repeat the process if I need more detail.
  4. Export the smoothed cloth_L0 as Alembic using Pipeline Cache > Alembic Cache > Export Selected to Alembic. If you want to preserve UVs, remember to tick the check box in the options box.
  5. Import the Alembic file back into your scene. Rename that imported mesh as "Alembic_Import_L1"
  6. Duplicate Alembic_Import_L1. Rename the duplicate "cloth_L1"
  7. Create an nCloth from cloth_L1
  8. Select cloth_L1 and shift-select Alembic_Import_L1, then create an Attract to Matching Mesh constraint using nConstraint > Attract to Matching Mesh 
  9. In the constraint, choose a Dropoff Distance that makes sense in your scene. You want cloth_L1 to be able to deviate just enough from Alembic_Import_L1 to add some good detail, but not so much that it no longer follows the large-scale motion of the original simulation.
  10. In the Strength Dropoff ramp, create a profile that has a value of 0 on the left and 1 on the right. An exponential curve will work well.
  11. Tune the forces acting on cloth_L1 to give a variation over the movement of cloth_L0.
You should now have a high resolution nCloth which follows a low resolution cloth but has extra details. This process can be applied any number of times, depending on the power of your workstation.

In my example above I have chosen to use a division level of 2 because the original mesh was so low resolution I knew I would require quite a lot more resolution to get more detail.
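
The Strength Dropoff ramp in step 10 maps normalised distance (0 = on the guide mesh, 1 = at the Dropoff Distance) to attraction strength. Here is a Python sketch of the kind of exponential profile described; the cubic exponent is an illustrative choice, not Maya's actual ramp interpolation:

```python
def attract_strength(distance, dropoff_distance, exponent=3.0):
    """Sketch of the matching-mesh attraction profile.

    Returns 0 when the high-res cloth sits on the guide mesh (free to
    wrinkle), rising steeply towards 1 as it approaches the Dropoff
    Distance, so it can never drift far from the low-res motion.
    """
    t = min(max(distance / dropoff_distance, 0.0), 1.0)  # normalise to 0..1
    return t ** exponent

print(attract_strength(0.0, 2.0))  # on the guide mesh: no pull
print(attract_strength(1.0, 2.0))  # halfway out: gentle pull
print(attract_strength(2.0, 2.0))  # at the dropoff distance: full pull
```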

Wednesday, 6 May 2015

Velocity Field from Moving Geometry

If you want to create a velocity field from a moving mesh, here is a way to do it:

1. With your geometry selected, emit nParticles.

2. For the emitter:
  • Set Emitter Type to 'Surface'
  • Increase the Rate to, say, 50000 (depending on the size of your mesh)
  • Key the Rate so that emission stops after a couple of frames
  • Set Emission Speed and Normal Speed to 0
  • Check the 'Need Parent UV' option

3. Add the following per-particle attributes:
  • parentU
  • parentV
  • goalU
  • goalV

4. Make a creation expression on the nParticle object, copying the emission UVs into the goal UVs so that each particle sticks to the point on the surface where it was born:

  goalU = parentU;
  goalV = parentV;

5. Assign the geometry mesh as a goal for the nParticles. Set the Goal Smoothness to 0 and the Goal Weight to 1.0.

 Now you should have some particles sticking to the mesh.

6. Create a fluid container. You can use Auto Resize if you want.

7. Select the fluid and the nParticles, then create a fluid emitter.

8. Set the emission to zero for Density, Heat and Fuel. Set the Emission Speed attributes to 'Add' and the Inherit Velocity to a value greater than zero.

That's it. You should now have the nParticles emitting velocity in the fluid. You can visualise the velocity field with the Velocity Draw option on the Fluid shape node.

You can use the velocities generated by this method to drive other simulations - nCloth, particles or fluids.
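
Conceptually, the 'Add' emission mode means each sticking particle deposits a fraction of its velocity (the Inherit Velocity value) into the fluid voxel it occupies. A simplified 1D Python sketch with made-up grid and particle data:

```python
# One row of fluid voxels, each storing a scalar velocity (1D for brevity).
voxel_velocity = [0.0] * 5
voxel_size = 1.0
inherit = 0.5  # the emitter's Inherit Velocity setting

# (position, velocity) pairs for particles stuck to the moving mesh.
particles = [(0.5, 2.0), (2.2, -1.0), (2.8, -1.0)]

for pos, vel in particles:
    i = int(pos / voxel_size)           # voxel containing this particle
    voxel_velocity[i] += inherit * vel  # 'Add' mode: accumulate velocity

# Voxels with no particles keep zero velocity; voxels with several
# particles accumulate all of their contributions.
print(voxel_velocity)
```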

Wednesday, 22 April 2015

Softimage button to apply a saved preset to a tool

Let's say you have a good preset for the Curve_To_Mesh tool and you want to apply that preset many times to different curves. It is slow to keep loading the preset manually each time you apply the tool.

Here is a way to apply the tool and then apply the preset in one handy button:

Firstly create the preset for the tool and save it somewhere. You will need the path to the preset later.

Next, open the script editor and copy the command from a previous usage. There are some command arguments that I am not yet familiar with, so copying from a previous usage guarantees that the syntax is correct.

for each curveObj in Selection
    ApplyGenOp "CurveListToMesh", , curveObj, siUnspecified, siPersistentOperation, siKeepGenOpInputs
next

for each polyObj in Selection
    LoadPreset "C:\Users\3d\Autodesk\Softimage_2012_Subscription_Advantage_Pack\Data\DSPresets\Operators\d1.Preset", (polyObj+".polymsh.CurveListToMesh")
next

Now I create a new Shelf

In the new shelf, I create a new Toolbar

Now I can drag my code from the script editor into the toolbar. That creates a button.

The first loop reads the selection and runs the tool on the selected curve(s).

Softimage will have the newly created poly mesh object already selected, which makes the next part so much easier.

The second loop gets the name of the selected poly object and applies the preset to the stack. This is where you will need the location of the preset. Also, note the syntax of the last argument.

Having come from a Maya and MEL background I found this syntax really easy to pick up.

Monday, 20 April 2015

Extending a camera track in PFTrack

If you have tracked a shot in PFTrack and the shot then gets extended, you may want to extend your track while keeping the old solve. Here is the workflow that worked for me.

  1. Import the extended clip
  2. Copy your node tree in PFTrack. I created a new Node Page using the P+ button.
  3. Paste your node tree into the new node page. I do this so that I don't accidentally overwrite the existing solve.
  4. If you have any User Tracks, select and export them.
  5. If you have any Auto Tracks, select and export them as well.
  6. Connect the new clip with the extra frames into the top of your tree.
  7. When you connect the new clip, the User Tracks and the Auto Tracks will no longer work. Select all the User Tracks and delete them, then import the tracks you exported in step 4.
  8. Do the same with the Auto Tracks.
  9. Your User Tracks will now have keyframes only where they were previously tracked. You now need to track the un-tracked frames for all of those User Tracks. Select them and press the Track button in the direction you need to fill.
  10. The Auto Tracks will also need to be tracked for the missing frames. Simply select them all and press the Auto Track button. Select 'extend' when the dialogue box appears.
  11. You now have all the trackers in 2D; they need to be solved in 3D. Go to the Camera Solver node and press the Solve Trackers button.
  12. Now you are ready to extend the camera solve. In the Camera Solver node, press the extend button in the direction you need. The camera solve will extend out to the new frames and you should now have a camera for the whole shot which does not deviate from the old solve.

Wednesday, 13 August 2014

Per-Particle Field Attributes

Here is an incredibly handy tip from the FXPHD training course MYA217 Maya Effects by Pasha Ivanov.

If you have a particle system being affected by a field, you can control the magnitude (or any other parameter) of the field on a per-particle basis. Here's how:

Let's say star_nParticle is being driven by approachCurve_volumeAxisField:
  1. Create a new attribute on star_nParticle
  2. Make the new attribute per particle (array)
  3. Name the new attribute approachCurve_volumeAxisField_magnitude

You now have a per-particle attribute to control the magnitude of the field's effect. You can create a per-particle attribute for any of the field's parameters (e.g. alongAxis) but the crucial thing to remember is the naming of the per-particle attribute: it MUST be in the form of

  fieldName_fieldAttributeName

Maya will understand that syntax and make the connection for you.
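
In effect, Maya then evaluates the field per particle, scaling its effect by each particle's own value of the attribute instead of the field's single Magnitude. A Python sketch of the idea (the direction and magnitudes are illustration data):

```python
# Direction the volume axis field pushes along (a unit vector).
field_direction = (0.0, 1.0, 0.0)

# The per-particle attribute overrides the field's single Magnitude value.
approachCurve_volumeAxisField_magnitude = [0.0, 1.0, 2.5]

# Force applied to each particle: its own magnitude times the field
# direction, so one field can push different particles by different amounts.
per_particle_force = [
    tuple(m * c for c in field_direction)
    for m in approachCurve_volumeAxisField_magnitude
]
print(per_particle_force)
```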

Tuesday, 17 June 2014

Passing rgbPP to Instanced Objects

This is a nice one that I have needed quite a lot in the past. Now, thanks to Arnold, it's extremely easy to do.

Here is the problem:

I want to vary the shading on objects that are instanced to a particle system.

In Maya and Mental ray this is not easy to do. In fact I don't know of any way to do it. In Arnold, however, it is very straightforward:

1. Create your particle and instancer system as you normally would do.

2. Assign a shader to your instanced objects (not the instancer object)

3. Create an aiUserDataColor node.

4. Type rgbPP in the Color Attr Name of the aiUserDataColor node.

5. Connect aiUserDataColor.outColor --> diffuse in your shader (or whatever channel you need it to go to).

6. Type rgbPP into the Export Attributes field in the Arnold section of the particle shape node.

 That's it. A really easy and long overdue feature.
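
Under the hood, the aiUserDataColor node simply looks up a named user attribute exported with each particle, falling back to its default colour when the attribute is missing. A rough Python sketch of that lookup (illustrative only, not Arnold's API):

```python
DEFAULT_COLOR = (0.5, 0.5, 0.5)  # the node's fallback Default colour

def user_data_color(particle_attrs, name="rgbPP", default=DEFAULT_COLOR):
    """Return the exported per-particle colour, or the default if absent."""
    return particle_attrs.get(name, default)

# Two instanced particles: one exports rgbPP, one does not.
print(user_data_color({"rgbPP": (1.0, 0.0, 0.0)}))  # the particle's own colour
print(user_data_color({}))                           # falls back to the default
```

This is why every instance can be shaded differently while sharing a single shader: the colour is data on the particle, not on the material.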