Wednesday, 22 April 2015

Softimage button to apply a saved preset to a tool



Let's say you have a good preset for the Curve_To_Mesh tool and you want to apply that preset many times to different curves. It is slow to keep loading the preset manually each time you apply the tool.

Here is a way to apply the tool and then apply the preset in one handy button:

Firstly create the preset for the tool and save it somewhere. You will need the path to the preset later.




Next, open the script editor and copy the command from a previous usage. There are some command arguments that I am not yet familiar with, so copying from a previous usage guarantees that the syntax is correct.




for each curveObj in Selection
ApplyGenOp "CurveListToMesh", , curveObj, siUnspecified, siPersistentOperation, siKeepGenOpInputs
next


for each polyObj in Selection
LoadPreset "C:\Users\3d\Autodesk\Softimage_2012_Subscription_Advantage_Pack\Data\DSPresets\Operators\d1.Preset", (polyObj+".polymsh.CurveListToMesh")
next





Now I create a new Shelf



In the new shelf, I create a new Toolbar


Now I can drag my code from the script editor into the toolbar. That creates a button.




The first loop reads the selection and runs the tool on the selected curve(s).

Softimage will have the newly created poly mesh object already selected, which makes the next part so much easier.

The second loop gets the name of the selected poly object and applies the preset to its stack. This is where you will need the location of the preset. Also, note the syntax of the last argument.
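
If you prefer Python, the two loops can be combined into a single button script. This is just an untested sketch of the same steps - the constants come in through win32com, and the preset path is the one from my machine, so substitute your own:

from win32com.client import constants as c

app = Application
presetPath = r"C:\Users\3d\Autodesk\Softimage_2012_Subscription_Advantage_Pack\Data\DSPresets\Operators\d1.Preset"

# apply the generator to every selected curve (same arguments as the VBScript above)
for curveObj in app.Selection:
    app.ApplyGenOp("CurveListToMesh", "", curveObj,
                   c.siUnspecified, c.siPersistentOperation, c.siKeepGenOpInputs)

# Softimage leaves the new poly mesh(es) selected, so load the preset onto their stacks
for polyObj in app.Selection:
    app.LoadPreset(presetPath, polyObj.FullName + ".polymsh.CurveListToMesh")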

Having come from a Maya and MEL background, I found this syntax really easy to pick up.

Monday, 20 April 2015

Extending a camera track in PFTrack

If you have tracked a shot in PFTrack and then the shot gets extended and you want to extend your track, but keep the old solve, here is the workflow which worked for me.


  1. Import the extended clip
  2. Copy your node tree in PFTrack. I created a new Node Page using the P+ button.
  3. Paste your node tree into the new node page. I do this so that I don't accidentally overwrite the existing solve.
  4. If you have any User Tracks, select and export them.
  5. If you have any Auto Tracks, select and export them as well.
  6. Connect the new clip with the extra frames into the top of your tree.
  7. When you connect the new clip, the User Tracks and the Auto Tracks will not work anymore. Select all the User Tracks and delete them, then import the tracks you exported in step 4.
  8. Do the same with the Auto Tracks.
  9. Your User Tracks will now have keyframes only where they were previously tracked. You now need to track the un-tracked frames for all of those User Tracks. Select them and press the Track button in the direction you need to fill.
  10. The Auto Tracks will also need to be tracked for the missing frames. Simply select them all and press the Auto Track button. Select 'extend' when the dialogue box appears.
  11. You now have all the trackers in 2D; they need to be solved in 3D. Go to the Camera Solver node and press the Solve Trackers button.
  12. Now you are ready to extend the camera solve. In the Camera Solver node, press the extend button in the direction you need. The camera solve will extend out to the new frames and you should now have a camera for the whole shot which does not deviate from the old solve.

Wednesday, 13 August 2014

Per-Particle Field Attributes

Here is an incredibly handy tip from the FXPHD training course MYA217 Maya Effects by Pasha Ivanov.

If you have a particle system being affected by a field, you can control the magnitude (or any other parameter) of the field on a per-particle basis. Here's how:

Let's say starL_nParticle is being driven by approachCurve_volumeAxisField
  1. Create a new attribute on starL_nParticle
  2. Make the new attribute per particle (array)
  3. Name the new attribute approachCurve_volumeAxisField_magnitude


You now have a per-particle attribute to control the magnitude of the field's effect. You can create a per-particle attribute for any of the field's parameters (e.g. alongAxis), but the crucial thing to remember is the naming of the per-particle attribute: it MUST be in the form of

fieldName_parameterName

Maya will understand that syntax and make the connection for you.
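
If you would rather do steps 1 to 3 with a couple of lines of Python than through the Add Attribute window, something like this should do it (untested sketch; I am assuming the shape node is called starL_nParticleShape):

import maya.cmds as cmds

# the particle shape being affected by the field
particleShape = 'starL_nParticleShape'

# per-particle (array) attribute named fieldName_parameterName
cmds.addAttr(particleShape,
             longName='approachCurve_volumeAxisField_magnitude',
             dataType='doubleArray')

# Maya now uses this array to scale the field's magnitude per particle.
# Fill it however you like, e.g. from a creation expression:
# approachCurve_volumeAxisField_magnitude = rand(0, 1);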




Tuesday, 17 June 2014

Passing rgbPP to Instanced Objects

This is a nice one that I have needed quite a lot in the past. Now, thanks to Arnold, it's extremely easy to do.

Here is the problem:

I want to vary the shading on objects that are instanced to a particle system.

In Maya and Mental ray this is not easy to do. In fact I don't know of any way to do it. In Arnold, however, it is very straightforward:

1. Create your particle and instancer system as you normally would do.

2. Assign a shader to your instanced objects (not the instancer object)

3. Create an aiUserDataColor node

4. Type rgbPP into the Color Attr Name field of the aiUserDataColor node

5. Connect aiUserDataColor.outColor --> diffuse in your shader (or whatever channel you need it to go to)

6. Type rgbPP into the Export Attributes field in the Arnold section of the particle object.


 That's it. A really easy and long overdue feature.
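
For reference, here is roughly the same wiring as a Python snippet. Treat it as a sketch - the node and shader names are placeholders, and the MtoA attribute names (colorAttrName, aiExportAttributes) are from the build I am using, so they may differ in other versions:

import maya.cmds as cmds

particleShape = 'nParticleShape1'   # the emitting particle shape
shader = 'aiStandard1'              # the shader assigned to the instanced objects

# aiUserDataColor node that reads the exported rgbPP values at render time
userData = cmds.createNode('aiUserDataColor')
cmds.setAttr(userData + '.colorAttrName', 'rgbPP', type='string')

# pipe the per-particle colour into whichever channel you need on the shader
cmds.connectAttr(userData + '.outColor', shader + '.color', force=True)

# tell Arnold to export rgbPP from the particle shape
cmds.setAttr(particleShape + '.aiExportAttributes', 'rgbPP', type='string')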









Friday, 13 June 2014

Passing ageNormalized to Arnold

Passing the ageNormalized attribute from Maya particles to an Arnold shader is extremely useful and extremely not documented in Solid Angle's user guides. I will show two ways to do it - one is my own recipe, and one is from Pedro Gomez on the MtoA list.

Here is the problem:


I would like to pass my particle's ageNormalized to a shader, rather than age.

As you may be able to see from the screenshot, passing age sort of works, but not quite. Some of the oldest particles have reached the end of the colour ramp and wrapped around to the beginning of the ramp again.

If I try just typing ageNormalized into the Export Attributes, it does not work at all: Arnold just reads the first value of the ramp and applies that value to every particle.

Is there a smart workaround for this? Can I put age/lifespanPP somewhere in the shader? But where? And talking of export attributes, can I put more than one attribute in there (e.g. age, lifespanPP, velocityPP)?







First is my method: 

1. Add a new Dynamic Per-Particle Attribute, userScalar1PP, say.

2. Add the runtime expression:

nParticleShape1.userScalar1PP=nParticleShape1.age/nParticleShape1.lifespanPP;


3. Put userScalar1PP into the Export Attributes

4. Connect the particle sampler to the shader ramps, but use the userScalar1PP attribute instead of age.
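
Steps 1 to 3 as a rough Python sketch (untested; nParticleShape1 is a placeholder name, and step 4 - the particle sampler / ramp hookup - is easier to do in the Hypershade):

import maya.cmds as cmds

particleShape = 'nParticleShape1'

# 1. new per-particle attribute to hold the normalised age
cmds.addAttr(particleShape, longName='userScalar1PP', dataType='doubleArray')

# 2. runtime (after dynamics) expression that fills it every frame
cmds.dynExpression(particleShape,
                   string='nParticleShape1.userScalar1PP = nParticleShape1.age / nParticleShape1.lifespanPP;',
                   runtimeAfterDynamics=True)

# 3. ask Arnold to export it with the particles
cmds.setAttr(particleShape + '.aiExportAttributes', 'userScalar1PP', type='string')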






Second is Pedro's way - more correct and elegant:

Export both age and lifespanPP and catch them in two aiUserDataFloat nodes. Then use a Multiply/Divide node to divide age by lifespanPP and pipe the result into your shader. This is much better, as it does not require an expensive runtime expression to be cached.
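
A sketch of that network in Python (again untested - the aiUserDataFloat attribute names are from the MtoA build I am on, and driving ramp1.vCoord is just one way to use the result):

import maya.cmds as cmds

particleShape = 'nParticleShape1'
ramp = 'ramp1'   # the colour ramp in the shader

# export both attributes from the particle shape (space separated)
cmds.setAttr(particleShape + '.aiExportAttributes', 'age lifespanPP', type='string')

# catch them in two aiUserDataFloat nodes
ageNode = cmds.createNode('aiUserDataFloat')
cmds.setAttr(ageNode + '.floatAttrName', 'age', type='string')

lifeNode = cmds.createNode('aiUserDataFloat')
cmds.setAttr(lifeNode + '.floatAttrName', 'lifespanPP', type='string')

# divide age by lifespanPP
div = cmds.createNode('multiplyDivide')
cmds.setAttr(div + '.operation', 2)  # 2 = divide
cmds.connectAttr(ageNode + '.outValue', div + '.input1X')
cmds.connectAttr(lifeNode + '.outValue', div + '.input2X')

# use the 0-1 result to look up the ramp
cmds.connectAttr(div + '.outputX', ramp + '.vCoord')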



Here is the shader: one ramp for Colour and one for Opacity.




And the answer is yes, you can export any number of attributes, so long as they are separated by a space in the Export Attributes box.







A sample scene is available in Maya 2014 MA format

UV particles via SOuP

Here is a small SOuP technique for producing a plane of particles with a UV gradient on their RGB.

First you need a plane, then emit some particles from that plane. Make sure that the particles have the rgbPP attribute available as we will need to put an expression on it.

Create a SOuP TextureToArray node
Create a SOuP rgbaToColorAndAlpha node
Create two ramp texture nodes - ramp1 is black to red along U, ramp2 is black to green along V

Connect the following:

ramp2.outColor --> ramp1.colorOffset
ramp1.outColor --> textureToArray1.inColor
polyPlaneShape.worldMesh[0] --> nParticleShape.inputGeometry
polyPlaneShape.worldMesh[0] --> textureToArray1.inGeometry
textureToArray1.outRgbaPP --> rgbaToColorAndAlpha1.inRgbaPP

Also connect the emitting plane's transform node to the particle's transform node as shown in the node graph.
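
If you prefer to make those connections with a script rather than the Node Editor, this is the same list as Python (node names exactly as above - rename to match your scene; the transform-to-transform hookup from the node graph is not included here):

import maya.cmds as cmds

cmds.connectAttr('ramp2.outColor', 'ramp1.colorOffset')
cmds.connectAttr('ramp1.outColor', 'textureToArray1.inColor')
cmds.connectAttr('polyPlaneShape.worldMesh[0]', 'nParticleShape.inputGeometry')
cmds.connectAttr('polyPlaneShape.worldMesh[0]', 'textureToArray1.inGeometry')
cmds.connectAttr('textureToArray1.outRgbaPP', 'rgbaToColorAndAlpha1.inRgbaPP')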



Now set the rgbPP using the creation expression:

rgbPP=rgbaToColorAndAlpha1.outRgbPP

Rewind and step forward one frame so that the particles are emitted. Then set their initial state, disconnect the emitter, and break the connection between polyPlaneShape.worldMesh[0] and nParticleShape.inputGeometry.





The particles will now be dynamic again.


In Nuke, plug your rendered particles into the STMap node as shown in the image below


GPU renderers

I will be testing some particle renderers in the next few days - Arnold, Fury and Krakatoa.


This first test is Fury

20 million nParticles
motion blur switched ON
4 x multisampling

13.26 seconds (dual Xeon E5 - 32 cores, 48GB RAM, Nvidia Quadro 4000)

I'm showing the alpha channel only because I currently just have the demo version of Fury and the watermark is distracting in the colour channel.


The next test is Krakatoa

motion blur OFF, render time 11 seconds

It's slightly trickier to get started with Krakatoa, but I think the results look amazing. Again, this is a lot of particles (14 million)


Here is my setup for Arnold, but I cannot seem to get the opacity to work properly.






More details as soon as I can get some help making this work.

I have finally got this working. Please see my later post "Passing ageNormalized to Arnold".