ft-SSIBL, for “Screen Space Image Based Lighting”, is based on a topic I covered in a previous post about Roy Stelzer’s “2.5D Relighting inside of Nuke”. In this shader I tried to reproduce a few of the approaches found in his Nuke script. So with a normal pass (object or world), you will be able to do some relighting with an HDR map. The shader won’t compute the 9 coefficients (spherical harmonics) needed for you, as described in this paper: http://graphics.stanford.edu/papers/envmap.
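If you want to precompute those 9 coefficients yourself, here is a minimal NumPy sketch of the idea (my own illustration, not part of the shader): it weights each pixel of a lat-long HDR map by its solid angle and integrates it against the nine real SH basis functions listed in the Stanford paper.

```python
import numpy as np

def sh9_coefficients(env):
    """Integrate a lat-long radiance map env (H, W, 3) against the
    nine spherical harmonic basis functions (l <= 2).
    Returns an array of shape (9, 3), one RGB triple per coefficient."""
    height, width, _ = env.shape
    theta = (np.arange(height) + 0.5) / height * np.pi          # polar angle, 0..pi
    phi = (np.arange(width) + 0.5) / width * 2.0 * np.pi        # azimuth, 0..2pi
    phi, theta = np.meshgrid(phi, theta)
    # direction vector for each pixel
    x = np.sin(theta) * np.cos(phi)
    y = np.sin(theta) * np.sin(phi)
    z = np.cos(theta)
    # solid angle covered by each pixel of the lat-long map
    d_omega = (np.pi / height) * (2.0 * np.pi / width) * np.sin(theta)
    basis = [
        0.282095 * np.ones_like(z),                  # Y00
        0.488603 * y, 0.488603 * z, 0.488603 * x,    # Y1-1, Y10, Y11
        1.092548 * x * y, 1.092548 * y * z,          # Y2-2, Y2-1
        0.315392 * (3.0 * z * z - 1.0),              # Y20
        1.092548 * x * z,                            # Y21
        0.546274 * (x * x - y * y),                  # Y22
    ]
    return np.array([(env * (b * d_omega)[..., None]).sum(axis=(0, 1))
                     for b in basis])
```

For a constant white environment, only the first coefficient survives (it equals 0.282095 × 4π), which is a handy sanity check before feeding real HDR data in.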
This node setup allows you to offset your normal pass on the X, Y and Z axes. It could be used in several cases (relighting is certainly one of them), so I thought I would share this node. All you have to do is “Append” this node into your blend file and you are done. It uses a normal pass as input and outputs the new normal pass.
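For reference, here is roughly what such a normal offset boils down to in NumPy. This is my own sketch, assuming the X/Y/Z offsets are rotations applied to every normal; the actual node group may implement it differently:

```python
import numpy as np

def rotate_normals(normals, rx, ry, rz):
    """Rotate a normal pass of shape (H, W, 3) by Euler angles in
    radians, applied in X, then Y, then Z order."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    R = Rz @ Ry @ Rx
    out = normals @ R.T
    # renormalize, since stored normal passes may have filtering drift
    return out / np.linalg.norm(out, axis=-1, keepdims=True)
```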
Yeahh! Matt Ebb just committed my patch (SVN r27733) for the “Color Balance” node in the Blender 2.5 compositing nodes!!! Now it should be much easier to work with.
You can get a build of Blender at Graphicall.org (any build from revision 27733 onwards).
There are still some precision issues with the color wheels; I guess some day it will be possible to move the color picker more slowly.
How to use it?
First, I would recommend unchecking the “color management” setting in Blender 2.5, or it will make the blacks really hard to control.
If you are not so familiar with color grading and push-pull techniques, I would really recommend watching Stu Maschwitz’s (Prolost) video tutorial using Magic Bullet Colorista. The settings won’t be exactly the same, but the approach is pretty much the same!
I described the Lift/Gamma/Gain formulas in a previous post, and this node is mostly based on them. We just modified them slightly so that the three parameters default to 1.0, just like in Colorista, which makes it much easier to control the blacks. Before this revision, the “Color Balance” node used the same formula but with a lift default value of 0.0.
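As a sketch of the idea, here is one plausible Python formulation with all three defaults at 1.0; the exact formula in the node may differ slightly, this just shows how a UI lift of 1.0 can map onto a classic lift of 0.0:

```python
def lift_gamma_gain(x, lift=1.0, gamma=1.0, gain=1.0):
    """Lift/Gamma/Gain on a single channel value x in [0, 1],
    with all three parameters neutral at 1.0 (Colorista-style).
    A UI lift of 1.0 corresponds to a classic lift of 0.0."""
    v = gain * (x + (lift - 1.0) * (1.0 - x))   # lift raises blacks, gain scales whites
    v = max(v, 0.0)                              # clamp before the power function
    return v ** (1.0 / gamma)
```

With everything at 1.0 the function is an exact pass-through, which is the whole point of the new defaults.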
Some presets!
While making some comparison tests between Colorista in After Effects and the “Color Balance” node in Blender, I tried to mimic some of Colorista’s presets.
You can download the “Blender Color Grading Presets” here : http://code.google.com/p/ft-projects/downloads/list
If you are doing matchmove, you have probably bumped into the lens workflow issue, where you have to undistort the footage in your matchmove software, track it and export a new undistorted footage, so your client can composite the 3D render on top of it and then distort it back.
I don’t really like this workflow, since, for instance, After Effects does not have a cubic lens distortion effect, and it would be really hard for the client to match the distortion back.
Thanks to Jerzy Drozda Jr (aka Maltaannon) for his great tips about Pixel Bender.
So now you can create a new comp with your distorted footage > pre-comp it > undistort it with the shader > track it in SynthEyes > export the camera to a 3D package > render the scene > import the render into your pre-comp > deactivate the shader. It should match perfectly 🙂
(yeah I know, a PFTrack grid with SynthEyes… not cool! :p)
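The cubic distortion model itself is tiny. Here is a Python sketch of one common form of it (the k/kcube parameterization used by SynthEyes-style tools); the Pixel Bender shader may handle details like aspect ratio differently:

```python
import numpy as np

def cubic_distort(u, v, k, kcube, aspect=1.0):
    """Map an undistorted normalized coordinate (u, v in [0, 1]) to the
    position to sample in the distorted source image.
    k is the quadratic distortion term, kcube the cubic one."""
    x, y = u - 0.5, v - 0.5                      # center the coordinates
    r2 = (aspect * x) ** 2 + y ** 2              # squared radius from center
    f = 1.0 + r2 * (k + kcube * np.sqrt(r2))     # radial scale factor
    return f * x + 0.5, f * y + 0.5
```

With k = kcube = 0 the mapping is the identity, and the image center never moves, which is what lets the round-trip in the workflow above line up.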
00:00: using the “Avatar” trailer
01:07: using a webcam feed
How, why, cool?
So, really basic programming, but I thought it would look cool. Actually, what was going to be a cool-looking animation turned out to become a cool visualisation tool!
I found out that just by showing the pixels in a 3D space based on their RGB values, you can see several dimensions at once:
Red value: X axis
Green value: Y axis
Blue value: Z axis
Luma value: the vector from black (0,0,0) to white (255,255,255). It means that the closer the point cloud is to the white corner, the brighter the picture (… no kidding 🙂 )
Saturation value: the direction perpendicular to the luma vector. It means that the more saturated the picture, the wider the point cloud, and of course the more desaturated it is, the thinner the point cloud gets. A black & white picture would only show particles along the luma axis.
This one was the least obvious to me (but I’m not really smart :p)
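To make the luma/saturation reading concrete, here is a small NumPy sketch (not the Processing sketch itself) that maps each pixel to a 3D point and measures its position along, and distance from, the black-to-white diagonal:

```python
import numpy as np

def rgb_point_cloud(img):
    """img: (H, W, 3) uint8 image. Returns the point positions (N, 3)
    in [0, 1], each point's 'luma' (projection onto the black->white
    diagonal) and its 'saturation' (distance from that diagonal)."""
    pts = img.reshape(-1, 3).astype(np.float32) / 255.0
    diag = np.ones(3) / np.sqrt(3.0)             # unit vector black -> white
    luma = pts @ diag                            # position along the diagonal
    sat = np.linalg.norm(pts - luma[:, None] * diag, axis=1)
    return pts, luma, sat
```

A pure gray pixel lands exactly on the diagonal (saturation 0), while a fully saturated primary sits as far from it as possible, which is why a black & white picture collapses into a thin line.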
You will have to add a QuickTime movie (.mov) called “vid.mov” to the “data” folder.
For sure all this sounds pretty obvious, and I’m pretty sure I’ve seen people do this kind of stuff before, but I’m surprised I haven’t seen it in any video editing software (or maybe I missed it).
I think it could be a really helpful tool to get a quick overview of your picture and, in a snap, be able to tell if it is too saturated, too red, too blue, too bright or too dark…
Feel free to leave any comments about it, and if you know of something similar, just drop a line in the comments. By the way, this is my very first complete project with Processing, so I probably did things the wrong way; you are welcome to correct me 🙂
YouUp is a fun application by Ubisoft. YouUp transforms your head live and adds fun to your video conversations! Create your own custom videos with movie style. Add special FX, change your background. It uses a face detection and tracking system to create 3D masks that follow your every move.
I’ve been working on this project for the past couple of years, first as a CG Artist and then as a Technical Artist Director, working on the pipeline, realtime matchmove, augmented reality and more. The technology used was entirely developed by our team at Ubisoft.
Feel free to give it a try: it’s free, and it works with your Windows Live account. It will be updated pretty often with new assets; some are free, others you can buy.