Visual Effects Artist | Lighter & 3D Modeler
EMMA
SCHABERG
12 March 2023
Class 19 - Technical Direction in Compositing
8 March 2023
Class 18 - Technical Direction in Compositing
Since last class, I was struggling to understand why the cleanplate and foreground were not lining up in comp like they did in the playblast. To troubleshoot this problem, I went step by step asking myself what I was doing and if it made sense.
The frames for the background were read into Nuke starting at 117 because that is where I wanted the animation to start in my video. Thus, the render layers were written out at 117-571 and brought into Nuke. When I merged the cleanplate and beauty, the beauty immediately stopped matching the camera move.
First: What is the playback speed? I checked the settings and switched it from 24 to 30 fps (the rate used in Maya and when shooting the cleanplate), but the problem still persisted, looking the same as before...
Next, I compared the playblast and nuke comp to see where each frame of the cleanplate and beauty layer were beginning and ending. It appeared that my cleanplate was playing too slowly, so I used a retime node to shift the cleanplate to play at the same speed it was in maya. This fixed the issue of slipping for now.
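To make the frame bookkeeping concrete, here is a minimal sketch in plain Python (not an actual Nuke Retime node) of how a speed change maps output frames back to source frames. The 30/24 speed factor is an assumption based on the frame rates mentioned above, just for illustration.

```python
def retime_frame(output_frame, first_frame, speed):
    """Source frame a Retime-style node samples for a given output frame.
    speed > 1 plays the source faster; speed < 1 slows it down."""
    return first_frame + (output_frame - first_frame) * speed

# Hypothetical numbers: plates start at frame 117. If the cleanplate was
# effectively playing at 24/30 of the intended speed, a speed factor of
# 30/24 = 1.25 brings it back in step with the 30 fps renders.
print(retime_frame(117, 117, 1.25))  # -> 117.0
print(retime_frame(127, 117, 1.25))  # -> 129.5
```

Ten output frames past the start pull twelve and a half source frames, which is exactly the kind of drift that shows up as slipping between plates.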
Now that our plates are lining up again, let's add back the shadow and occlusion passes!
Roto Background
Remembering one of the suggestions for my project, I began to roto the windows out of the cleanplate to replace with a cleaner looking image or video.
Although the roto was still in its very early stages, I slap comped my beauty layers on top to visualize if this was the right direction.
Nuke Script
My Nuke node graph is laid out in 3 parts so far: the beauty, cleanplate, and shadow. I still need to render AOVs to give myself more control over the beauty for easier integration.
Shadow
Because my large windows prevented me from ever shooting a shadow plate, I tried to cheat the shadow by using a keyer to isolate only the brightest cast sunlight on the floor and grading it down. This is not a foolproof plan, but it might help with the intense shadow seen when Woody falls on the already shadowed area of the ground.
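The keyer cheat boils down to a soft luminance threshold. Here is a rough sketch in plain Python of that idea (not Nuke's actual Keyer implementation; the Rec. 709 weights and slider values are assumptions for illustration):

```python
def luma(r, g, b):
    # Rec. 709 luminance weights
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def key_highlights(pixel, low, high):
    """Soft key: 0 below `low`, 1 above `high`, linear ramp in between,
    roughly what a keyer's range sliders do."""
    y = luma(*pixel)
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return (y - low) / (high - low)

# Bright sunlit floor pixels key in fully; shadowed pixels drop out.
print(key_highlights((0.9, 0.9, 0.9), low=0.5, high=0.8))  # -> 1.0
print(key_highlights((0.2, 0.2, 0.2), low=0.5, high=0.8))  # -> 0.0
```

The resulting matte can then be graded down to darken only the sunlit patch where the fake shadow needs to land.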
6 March 2023
Class 17 - Technical Direction in Compositing
Over the weekend, I rendered the beauty, shadow, and occlusion at every frame. I also began a grey-ball match, which will need to be included in the final submission a week from now.
Priorities for next class:
- Apply Woody's Textures
- Clean up Buzz's face texture
- Render Grey Ball Pass
- Compositing cont. and color grade
The test render above felt disorienting because of the shifting between the foreground and background. I resolved this with a TimeOffset node under my cleanplate to account for the difference in frame numbers.
2 March 2023
Class 16 - Technical Direction in Compositing
Since my last blog post, I ran into some problems that delayed my progress for this class more than I would have anticipated. A strong lesson learned is to constantly save versions of your files as you work so that you can always go back to the most recent working step if you need to. When I opened my file and tried importing new FBX rigs, I broke my file and Woody's animation: the two rigged characters started sharing constraints and playing the same animation. I tried to undo this, but I had already saved too late and couldn't go back. This forced me to redo many steps, but it was a lesson well learned. SAVE. YOUR. WORK!
Adding a new character into my scene!
Buzz!
After talking with Professor Gaynor last class, I was directed to add another character into the scene and immediately considered Buzz. I found this model on CG Trader as an FBX and used animations from Mixamo.
Below I included a quick playblast of my scene to show the character's interactions. I tried scaling Woody and Buzz larger in the scene as well to fill more of the composition in comparison to last class.
Shadow
A problem I faced while working on my shadow was how to get Woody to appear half in shadow, since his body interacts with the strong light on the floor. I don't think I have the shadow in the right place yet, but it took a while to place it and get a strong enough edge. I kept going back and forth trying to figure out why using a plane in front of the key light to block the light with a strong edge wasn't working. I figured out I had to push my light far, far back in my scene to create a sharper edge!
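The "push the light back" fix matches the similar-triangles geometry of shadow softness: penumbra width grows with light size and shrinks as the light moves away from the blocker. A quick sketch of that relationship (the numbers are made up for illustration):

```python
def penumbra_width(light_size, light_to_blocker, blocker_to_surface):
    """Approximate penumbra width from similar triangles: a larger or
    closer light source produces a softer shadow edge."""
    return light_size * blocker_to_surface / light_to_blocker

# The same light pushed 10x farther back gives a 10x sharper edge.
near = penumbra_width(1.0, light_to_blocker=10.0, blocker_to_surface=5.0)   # 0.5
far  = penumbra_width(1.0, light_to_blocker=100.0, blocker_to_surface=5.0)  # 0.05
print(near, far)
```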
I will render this for my beauty pass, but this won't be necessary when messing with the shadow pass since it should hopefully blend in with the shadow there seamlessly. This is something I need to work on fixing.
After this beauty render pass, I do believe I need to make the falloff between the shadow and the highlight from the window even sharper.
25 February 2023
Class 15 - Technical Direction in Compositing
Over the weekend, I primarily spent time on my camera, making multiple variations until one looked good enough that no object slipped in the scene. I also used the Trax Editor in Maya to blend 4 Mixamo animations together for the first version of Woody's animation throughout the clip. I left the clip relatively long for the time being, but may cut it shorter toward the end of this project if time permits. Last but not least, I made an HDR of the environment using the multiple-exposure photos I took of a chrome ball and projected it onto a sphere in my Maya scene. I used this to implement a quick pass of shadows for a test render.
Woody's Animations
Here are the three final animations I am blending together to create the story of Woody dancing by himself until he realizes he's not alone and drops like an inanimate toy.
To learn how to blend animations together, I followed this YouTube tutorial, https://www.youtube.com/watch?v=b5vsjV66nAs, which was very helpful and easy to follow for someone who doesn't animate.
First, I opened a new Maya scene and imported one of the animations from Mixamo. Then, I selected all the rig controls and went to Windows, Animation Editors, Trax Editor. From there, I could create an Animation clip. I already had exported a few for the demo below to show how they are imported together and blended.
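At its core, the Trax blend between two clips is a crossfade over an overlap range. Here is a minimal sketch of that idea in plain Python (Maya's actual blend curves can be eased; the linear weight here is a simplification):

```python
def blend_pose(value_a, value_b, frame, blend_start, blend_end):
    """Linear crossfade between two animation clips over an overlap range:
    clip A fully before the blend, clip B fully after it."""
    if frame <= blend_start:
        return value_a
    if frame >= blend_end:
        return value_b
    w = (frame - blend_start) / (blend_end - blend_start)
    return value_a * (1.0 - w) + value_b * w

# Halfway through a 10-frame overlap, the pose is an even mix of both clips.
print(blend_pose(0.0, 10.0, frame=5, blend_start=0, blend_end=10))  # -> 5.0
```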
Camera Tracking
Nuke Camera Track Settings
After multiple tracking attempts, I found my best track came from setting the focal length to "approximate constant" at 300 (the lens we used) and not assigning a film back preset. I set my initial number of trackers to 400, which left me a large amount to keep once the rejected ones were removed. I set my Min Length to 12, which deleted a number of short tracks without being high enough to delete the good ones. Then I set my Max Track Error to 3, but had to lower it on a second pass to get a solve error closer to 1. After going back and forth, I got a solve error of 1.08, which is close enough to 1 for me to feel comfortable making a ground-plane card, viewing the track, and exporting geo.
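The Min Length / Max Track Error pruning described above amounts to a simple filter over the track list. A toy sketch of that logic (the track data is made up; this is not Nuke's CameraTracker API):

```python
def filter_tracks(tracks, min_length, max_error):
    """Keep only tracks that are long enough and accurate enough,
    mimicking CameraTracker-style rejection."""
    return [t for t in tracks
            if t["length"] >= min_length and t["error"] <= max_error]

# Hypothetical tracks: length in frames, RMS error in pixels.
tracks = [
    {"length": 30, "error": 0.8},   # kept
    {"length": 8,  "error": 0.5},   # rejected: shorter than Min Length 12
    {"length": 40, "error": 4.2},   # rejected: error above Max Track Error 3
]
kept = filter_tracks(tracks, min_length=12, max_error=3)
print(len(kept))  # -> 1
```

Raising Min Length or lowering Max Track Error trims more tracks, which is why the thresholds need a few passes of back-and-forth to keep enough good trackers for a clean solve.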
Here is a first version flipbook of my animation and camera track combined in Maya, but I had to go back in and tweak the animation before rendering. After fixing my animation, I set up my render layers and imported my HDR to create shadows.
HDR and Lighting
This is a preliminary lighting setup for now, which will be cleaned up in the next few days.
First Render Pass
After examining the pass below, I feel okay with the camera movement for now. The shadow will need more work in nuke and it's time to start reworking the textures of Woody as we develop the lighting of our scene. Hopefully, I can create a lookdev rig of the textures I can make!
21 February 2023
Class 14 - Technical Direction in Compositing
After looking through several options for what I could use for Woody's animation, I stuck with the dancing and falling animations I found previously. For the next step, I am going into the UVs of my character rig and moving each of the shells into its respective 1x1 square. I am grouping the shells based on whether they share the same material, and scaling each group's shells as large as I can to use the maximum amount of space. Thankfully, Woody's rig came with cleanly cut UVs that did not stretch dramatically on shells that would need patterned textures, like his shirt or pants.
Woody's UV Layout
Separated UV's
Here is a layout separating the UV shells into their respective groups in the positive space of the UV layout. Each 1x1 tile will have its own texture, whether it is for the jeans, the shirt, the skin, etc.
UV Stretch Test
After separating the UVs, I put a standard surface shader on all of Woody and changed the base color to a checker pattern. I notice the pants and the shirt, my biggest areas of concern, show no stretching.
Next, I selected all the skin components to compare their UV sizes. I noticed that the shells had different relative scales. To fix this, I went to the UV Toolkit and used Texel Density to make all the shells the same relative size.
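Texel density itself is just the map resolution scaled by the ratio of UV area to surface area, and matching shells is a matter of finding a uniform scale factor. A sketch of the math (the numbers are illustrative, not Maya's exact UI values):

```python
import math

def texel_density(uv_area, surface_area, map_resolution):
    """Texels per unit of surface length for a UV shell."""
    return map_resolution * math.sqrt(uv_area / surface_area)

def scale_to_match(current_density, target_density):
    """Uniform UV scale factor that brings a shell to the target density."""
    return target_density / current_density

# A shell covering 1/4 of UV space over 4 units^2 of surface, on a 1024px map:
d = texel_density(0.25, 4.0, 1024)   # 256 texels per unit
print(d, scale_to_match(d, 512))     # scaling the UVs by 2 doubles the density
```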
19 February 2023
Class 13 - Technical Direction in Compositing
Over the weekend, I spent some time shooting live plates for my next project! I wanted an indoor shoot with daylight, so I opted to shoot in my living room and kitchen on Saturday and Sunday. I considered both handheld and tripod movement for this project, but ultimately chose handheld because it adds to the story that the camera is a person walking in on a living toy.
Using handheld camera movements came with some problems, however, including shaky footage and harder tracks once I tested the footage in Nuke.
Project 3 Proposal
Footage (Camera Tracking)
Option 1:
I shot primarily in 2 locations in my apartment. More favorably, I did a handheld camera move peeking out from behind a wall and moving in toward the space where "Woody" would be moving.
Option 2:
In this next footage, I recorded a perspective of walking in on the toy from the kitchen. This would give an interesting reflection for cg.
Rigged Character of Woody
This is the rig of Woody that I bought from CG Trader! Some of the textures will need to be reworked over the next week or two, but after testing the rig in Mixamo with a random animation, everything about it worked smoothly!
Afterward, I attempted to find some Mixamo animations to use on the rig, one of which would be Woody falling once he realizes there is a person (the camera) watching him. Before he notices, I have considered whether he is angrily yelling at something, or dancing.
Slap-Comp
Calendar
Week 7
Class 13
- Have Live Action Plates and Rig, start camera tracking
- Finalize animations to use
Class 14
- Have clean track
- Start re-texturing Woody
Week 8
Class 15
- Have all shadow render layers, test rendering every 5 frames
- Continue cleaning up textures
Class 16
- Render every 2 frames with any changes or updates to textures
- Lightwrap, Edge Blur, Re-Lighting
Week 9
Classes 17 and 18
- Render passes
- Composite into Nuke
- Troubleshoot
- Breakdown
Week 10
- Submit!
UV Layout
After test tracking in Nuke, I brought the camera FBX into Maya. This was a very quick process, and I did not get a great track; the error under camera tracks was 1.99. But for a test, I just needed to write out the FBX camera and bring it in.
14 February 2023
Class 12 - Technical Direction in Compositing
As a final wrap for my buddha project, I have organized my nuke node graph and composited the rest of my assets, including the shadow and caustic. Here is my most recent rendered composite which includes the breakdown, grey sphere match, and nuke node graph!
Because my rock did not make a very noticeable caustic even with a flashlight held up to it, I made the caustic very subtle in my final render.
Although I ran out of time to integrate the sequence in this submitted version, I plan to include the banding plate in a future submission. I did a rendered test to show how it looks comped in, but the texture swims through the buddha which will need to be fixed in the next render.
Using the ContactSheet node in Nuke, I laid out all my subsurface AOVs and shading layers in a still to help deliver my breakdown. While I didn't have to alter any of my subsurface AOVs before merging, it did come in handy to shuffle out an AOV and grade down its intensity, since some come out very white and overexposed. In my final render, you can still notice how white the buddha appears compared to my rock reference.
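Shuffling out and grading a single AOV works because Arnold's light AOVs sum back to the beauty. A toy sketch of that shuffle-grade-merge idea with made-up single-pixel values:

```python
def rebuild_beauty(aovs, gains=None):
    """Additively recombine light AOVs into a beauty, optionally grading
    individual passes before the sum."""
    gains = gains or {}
    return sum(value * gains.get(name, 1.0) for name, value in aovs.items())

# Hypothetical per-pixel AOV values for one channel.
aovs = {"diffuse": 0.4, "specular": 0.1, "sss": 0.5}
plain  = rebuild_beauty(aovs)                 # 1.0 -- matches the beauty
graded = rebuild_beauty(aovs, {"sss": 0.6})   # the hot sss pass knocked down
print(plain, graded)
```

Because only the graded pass changes, the rest of the image stays identical to the original beauty, which is what makes this safer than grading the whole render.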
Over the past few weeks, I enjoyed working on this project and expanding my knowledge of texturing, AOVs, and compositing. This project has been a test of using my eye from real life and trying to replicate every detail of my rock in CG. There are many things I would change if given more time and would like to implement in the future, such as displacement, caustics, and shadow.
Project 3 Proposal (Cont.)
As stated last class, I proposed integrating a 3D animated character like Woody or Buzz into a live plate, where the character will notice the camera and drop to the ground. I proposed this idea because of my interest in working in the animation industry, and because it would let me use my lookdev skills to improve the textures of whatever rig I end up getting. If I don't use either of those characters, any toy rig would be suitable for delivering the story of this project. A video I found online perfectly captures my pitch and what I would like to do.
Credits to @markcannatarofilms on TikTok
12 February 2023
Class 11 - Technical Direction in Compositing
As I wrap up this project, I am rendering all the layers needed to composite. The trouble I experienced with the render farm last week was no longer an issue once we discovered it was due to the aiStandardVolume shader. When that shader was on our object, it broke all the other layers and wouldn't render itself. When I imported a new object just for the volume layer and assigned it only the standard volume shader, it rendered perfectly through the farm!
Once I rendered every layer, I comped them over the cleanplate to visualize the movement and ensure none of my textures were swimming through the object.
I used a dirty surface to mimic the grimier parts of the rock that were brown.
Here is my current node network that needs to be cleaned up for the final render and submission tomorrow!
For the next project, I am pitching a rig of Woody, Buzz, or Jessie from Toy Story. With this rig, I will integrate the toy into live action, where the toy does not notice the camera at first as it moves. When the toy sees the camera, it drops, as the toys do in the movie so that people don't know they are alive. Since I will have to get a cheap rig, I will use this as an opportunity to improve the textures while referencing real life.
7 February 2023
Class 10 - Technical Direction in Compositing
As I attempt to render my aiSurfaceShader layer, I have run into some obstacles that have prevented me from getting a frame to comp. When I render in Maya locally, the render will only be 5% done after 15 minutes. When I submit to the render farm, I would receive no frames back and the render time would be 8 seconds.
Another issue I kept facing was that my Lucy object would no longer appear in the render view with any shader other than the aiStandardSurface. After an hour of troubleshooting with no luck, I chose to switch the model to the Buddha. The Buddha had no problem rendering with any shader, so I think whatever source I got the Lucy model from was not reliable. I will use the Stanford Buddha model from this point on in the project.
Here is a quick comp of my subsurface and glass using a mask in nuke with my new object.
I made a new aiStandardSurface in Maya to use for the dark brown, dirty-surfaced parts of the rock. My plan is to use another mask to shuffle out certain areas of my Buddha, since only small parts have the dirty brown areas.
Here was the result compared to close-up pictures of my rock!
Next, I wanted to start integrating my object by rendering a shadow matte and compositing it in.
In preparation to render out multiple frames, I animated the buddha to slightly lift from the surface! I wanted the motion to be slight and slow so that the focus is on the texture.
8 February 2023
Regarding the issue I was having yesterday where the aiStandardVolume would not render through the farm, I discovered it was because my file (specifically my shader) broke before I submitted it. That explains why my renders would not return and only lasted a few seconds (because there was nothing to render!). I discovered that, to avoid breaking the connection, I should duplicate the object and material into the new volume render layer. After connecting the Buddha to the same surface shader from yesterday, it worked both in my viewport and through the farm!
Oddly enough, as soon as I hid the original Lucy model from the viewport, the render of the Buddha alone was 10x faster than the render of Lucy alone (with the same shader!).
5 February 2023
Class 9 - Technical Direction in Compositing
Moving forward with the rock shading process, the next render layer I am going to add is a volume layer called "lucy_aimix". In this layer, I will assign Lucy an aiStandardVolume.
At first, I saw all black in the render view. This is easily fixed by going into the shape node, scrolling to the Arnold tab, opening the "Volume Attributes" tab, and changing the volume padding to 1 and the step size to .01.
I noticed immediately that the render time became increasingly heavy with this new shader, so my screencaps from this point on will be very grainy as we tread forward. The object's shader was now visible and appeared very "cloudy".
Next, I went into the aiStandardVolume shader and changed the transparency weight color, in this case, to red. If all goes correctly, the render will show the object turning red as well. Let's find out...
When I change the scatter color to green, the render now appears like this. We can use the transparency to map a marble or crater texture. This will help me achieve the marbling and irregularities you may notice in crystal rocks!
Since I was curious to see the visual difference between mapping marble or crater into the transparency weight, here is a side-by-side comparison of each.
Mapping the transparency with marble turned out to be a costly choice, since the render time increased exponentially... However, through the red I can see the small variation and detail it maps onto the object. The default marble map I used can be seen in the above-left image, and it's clear that the variation is little to none.
On the other hand, using the default crater map gives more variation between 3 channels where I can change the melt, balance, and frequency parameters based on my own rock reference to get the best match!
31 January 2023
Class 8 - Technical Direction in Compositing
Upon suggestion from my professor, I have decided to change my photoset for this project to one with better lighting. This is the photoset I will use, with the teal rock as my reference.
So, with these new photos, I went back into Maya and did another grey sphere match and cube match!
Once I felt my ball match was close enough for the time being, I moved on to adding the Stanford model "Lucy" to my scene.
I started with making a subsurface shader and going through a few iterations of adjustments until I felt the subsurface was close to the rock reference.
A problem I ran into when rendering the subsurface beauty layer is that turning off the image plane of the clean plate changed the subsurface's appearance. It looked flatter, and although the alpha was completely white, the composite in Nuke wouldn't match what I saw in the Arnold RenderView.
After rendering the beauty layer, I made a new render layer where the shader on Lucy had no subsurface but instead had transmission.
After rendering the beauty and glass, I made a new render layer called "fresnell". This shader was made from a surface shader whose output color was set to a ramp with interpolation set to "smooth".
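The ramp-driven surface shader behaves like a facing-ratio lookup: faces pointing at the camera get one end of the ramp, grazing edges get the other. A plain-Python sketch of that idea (the colors are placeholders, matching the red/green mask described below):

```python
def facing_ratio(normal, view):
    """Dot product of unit normal and unit view direction:
    1.0 facing the camera, 0.0 at a grazing edge."""
    d = sum(n * v for n, v in zip(normal, view))
    return max(0.0, min(1.0, d))

def ramp_color(facing, edge_color, center_color):
    """Linear ('smooth') interpolation between the edge and facing colors."""
    return tuple(e + (c - e) * facing for e, c in zip(edge_color, center_color))

# A face pointed straight at the camera gets the full center (green) color.
f = facing_ratio((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
print(ramp_color(f, edge_color=(1, 0, 0), center_color=(0, 1, 0)))  # -> (0.0, 1.0, 0.0)
```

Rendering this pass gives a red/green mask that can drive which areas get transmission versus subsurface in the comp.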
Taking these rendered images into Nuke, I composited them so that only the red area of the Lucy fresnell pass had transmission, while the green had subsurface!
I made the transmission red in the first pass to visually see it better to ensure my composite was correct, but then changed it to the regular transmission color I set.
As an alternate method, I used an aiMixShader on a new layer to try blending in Arnold rather than Nuke! With this method, the blend feels much smoother. I used a pink transmission to see the difference more easily.
29 January 2023
Class 7 - Technical Direction in Compositing
In the next project, I will be working with subsurface to create hybrid reflection, refraction, and translucency elements in 3D. This will include a more complex model sourced from Stanford University called "Lucy". Complex models like these are important because they have varying thicknesses, which subsurface uses to show how light passes through different parts of the model.
During class 6, I used the Stanford Buddha model to quickly apply subsurface through an aiStandardSurface shader. This let me learn what different subsurface appearances you can get when adjusting weight and radius.
Over the weekend, I met with two classmates to take our own photography using the Canon RS camera with a 200 mm lens. Because of unexpected weather, we were forced to shoot indoors. We rotated through various spots to take our photos and get the best lighting.
Because we needed some movement, we compromised by blowing wind at the items in our scene, creating a soft rustle in the flowers.
I chose to work with the last photoset.
First, I used the same Maya file from the last project as a template and started replacing the image plane and making new render layers. Then, after making my cube match, I started on my sphere match.
After adding the projected HDR I put together in Photoshop, I checked whether it was placed accurately in the scene using a chrome shader on my sphere match. It looks like the key light is accurate.
I rendered out my shadow pass and beauty pass and quickly comped them to see how close I was to my reference. The contact shadow did not feel accurately placed, and the diffuse direct looked very odd.
After a few more adjustments in Maya, I rendered a new pass and checked my contact shadow. My ball shadow still looked off: it was not receiving enough bounce from the ground plane.
After making a quick sphere match that will need more tweaking in the future, I imported my Stanford model of Lucy and added an aiStandardSurface shader. Then I started to adjust the subsurface like we did in class.
Double Shadow
While merging the shadow over the cleanplate, I noticed that the color of the shadow changed as it merged over (it appeared slightly transparent, or less dark than the shadow plate). Because of this, when the sphere animated over the shadowed portion of the cleanplate, the shadow comp did not blend seamlessly with the shadow in the cleanplate.
Troubleshoot:
I attempted using a ColorCorrect node below the shadow plate to more seamlessly blend the shadows with one another, but the shadows' intersection still feels visible.
24 January 2023
Class 6 - Technical Direction in Compositing
In the final stretch of this project, I am continuing to render my occlusion and projected shadow passes and cleaning up everything in nuke.
In my last blog post for class 5, I mentioned how the shadow was not blending correctly and was overlapping. First, I needed to find a technique that would prevent the shadow pass from becoming darker where it overlapped the shadow in the clean plate.
Notice how the shadow does not overlap the plate, but feels like it is mixing with the cleanplate shadow to form a darker shadow. If I darken the shadow pass to match the shadow in the image, it becomes even darker.
To solve this problem, I asked myself why the shadow would appear to mix with the cleanplate instead of overlap. Keep in mind, the merge node was set to "over". This must be an issue with the alpha, so I took a "keyer" node and set input to alpha and output to rgba.alpha. Next, I moved the sliders until I noticed that the shadow finally overlapped the clean plate without feeling translucent! First problem solved.
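The doubled-up shadow makes sense from the math of a premultiplied "over" merge: wherever the foreground alpha is less than 1, the cleanplate's own shadow leaks through and adds to the comped shadow. A one-channel sketch (the pixel values are illustrative):

```python
def over(fg, fg_alpha, bg):
    """Premultiplied 'over' merge for a single channel:
    result = fg + bg * (1 - fg_alpha)."""
    return fg + bg * (1.0 - fg_alpha)

# A dark shadow pixel comped over an already-shadowed cleanplate pixel.
shadow_val, plate_shadow = 0.1, 0.25
soft = over(shadow_val, fg_alpha=0.5, bg=plate_shadow)  # shadows mix and darken
hard = over(shadow_val, fg_alpha=1.0, bg=plate_shadow)  # fg fully replaces bg
print(soft, hard)
```

Pushing the alpha to solid (as the keyer sliders do) is what stops the background shadow from contributing in the overlap.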
Without Keyer
With Keyer
Next, I noticed that the shadow still felt like it was slightly moving as I scrubbed through the timeline. How could that be, unless the clean plate and shadow plate did not match pixel for pixel over each other?
Let's zoom into an area to see how separated the plates are. One reason the plates were shifted could be that the photographer was shooting outside on a tripod, and the tripod could have slightly shifted on the gravel / rough surface.
After a small transform, the plates were perfectly aligned to help create that seamless shadow!
Here is a rundown of adding the shadow projection to the grey ball match. There were no issues prevalent in this stage of compositing.
Using the reference image for the occlusion, I rendered an ambient occlusion pass of the shadow and comped it into Nuke. Unfortunately, the occlusion was too spread out to blend with the clean plate, because the shadow matte was smaller than the spread of the occlusion. Instead of going back into 3D to render another pass, I looked for a simpler way in Nuke.
Using a ColorCorrect node, I set the gain of the highlights, midtones, and shadows to red, green, and blue to visually separate them. Then, I went into the "ranges" tab and pushed those values to get a clearer separation!
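The "ranges" tab works on weights derived from luminance: every pixel contributes to shadows, midtones, and highlights in different proportions, and pushing the ranges reshapes those weights. A rough plain-Python sketch of that weighting (Nuke's actual curves are smoother; the breakpoints here are assumptions for illustration):

```python
def luma_ranges(y, shadow_end=0.33, highlight_start=0.66):
    """Return (shadows, midtones, highlights) weights for a luminance y,
    using simple linear falloffs between the breakpoints."""
    s = 1.0 - y / shadow_end if y <= shadow_end else 0.0
    h = (y - highlight_start) / (1.0 - highlight_start) if y >= highlight_start else 0.0
    m = max(0.0, 1.0 - s - h)
    return s, m, h

# Pure black is all 'shadows'; a mid value is all 'midtones'.
print(luma_ranges(0.0))  # -> (1.0, 0.0, 0.0)
print(luma_ranges(0.5))  # -> (0.0, 1.0, 0.0)
```

Tinting each weight red, green, and blue, as described above, is a quick way to visualize exactly which pixels each range grabs before pushing the breakpoints.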
Before
Separated Visually
After
Color Correcting the Gains
Grey Ball Update Render
Pokemon Ball Update Render
First Submission for Project
(to be resubmitted and Improved)
After a long day of work, what's better than taking a mental health break by grabbing some friends and going to Fancy Parkers for donuts!
Check out their blogs to see their own process for this Match-To-Live-Action Project!
Thalia Valencia-Murphy
https://thaliavalenciamurphy.wixsite.com/website/blog/categories/vsfx-420
Nick Neff
https://nicholasneff19.wixsite.com/2023/vsfx-420-blog
22 January 2023
Class 5 - Technical Direction in Compositing
For the next class, we are expected to be about 90% complete with our project. This includes adding the shadow projections onto the sphere and ensuring they are physically accurate, adding a new rolling object and rendering every pass needed to integrate it as well as the grey ball, and fine-tuning.
The object I chose was a Pokemon ball. However, the model from Turbosquid did not come with any textures, so I chose to do the shading myself.
At first, I was having trouble connecting texture maps into metalness and bump without seeing a change. After receiving some help from my classmate Nick Neff, I discovered I needed to check "alpha is luminance" under the Color Balance tab inside the file node.
I also noticed the texture was way too prominent on the ball, so I lowered it using the Alpha Gain.
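Both of those controls are simple per-pixel operations: "alpha is luminance" fills the alpha from the color's brightness, and Alpha Gain scales the result. A sketch of what they do (Rec. 709 weights assumed here; the file node's exact weighting may differ):

```python
def alpha_from_luminance(r, g, b):
    """'Alpha is luminance': derive the alpha channel from the color's
    luminance so a grayscale map can drive metalness or bump."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def apply_alpha_gain(value, gain):
    """Alpha Gain scales the map's output, e.g. to tone down bump strength."""
    return value * gain

a = alpha_from_luminance(0.8, 0.8, 0.8)   # grayscale pixel -> alpha ~0.8
print(apply_alpha_gain(a, 0.5))           # halve the map's contribution
```

Without the luminance-to-alpha step, the grayscale map feeds nothing usable into a single-channel input like metalness, which is why the maps appeared to have no effect.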
Here is my latest update on texturing the Pokemon ball! I wanted it to feel dull and scratched up rather than specular and in perfect shape.
While projecting the shadow matte of the fence back onto the ground plane and sphere, I was a little confused as to why the ball received the shadow much sooner than the ground did.
However, after closer inspection of the scene, I think it could be physically accurate: the top of the ball hits the shadow sooner than the ground behind it because it is closer and higher relative to the object casting the shadow.
An issue I ran into when bringing my projected shadow render pass into Nuke is that the shadow plate doesn't fill the entire image, and I had animated the ball out of the bounds of the shadow plate.
To fix this issue, the only easy resolution I could come up with was to adjust the keyframes of my animation so that the ball never falls out of bounds.
19 January 2023
Class 4 - Technical Direction in Compositing
Hello everyone! During class 3 of Technical Direction in Compositing, we covered AOVs, render layers, shadow mattes, rendering, and compositing back into Nuke. Many of the small mistakes from my last blog were cleared up, including the fact that I mistook the shadow_matte AOV for the aiShadowMatte shader.
While preparing to render my passes, I realized I needed to add a projection to the ground plane in the grey ball layer. However, I struggled with why the perspective looked stretched.
Update 20 January:
I needed to select renderCam as the camera to project the image through in the projection node, after selecting "perspective" as the projection type.
Using the shader to render my shadow pass worked correctly when I composited the shadow in nuke.