
Saturday 30 April 2022

SOFTWARE USED FOR VISUAL EFFECTS

Visual effects (VFX) integrate live-action footage with artificially created but realistic objects, characters and environments. Professionals use specialised software to create almost magical imagery and add a sense of realism. VFX is needed mostly for situations that are dangerous, impractical or impossible to shoot.

Software used for visual effects

Autodesk 3ds Max
3ds Max is mainly used in the video game industry and in architectural modelling. It is used to create 3D character models, game assets, visual effects and animations, as well as building models. Architects use the program to create 3D models of interior and exterior architecture to better understand a building or object.

Maya

Maya is used by 3D animators across the globe and is built for feature film production. It is widely regarded as the stronger animation tool.

Maya vs 3ds Max

3ds Max is better for modelling, texturing, meshing models and mobile game development, while Maya is better for video games and animation. Maya was first released in February 1998; 3ds Max was first released around 1996.

Language availability

Maya is available in English, Japanese and Chinese.
3ds Max is available in English, German, French, Brazilian Portuguese, Japanese, Chinese and Korean.

Maya is available for Windows, Linux and macOS.
3ds Max is available for Windows only.

ZBrush

Another tool for creating high-resolution models is ZBrush, a digital sculpting application. ZBrush is used for sculpting, texturing and rendering alone, while Maya is used for animation, VFX modelling, rendering and lighting. Autodesk Maya is a complete package that provides modelling, simulation, rendering, visual effects, motion graphics and animation.

Mocha
Mocha Pro is among the best software for planar tracking, rotoscoping, object removal, stabilisation, mesh tracking and camera solving. VFX artists prefer Mocha for its ease of use and its outstanding tools for rotoscopy, mesh tracking and camera tracking using track mattes. Mocha is available as a standalone application and as a plug-in for popular hosts such as Avid Media Composer and Adobe Premiere Pro. After analysing a shot, Mocha Pro can export the shot's tracking data, roto shapes, lens calibration and 3D data to post-production software in a wide variety of formats.

The PowerMesh tool in Mocha enables a powerful sub-planar tracking process for visual effects and rotoscoping. Mocha Pro can now track warped surfaces and organic objects, which makes it widely used in digital make-up shots.

PowerMesh is simple to use and faster than most optical-flow-based techniques. Use PowerMesh to drive roto shapes with fewer keyframes. You can export mesh vertices to a null layer in After Effects, to Nuke tracking data, or to the Alembic format for Flame, Cinema 4D and other packages.

Roto with Less Keyframes

Mocha's masking workflow reduces manual keyframing. A small number of keyframes is enough to rotoscope, because Mocha automatically tracks the rotoscoped subject.

X-Splines and Bézier splines are both available for rotoscoping in Mocha, with magnetic edge-snapping assistance that makes the work easier. Using the Area Brush tool, we can brush over a particular region to mask it, which helps create detailed mask shapes.
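Conceptually, the fewer-keyframes workflow comes down to interpolating shape control points between hand-placed keys while the planar track carries the overall motion. Below is a minimal sketch of plain linear keyframe interpolation; the function name, coordinates and frame numbers are hypothetical, not Mocha's API:

```python
def interpolate_shape(key_a, key_b, frame_a, frame_b, frame):
    """Linearly interpolate spline control points between two roto keyframes.

    key_a, key_b: lists of (x, y) control points at frames frame_a and frame_b.
    """
    t = (frame - frame_a) / (frame_b - frame_a)
    return [(ax + t * (bx - ax), ay + t * (by - ay))
            for (ax, ay), (bx, by) in zip(key_a, key_b)]

# Two hand-placed keyframes at frames 10 and 20; frames in between are free.
shape_f10 = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]
shape_f20 = [(5.0, 0.0), (15.0, 0.0), (15.0, 10.0)]
print(interpolate_shape(shape_f10, shape_f20, 10, 20, 15))
# → [(2.5, 0.0), (12.5, 0.0), (12.5, 10.0)]
```

In practice the planar or mesh track supplies most of the motion, so the artist only keys the shape where the subject genuinely changes.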

PowerMesh in Mocha is used to mask organic moving objects such as musculature, skin and fabrics.

Stabilize Camera or Object Motion
Mocha can stabilise even the toughest, most shaky shots. We can even export the stabilised tracking data or render a stabilised clip.

Many post technicians use the Warp Stabilizer tool inside After Effects and Premiere for simple stabilisations. With PowerMesh tracking enabled, Mocha's Stabilize module can produce an inverse-warped, flattened surface for paint fixes, and the original motion is easily propagated back onto the original. Other compositing tools have rotoscopy features too, but in VFX studios roto artists still work with Mocha.
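The core idea of point-based stabilisation can be sketched in a few lines: track one feature through the clip, then counter-translate every frame so the feature stays where it was on a reference frame. This is a conceptual illustration with made-up numbers, not Mocha's actual module:

```python
def stabilize(track, ref_frame=0):
    """Given a per-frame tracked feature position, return the translation
    to apply to each frame so the feature stays put relative to ref_frame."""
    rx, ry = track[ref_frame]
    return [(rx - x, ry - y) for (x, y) in track]

# Hypothetical track of one feature drifting right and down over 3 frames.
track = [(100, 50), (102, 51), (105, 53)]
print(stabilize(track))  # → [(0, 0), (-2, -1), (-5, -3)]
```

Real stabilisers solve for rotation, scale and perspective as well, but the subtract-the-motion principle is the same.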

Nuke

Nuke is the compositing application used by VFX giants such as Digital Domain, Walt Disney Animation Studios, Blizzard Entertainment, DreamWorks Animation, Illumination Mac Guff, Sony Pictures Imageworks, Sony Pictures Animation, Framestore, Weta Digital, Double Negative and Industrial Light & Magic.

Nuke won an Academy Award for Technical Achievement in 2001.
It is currently one of the most popular compositing applications, a kind of "Photoshop for moving images".

Nodal toolset

Nuke has more than 200 creative nodes and delivers everything you need to tackle the diverse challenges of digital compositing, including industry-standard keyers, rotoscoping, vector paint tools, colour correction and much more.
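Node-based compositing like Nuke's can be pictured as a pull-model graph: evaluating the output node requests results from its upstream nodes, which request theirs in turn. A toy sketch of that model, where single brightness values stand in for images; the class and node names are illustrative, not Nuke's API:

```python
# Each node holds an operation and its upstream inputs; evaluating the
# final node pulls values through the graph, as in Nuke's pull model.
class Node:
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs

    def evaluate(self):
        return self.op(*(n.evaluate() for n in self.inputs))

# Toy "pixels": one brightness value per node instead of a full image.
read  = Node(lambda: 0.5)                            # Read: constant source
grade = Node(lambda v: v * 2.0, read)                # Grade: apply gain
merge = Node(lambda a, b: (a + b) / 2, grade, read)  # Merge: average inputs
print(merge.evaluate())  # → 0.75
```

Because every node only transforms its inputs, any change upstream automatically propagates to the final result, which is what makes node graphs so flexible for iteration.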

3D Camera Tracker

The integrated Camera Tracker in NukeX® and Nuke Studio replicates the motion of the camera used for the shot, so you can composite 2D and 3D elements accurately with reference to the original camera. Refinement options, advanced 3D feature preview and lens-distortion handling improve efficiency and accuracy on the trickiest tracking tasks. With these tools we can get the most out of the 3D workflow.
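The payoff of a camera solve is that any 3D point can be projected through the recovered camera on every frame, so CG elements stick to the plate. A simplified pinhole-projection sketch, assuming a camera with no rotation looking down +Z (positions and focal length are hypothetical):

```python
def project(point3d, cam_pos, focal):
    """Pinhole projection of a 3D point through a solved camera.
    Simplifying assumption: the camera looks down +Z with no rotation."""
    x, y, z = (p - c for p, c in zip(point3d, cam_pos))
    return (focal * x / z, focal * y / z)

# A CG element placed at (1, 2, 10), seen by the solved camera on two frames:
print(project((1, 2, 10), (0.0, 0, 0), 50))  # → (5.0, 10.0)
print(project((1, 2, 10), (0.5, 0, 0), 50))  # → (2.5, 10.0)  camera moved right
```

A real solver also recovers rotation and lens distortion per frame, but the projection step at its core looks like this.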


Nuke provides a wide range of keyers such as Primatte, Ultimatte and IBK, as well as Foundry's Keylight®.

Nuke’s flexible and robust tool-set empowers teams to create pixel-perfect content every time.

Advanced compositing tools
Nuke's Deep compositing tools provide strong support for reducing the need to re-render CGI elements when content is modified.

Nuke supports Python scripting
With very little programming knowledge you can build widgets and scripts using Python scripting in Nuke.

Tracking is easy

Nuke's 2D tracker and 3D camera tracker have a variety of options to make tracking easier. It is also possible to import tracking data from other software such as Mocha.

Nuke offers support for leading industry standards, including OpenEXR, and for rising technologies such as Hydra and USD. With support for OpenColorIO and ACES, colour management is easy and ensures consistent colour from capture through to delivery.


Vikram Kulkarni — Senior Compositor, Double Negative

Nuke has made possible things we couldn't have imagined doing in compositing. There is not a single project where we don't need to use its 3D pipeline for ease.I cannot thank Foundry enough for making comping so exciting!

Kindly search "dcstechie yash" on Google to find this blog. Watch this video to learn more.



like this page for more updates

Friday 29 April 2022

MAGIC OF MATCH MOVING VISUAL EFFECTS

In visual effects, match moving is a technique that allows computer-generated objects to be inserted into a shot with the correct matching of movement and camera perspective. It is sometimes referred to as motion tracking or camera solving. It sounds simple, but it is one of the technologies that revolutionised post-production.

Hulk, Captain Marvel, Spider-Man, Black Panther, Thor: Ragnarok and their legendary likes would not be possible without the art of match moving. Today we can insert 3D elements into a 2D video clip after scanning the shot with match-moving software.

Match-move artists track a scene so that they can place objects, often CG elements, that match the changing perspective of the shot.

To place 3D objects, we need a virtual camera that moves exactly like the camera in the shot. A good match move should be invisible to the eye; no one should be able to tell that a visual effect was performed. It is also worth noting that match moving is different from motion capture.

In Martin Scorsese's "Hugo", these techniques were mastered: tracking, recording and reproducing the motion of live footage with a virtual camera.


Matching computer-generated objects to a shot is not simple work.

The name "match moving" derives from matching the motion of live footage with a virtual camera.

Automatic Tracking

Automatic tracking uses algorithms in 3D tracking software to identify and follow features throughout the footage. The software also estimates the camera's focal length, motion and lens distortion. Automatic tracking may struggle with shots captured by a shaky handheld camera or with repetitive objects in the environment.
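At its heart, automatic feature tracking searches each new frame for the patch of pixels that best matches a template taken from the previous frame. A toy one-dimensional version using sum of squared differences (real trackers work in 2D with sub-pixel refinement; the values here are made up):

```python
def best_match(template, row):
    """Slide a 1-D pixel template across an image row and return the offset
    with the smallest sum of squared differences (SSD)."""
    best, best_err = 0, float('inf')
    for off in range(len(row) - len(template) + 1):
        err = sum((row[off + i] - t) ** 2 for i, t in enumerate(template))
        if err < best_err:
            best, best_err = off, err
    return best

row = [0, 0, 9, 8, 7, 0, 0, 0]          # one scanline of "pixels"
print(best_match([9, 8, 7], row))        # → 2, the feature is found at x = 2
```

This also hints at why repetitive patterns break automatic tracking: if the same template appears at several offsets, the minimum-error match becomes ambiguous.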

 
This automated method may also find it difficult to detect fine details in motion-blurred or out-of-focus footage.

In that case we need to use a tracking matte, which is similar in concept to a garbage matte.

Two-dimensional tracking only follows features in two-dimensional space, and we can use it when there are no major changes in camera perspective, for example when replacing a billboard.

Three-dimensional match-moving tools make it possible to derive the camera movement and transfer it to computer-graphics software to animate virtual cameras.
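As a toy illustration of such a 2D tracked insert, the sketch below pastes new "billboard art" into a frame at a position read from a track. Tiny integer grids stand in for images, and all names and values are hypothetical:

```python
def paste(frame, overlay, pos):
    """Paste a small overlay (the new billboard art) into a frame at the
    tracked position for that frame, returning a new frame."""
    out = [row[:] for row in frame]          # copy so the plate is untouched
    ox, oy = pos
    for y, row in enumerate(overlay):
        for x, value in enumerate(row):
            out[oy + y][ox + x] = value
    return out

frame = [[0] * 4 for _ in range(3)]          # blank 4x3 "plate"
track = [(0, 0), (1, 0), (2, 1)]             # billboard corner per frame
print(paste(frame, [[9, 9]], track[2]))
# → [[0, 0, 0, 0], [0, 0, 9, 9], [0, 0, 0, 0]]
```

Repeating this per frame with the tracked position makes the insert appear locked to the moving billboard; a production tool would also corner-pin the overlay to match perspective.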


 




DUAL ROLE VFX IN MOVIES


A dual role, also known as a double role, refers to one actor playing two characters in a single movie.
In most shots one person is the actor and the other is a stand-in or body double, and directorial techniques are used so that the audience never sees the stand-in's face.

Thanks to advances in technology, we now have many methods for achieving multiple-role sequences with a single actor.

Classic split-screen technique using stop-block shots
We film the scene with the actor performing on the left, film the same scene with the same actor on the right, and then splice the two physical pieces of film together. The problem is that for each shot there must be an invisible centre line the actor cannot cross, because the two pieces of film are joined along it. The most important thing to remember with this technique is that the camera must be locked: it has to stay perfectly still, because any movement would destroy the split-screen effect. Locking the camera this way is also called a stop block.

If we film the actor on the left, the right half is hidden by a black matte; if we film the actor on the right, the left half is hidden by a black matte.
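The black-matte splice is easy to picture in code: each output row takes its left pixels from one take and its right pixels from the other. A toy sketch with tiny integer grids standing in for film frames (all values hypothetical):

```python
def split_screen(left_take, right_take, split_x):
    """Combine two locked-off takes: pixels left of split_x come from the
    first take, the rest from the second, the classic centre-line matte."""
    return [l[:split_x] + r[split_x:] for l, r in zip(left_take, right_take)]

take_a = [[1, 1, 1, 1], [1, 1, 1, 1]]   # take with the actor on the left
take_b = [[2, 2, 2, 2], [2, 2, 2, 2]]   # take with the actor on the right
print(split_screen(take_a, take_b, 2))  # → [[1, 1, 2, 2], [1, 1, 2, 2]]
```

This also makes it obvious why the camera must be locked: any movement between takes would make the two halves of the background visibly mismatch along the seam.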

The method is now easier: by adding and adjusting a crop effect in digital editing software, you can simply layer the two shots and put a mask over the clip in the top layer. Many post-production applications let you animate masks, but Mocha is the most convenient tool for rotoscoping.

CGI-enhanced split screen: the same principle as above, but with newer technology. Film the actor twice and combine the takes digitally. Because we can rotoscope digitally, the actor can even cross the centre line, but the camera still has to be locked. Simply layer the two shots and rotoscope the performer in the top layer. If you plan to move the camera, a motion-control rig is the only option. Motion control means camera motion that is very accurately computer-controlled using electric motors, so the same camera move can be repeated again and again. Rotoscoping layers takes more time, so this is usually done with a green screen.

With a motion-control rig you can shoot the scene with the actor multiple times as different characters and combine the takes digitally.

CGI import: we can design camera moves in CGI software such as Softimage/XSI or Maya and import them to the motion-control rig. But having the characters interact directly is still a little kludgy.

CGI replacement: we can now use digital technology to put the actor's face on the body double.

The good thing here is that the characters can then interact (fight, shake hands, whatever) because you are literally face-swapping the second person. That is how it was done in The Social Network with Armie Hammer, and the same face-replacement technique was used in "The Curious Case of Benjamin Button".

The same technique is used to replace the face of a stunt performer who doubles for an actor.

The PowerMesh tool in Mocha provides good support for face swapping. It includes 12 new blend modes, improved render quality, motion blur, and a new Grid Warp interface to bend and distort source elements.

You can see how far a face can be modified in the VFX work of the movie Fan.
The VFX was handled by Red Chillies, one of the legendary VFX studios in India.



UNREAL ENGINE IS FREE FOR FILM: THIS GAME ENGINE CAN REVOLUTIONISE VFX

 

Unreal Engine is one of the world's best game engines, an advanced real-time 3D tool with realistic graphics. Unreal Engine (UE) was developed by Epic Games and first showcased in the 1998 first-person shooter Unreal. It receives many creative updates, but it may still take time to win over the many VFX technicians who stick with traditional software and workflows.

With Unreal Engine we can create high-quality games for iOS, PC, PlayStation, Microsoft Xbox, Nintendo Switch and Android. AAA game companies in particular prefer Unreal Engine because of its high-quality graphics, realistic textures, lights, shadows and effects; Unreal makes development easier and more efficient. In the video-game industry, AAA (triple-A) is an informal classification for high-budget games or games from major publishers.


Electronic Arts is one example.
Unreal Engine is a state-of-the-art engine and editor with photorealistic rendering, dynamic physics and effects, lifelike animation, robust data translation and much more, on an open, extensible platform that won't tie you down.

This game engine is now used in film making as well. Any movie, low budget or high budget, can benefit from Unreal Engine. It gives you the tools you need to quickly create, edit and manage a real-time environment.

You can create realistic fire, smoke, dust and water with completely customisable particle systems in the built-in Niagara visual effects editor. Chaos is Unreal Engine's high-performance physics system; using its Destruction feature, you can fracture, shatter and demolish massive-scale scenes at extremely high cinematic quality.
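Under the hood, any particle system (Niagara included) repeatedly integrates each particle's velocity and the forces acting on it. A deliberately minimal sketch of that update loop, not Niagara itself, with hypothetical values:

```python
GRAVITY = -9.8  # simple constant downward force (m/s^2)

def step(particles, dt):
    """Advance each particle (x, y, vx, vy) by one time step: move by
    velocity, then accelerate the vertical velocity by gravity."""
    return [(x + vx * dt, y + vy * dt, vx, vy + GRAVITY * dt)
            for (x, y, vx, vy) in particles]

p = [(0.0, 0.0, 1.0, 5.0)]      # one spark at the origin, launched up-right
for _ in range(3):              # simulate three 0.1 s steps
    p = step(p, 0.1)
print([round(v, 2) for v in p[0]])  # → [0.3, 1.21, 1.0, 2.06]
```

A production system layers spawning, lifetime, drag, collisions and rendering on top, but this integrate-forces loop is the core that runs every frame.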

Virtual Camera system
Cinematographers can drive cameras in Unreal Engine using an iPad Pro in virtual production.

Disney and Lucasfilm have turned to Unreal as a tool for filmmaking.

The first season of "The Mandalorian" introduced an entirely new filmmaking technique known as StageCraft, which uses an "in-camera VFX" methodology.

In-camera VFX relies on a mixture of LED lighting, live camera tracking and real-time rendering; the LED screen matches the movement of the camera. Its primary goal is to remove the need for a green screen.

The legendary VFX company Industrial Light & Magic built a suite of tools for creating digital sets with help from Epic Games and NVIDIA.

Many countries are not yet aware of the potential of Unreal Engine; in India, for example, there is currently (2022) no in-camera VFX studio.

Real-time rendering enables immediate feedback that helps teams to make decisions in the moment.

Megascans is the world's largest library of real-world 3D environment assets for AAA games and cinema.

Megascans' realistic 3D environment assets are free to use with Unreal Engine. Disney's The Lion King (2019), Black Panther (2018) and The Jungle Book (2016) used Megascans to create environments. Unreal Engine also has Sequencer, a fully nonlinear, real-time cinematic editing and animation tool.

We can even record animations using motion capture linked to characters in our scene.

Unreal Engine supports the most widely used formats and protocols in film and television production, such as FBX, Alembic, USD, C4D, OpenEXR and OpenColorIO, along with Python and Blueprint visual scripting, a robust API and complete C++ source-code access. You can create custom hooks and interfaces.

Using MetaHuman Creator, a free cloud-based app, we can create photorealistic digital humans, complete with hair and clothing, in minutes. MetaHumans come as ready-made 3D human models, fully rigged and ready to animate in our projects.

Unreal Engine users find the coding side a little difficult, but they find Unreal much easier for non-coding activities.

Yes, it is worth learning Unreal Engine; many VFX studios around the world are currently using it.

To work efficiently with Unreal we need an NVIDIA or AMD graphics card with at least 4 to 6 GB of VRAM.

To understand Unreal Engine better, kindly watch the following videos.


Unreal Engine is free for film: this game engine can revolutionise visual effects and film making.

CGI - Unreal engine's incredible creations -Part 1 (Compilations only, filmmaker's reference).
 


 CGI - Unreal engine's incredible creations -Part 2