Lighting
Most notably, all three supervisors come from photographic backgrounds, and their knowledge of cameras and lighting is reflected in each of the nominated films. “The technology is now here to do incredibly accurate lighting,” says Fink, a previous Oscar nominee for Batman Returns and this year's Achievement in Visual Effects winner for The Golden Compass. “That's where the art is.”
Integrating effects within a film's photographic style is their key concern. Knoll, whose work on the Pirates franchise has earned him three Oscar nominations and last year's statuette, says he believes that lighting is crucial to making effects blend with the look created by a DP. “We pick up where the DP leaves off,” says Knoll, who began his ILM career as a cameraman. Although he's computer savvy (he co-authored the original Adobe Photoshop), Knoll sees advantages in having a live-action background. “Experiences on set, like holding a light meter and figuring out how to do an exposure split, teach you where the tradeoffs are. It helps you avoid mistakes like having interiors at the same brightness as exteriors,” he says. Supervisors with only CG backgrounds can be vulnerable to such errors, Knoll says. “For example, the standard cameras in animation systems rotate around their nodal centers. And that almost never happens in the real world.”
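Knoll's nodal-center point is easy to demonstrate. Here is a minimal numpy sketch (illustrative, not from the article): panning a CG camera about its own center leaves its position fixed, while a real camera rotating about an offset mount point picks up a small translation, and therefore parallax.

```python
import numpy as np

def rot_y(deg):
    """4x4 rotation about the Y (pan) axis."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[ c, 0.0, s, 0.0],
                     [ 0.0, 1.0, 0.0, 0.0],
                     [-s, 0.0, c, 0.0],
                     [ 0.0, 0.0, 0.0, 1.0]])

def translate(v):
    """4x4 translation matrix."""
    m = np.eye(4)
    m[:3, 3] = v
    return m

def pan_position(cam_pos, pivot, deg):
    """Position of the camera after panning `deg` degrees about `pivot`.

    pivot == cam_pos is the idealized CG camera rotating about its own
    nodal center: the position never moves. An offset pivot (a mount
    point behind the entrance pupil, say) makes the camera translate
    as it pans, which is the parallax real rigs exhibit.
    """
    m = translate(pivot) @ rot_y(deg) @ translate(-np.asarray(pivot, float))
    return (m @ np.append(cam_pos, 1.0))[:3]

cam = np.array([0.0, 1.5, 0.0])
print(pan_position(cam, cam, 30.0))                     # position unchanged
print(pan_position(cam, cam + [0.0, 0.0, -0.1], 30.0))  # pans and drifts
```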
Transformers provides a striking example of live-action approaches to illuminating visual effects. Farrar, an Oscar winner for Cocoon, approached Transformers by imagining how a DP would have photographed 20ft.-tall robots if they really could stride through a shot. “If they were really on set, they'd be lit separately when they're in closer view,” he says. “The DP would knock out the sunlight for the close-ups and have HMIs [Hydrargyrum medium-arc iodide lamps] and cutters to cause shadows. You wouldn't just be worried about rendering the background lighting. Of course you need to reflect the background, and we recorded the robots' environments with reflective spheres — that's a given. But we totally relit the robots to look great wherever they were.” To accomplish this, ILM created virtual versions of cutters, flags, bounce cards, and shiny boards. “The robots could move in and out of shadows, which generally isn't done much in computer-graphics lighting,” Farrar says.
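The reflective-sphere captures Farrar mentions are typically unwrapped into a lat-long environment map before they can drive reflections or lighting. A minimal numpy sketch of that unwrap, assuming an orthographic view of a chrome ball; this is illustrative geometry, not ILM's pipeline:

```python
import numpy as np

def mirror_ball_to_latlong(size):
    """For every pixel of a chrome-ball photo, the (u, v) it lands on
    in a lat-long environment map. Assumes an orthographic view of the
    ball looking down -Z, with +Y up."""
    ys, xs = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
    mask = xs**2 + ys**2 <= 1.0                        # pixels on the ball
    nz = np.sqrt(np.clip(1.0 - xs**2 - ys**2, 0.0, None))
    n = np.dstack([xs, -ys, nz])                       # normal (+Z faces camera)
    view = np.array([0.0, 0.0, -1.0])                  # camera-to-ball ray
    # mirror reflection: R = V - 2 (V . N) N
    r = view - 2.0 * (n @ view)[..., None] * n
    u = np.arctan2(r[..., 0], -r[..., 2]) / (2 * np.pi) + 0.5
    v = np.arccos(np.clip(r[..., 1], -1.0, 1.0)) / np.pi
    return u, v, mask

# scatter each valid ball pixel into the panorama at its (u, v)
u, v, mask = mirror_ball_to_latlong(512)
```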
Camerawork
This year's nominees also demonstrated that locked-off visual-effects shots are increasingly rare. “We try to conform stylistically to the rest of the movie,” Knoll says. “Pirates was never shot from static cameras, so we didn't want the visual effects to take on a different look. You don't want to feel like you've entered a part of the movie that was heavily storyboarded. … The state of the art has advanced to the point where you can kind of not worry about the camera. Tracking software has become more sophisticated, and crews have developed very good methodologies for solving complex camera motion. We don't think twice about Steadicam or handheld shots. That makes for better shots, because it's impressive when you see CG happening in the background of a Steadicam shot.”
That is not to say that current tracking methods have become standardized, however. For The Golden Compass, shots were shared by several facilities — including Framestore CFC, Rhythm & Hues, Cinesite, Digital Domain, Tippett Studio, and Rainmaker. “Everyone does it differently,” Fink says. “When a background would go to another facility, it would be re-tracked. There's very little sharing of motion data. Of course, these facilities were usually tracking different parts of the frame.”
In Transformers, where the camera moved rapidly through cityscapes, Farrar employed photogrammetry to create backgrounds racing by. “It's a photoreal movie, so we photographed as many real buildings as possible. When we used computer models, the textural information was actual photography. We'd hang a camera over a street and photograph it in all directions. There were shots where a camera was flying down the street, and the shot was assembled from still photos. We'd relight it and add interesting shadows and maybe even change the colors of buildings to make it more appealing,” Farrar says.
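The assembly Farrar describes amounts to projecting still photographs through a virtual camera onto building geometry. A minimal numpy sketch of that projection math (the conventions are illustrative, not ILM's tools):

```python
import numpy as np

def project_to_uv(points, K, R, t, width, height):
    """Texture coordinates for world-space vertices as seen by a still
    camera, i.e. the math behind projecting a photo onto CG buildings.

    points : (N, 3) vertex positions in world space
    K      : 3x3 intrinsics of the still camera
    R, t   : world-to-camera rotation (3x3) and translation (3,)
    """
    cam = points @ R.T + t                  # world -> camera space
    pix = cam @ K.T                         # camera -> homogeneous pixels
    pix = pix[:, :2] / pix[:, 2:3]          # perspective divide
    uv = pix / np.array([width, height], float)
    uv[:, 1] = 1.0 - uv[:, 1]               # flip V to texture convention
    return uv
```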
source: http://digitalcontentproducer.com/mil/features/video_integrated_team_effort/index.html
Friday, July 15, 2011
Monday, June 20, 2011
10,000 B.C.
"We did an extensive photoshoot of the miniature, and used Isis, a tool we wrote at MPC, for photogrammetry and geometry reconstructions from multiple pictures and a LIDAR scan for the miniature. With this, we were able to rebuild the entire environment. It looks good enough to pass in a few shots that were entirely CG."
Paul
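Photogrammetry tools like the Isis tool Paul describes recover geometry by triangulating features matched across several photos. A minimal sketch of two-view linear (DLT) triangulation; the cameras and point below are illustrative, not MPC's data:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 projection matrices of two calibrated photos
    x1, x2 : (u, v) pixel observations of the same feature
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]          # dehomogenize

# two axis-aligned cameras one unit apart, focal length 1
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X = np.array([0.2, 0.1, 4.0, 1.0])
x1 = (P1 @ X)[:2] / (P1 @ X)[2]
x2 = (P2 @ X)[:2] / (P2 @ X)[2]
print(triangulate(P1, P2, x1, x2))   # ~ [0.2, 0.1, 4.0]
```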
[Image: Original plate]
[Image: Final composite]
Double Negative used survey data from sets and camera positions captured with a Leica Total Station. Some locations and the interior of the RV were LIDAR scanned. “An HDRI lighting pass was also taken for every set-up,” adds Couzens. “On other occasions witness cameras were used to record more complex moving lighting situations. And the mysterious Winklefix (an angle measuring tool) made many appearances while shooting inside the RV.”
Almost all of the scenes inside the RV were shot on a greenscreen stage. “The size and number of windows and reflective surfaces in the RV and the unrepeatable nature of road footage required us to capture large arcs of the driving background in one go,” says Couzens. “For this, the grips and camera department designed and built a purpose-specific six-camera mount on a camera vehicle. It comprised three Libra heads, each mounting two cameras. Remarkably, the camera car was able to drive fast enough in forward or reverse gear, sparing any need to change the camera mounts between set-ups. This also enabled a near 180 degree overlapping field of view to be captured in one take.”
“These interlocking frames,” continues Couzens, “needed to be synced with a bloop-light, de-lensed, stabilized and stitched together in post before being projected onto a sphere or planar surface, re-lensed, graded and composited into the green-screen of the RV plates. Dust and dirt was added where needed. It was a considerable and necessary amount of work in 400 plus shots that hopefully the viewer never really notices!”
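A minimal numpy sketch of the stitching geometry Couzens describes: mapping each pixel of a lat-long strip back to one of the overlapping, already de-lensed source cameras. The rig yaws, field of view, and resolutions are illustrative guesses, not Double Negative's actual values:

```python
import numpy as np

def latlong_lookup(width, height, cam_yaws, fov_deg, cam_res):
    """For each pixel of a lat-long strip, which (de-lensed) camera sees
    it and where, so the overlapping frames can be stitched.

    cam_yaws : mounting yaw of each camera, degrees
    fov_deg  : horizontal field of view of one de-lensed camera
    cam_res  : (w, h) of one source frame
    Returns per-pixel (camera index, u, v); index is -1 where no
    camera covers that direction.
    """
    lon = (np.arange(width) / width - 0.5) * 2 * np.pi    # -pi..pi
    lat = (0.5 - np.arange(height) / height) * np.pi      # +pi/2..-pi/2
    lon, lat = np.meshgrid(lon, lat)

    half = np.radians(fov_deg) / 2
    f = 0.5 / np.tan(half)                 # focal length in image widths

    idx = np.full(lon.shape, -1)
    u = np.zeros(lon.shape)
    v = np.zeros(lon.shape)
    for i, yaw in enumerate(np.radians(np.asarray(cam_yaws, float))):
        rel = (lon - yaw + np.pi) % (2 * np.pi) - np.pi   # wrap to -pi..pi
        hit = (np.abs(rel) < half) & (idx < 0)            # first camera wins
        with np.errstate(all="ignore"):                   # tan blows up off-frustum
            u_i = 0.5 + f * np.tan(rel)
            v_i = 0.5 - f * np.tan(lat) / np.cos(rel)     # aspect simplified
        inside = hit & (v_i >= 0.0) & (v_i <= 1.0)
        idx[inside] = i
        u[inside], v[inside] = u_i[inside], v_i[inside]
    return idx, u * cam_res[0], v * cam_res[1]

# six cameras at 30 degree spacing, roughly a 180 degree arc
idx, u, v = latlong_lookup(2048, 512, [-75, -45, -15, 15, 45, 75], 40, (1920, 1080))
```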
G.I. Joe: The Rise of Cobra
[Image: G.I. Joe]
Tuesday, May 17, 2011
What is this?
Sunday, March 27, 2011
A Problem When Exporting from Maya to Nuke
maya2nuke is the script commonly used when exporting a scene from Maya to Nuke. I'm posting because I found something to watch out for while using it.
Maya's default rotate order is XYZ; Nuke's default rotate order is ZXY. If you check the camera in Nuke, you can see that the data comes in slightly differently. To fix it, change the rotation order of the Nuke camera to XYZ.
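The mismatch is easy to reproduce numerically. A minimal numpy sketch converting Maya XYZ Euler angles into Nuke's default ZXY order, assuming each order names the first-applied axis; the Nuke knob call in the final comment is the simpler fix, and the node name there is an assumption about your scene:

```python
import numpy as np

def rx(a): c, s = np.cos(a), np.sin(a); return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
def ry(a): c, s = np.cos(a), np.sin(a); return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
def rz(a): c, s = np.cos(a), np.sin(a); return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def maya_xyz_to_nuke_zxy(rot_deg):
    """Re-express Maya XYZ Euler angles in Nuke's default ZXY order.

    Assumes the order names the first-applied axis, so
    Maya XYZ -> R = Rz @ Ry @ Rx and Nuke ZXY -> R = Ry @ Rx @ Rz.
    The gimbal-lock case (cos x = 0) is ignored for brevity.
    """
    ax, ay, az = np.radians(rot_deg)
    R = rz(az) @ ry(ay) @ rx(ax)
    x = np.arcsin(-R[1, 2])
    y = np.arctan2(R[0, 2], R[2, 2])
    z = np.arctan2(R[1, 0], R[1, 1])
    return np.degrees([x, y, z])           # Nuke still displays them as x, y, z

print(maya_xyz_to_nuke_zxy([30.0, 45.0, 10.0]))

# The simpler fix inside Nuke: match the orders instead of converting.
# nuke.toNode('Camera1')['rot_order'].setValue('XYZ')
```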
[Image: Rotation order on Maya's camera transform node]
[Image: Rotation order on Nuke's Camera node]
Sunday, March 13, 2011
Lens Distortion Models
[Image: Lens distortion models]
- Barrel Distortion
The image is drawn inward toward the center; it appears when the focal length is short. (A simple radial model for all three is sketched after this list.)
- Pincushion Distortion
The image is pushed outward; it appears when the focal length is long.
- Moustache Distortion
The trickiest of the distortion models; it appears with anamorphic lenses.
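All three shapes fall out of the classic Brown radial polynomial. A minimal numpy sketch (the coefficients are illustrative):

```python
import numpy as np

def radial_distort(xy, k1, k2=0.0):
    """Brown radial model on normalized points (principal point at the
    origin): x_d = x * (1 + k1*r^2 + k2*r^4).

    k1 < 0 pulls points inward (barrel), k1 > 0 pushes them outward
    (pincushion); mixed-sign k1/k2 flips sign with radius (moustache).
    """
    r2 = np.sum(xy**2, axis=1, keepdims=True)
    return xy * (1.0 + k1 * r2 + k2 * r2**2)

grid = np.stack(np.meshgrid(np.linspace(-1, 1, 5),
                            np.linspace(-1, 1, 5)), -1).reshape(-1, 2)
barrel     = radial_distort(grid, k1=-0.15)
pincushion = radial_distort(grid, k1=+0.15)
moustache  = radial_distort(grid, k1=-0.20, k2=0.15)
```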
<How should the chart be shot?>
* All images below are from http://www.imatest.com/docs/lab.html.
The camera and the chart must be set up level and square to each other.
[Image: Spirit level]
[Image: Spirit level]
[Image: Camera spirit level]
If you don't have a spirit level, you can improvise one by hanging a heavy object from a string, as in the image below.
[Image: Plumb line made from a string]
[Image: Mirror for checking alignment]
Once the camera and chart are level, go ahead and shoot. The image below shows a good example and a bad one.
[Image: Good and bad examples]
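Once the chart frames are shot, the distortion coefficients are estimated from them. A minimal sketch using OpenCV's checkerboard workflow as a stand-in for whatever chart you shot; the file names and board size are illustrative:

```python
import glob
import cv2
import numpy as np

# 9x6 inner corners; the chart's world points all lie on the z=0 plane
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("chart_*.png"):        # illustrative file names
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# dist = (k1, k2, p1, p2, k3): radial plus tangential coefficients
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("distortion coefficients:", dist.ravel())
```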
Friday, March 4, 2011
Kepler's Three Laws
Law 1: The orbit of a planet is an ellipse.
Law 2: The line joining the Sun and a planet sweeps out equal areas in equal times (the areal velocity is constant).
Law 3: The ratio of the square of the orbital period to the cube of the orbit's semi-major axis is the same for every planet. (See the formulas below.)
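In formula form (standard results, not from the original post):

```latex
% Law 2: constant areal velocity; Law 3: period-radius relation
\frac{dA}{dt} = \tfrac{1}{2}\, r^2 \dot\theta = \text{const.},
\qquad
\frac{T^2}{a^3} = \frac{4\pi^2}{G M_\odot} \quad \text{(the same for every planet)}
```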
Wednesday, March 2, 2011
Sunday, February 27, 2011
Camera Tracking in Transformers: Revenge of the Fallen
[Image: From http://www.lasjedi.com/disciplines_3dvfx.html]
This image, from the Jedi Masters Program site, shows a shot from Transformers 2.
The points are divided into green and blue; according to Kibum Kim, this appears to be simple grouping.
As for layout, markers for photomodeling (not for tracking) are usually set up before a cut is shot, photographed, and then removed. Markers for tracking are set up as well, but since they have to be painted out in post, their number is kept to a minimum.
I'd like to see the camera scene data sometime.
Saturday, February 19, 2011
Einstein's Principle of Relativity
Albert Einstein took 'Galileo's principle of relativity' and developed it further; the theory of relativity was built on this foundation.
"In a place moving in uniform rectilinear motion, 'all physical laws' are indistinguishable from those in a place at rest."