Harrison in Wonderland
Let's talk about Visual Effects!
Tuesday, January 3, 2012
To Do List - Nuke
- Write some information (file path, timecode, etc.) into the DPX header or the EXIF data of a JPEG.
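I haven't tried this yet, but for the JPEG half a minimal sketch could drive the exiftool command-line utility from Python (exiftool must be installed; the tag choices, paths, and function name below are just illustrative, not a settled convention):

```python
import subprocess

def stamp_exif(jpeg_path, source_path, timecode):
    """Write a source file path and timecode into a JPEG's EXIF tags
    via exiftool. ImageDescription/UserComment are one possible choice
    of tags; any writable EXIF field would work."""
    subprocess.run(
        [
            "exiftool",
            "-overwrite_original",                  # don't keep a _original backup
            f"-ImageDescription={source_path}",     # stash the file path
            f"-UserComment=timecode {timecode}",    # stash the timecode
            jpeg_path,
        ],
        check=True,
    )

# Hypothetical usage:
stamp_exif("render_v001.jpg", "/shows/demo/comp/render_v001.jpg", "01:00:12:05")
```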
Sunday, January 1, 2012
Event Loop (이벤트 루프)
* Source: http://xylosper.net/notebook/108?category=17
The event loop can fairly be called the heart of a GUI.
In a console program, the program can easily control what the user does: the only way for the user to provide input is to type into the console.
A GUI program, however, can receive input from the user in many different ways.
For example, clicking a button, pressing a keyboard shortcut, double-clicking, and so on; button clicks alone create as many possible inputs as there are buttons.
This makes it impossible to process input in a fixed flow the way a console program does.
Instead, the program runs an event loop, a kind of infinite loop: whenever the user presses a key or clicks the mouse, an 'event' is generated, and the event loop sends that event to the appropriate object so it can be handled properly.
For example, when a button is clicked, the event loop catches the click, creates the appropriate event, and delivers it to that button for handling.
This process repeats until the program terminates.
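A minimal sketch of this generate-and-dispatch cycle in Python (the Button class, event names, and queue are illustrative, not any real toolkit's API):

```python
import queue

class Button:
    """A stand-in widget that knows how to handle its own events."""
    def __init__(self, name):
        self.name = name

    def handle(self, event):
        print(f"{self.name} handled {event!r}")

events = queue.Queue()   # the event queue
targets = {}             # event name -> object that should handle it

ok = Button("OK button")
targets["ok_clicked"] = ok

# In a real GUI the windowing system posts these; here we post them by hand.
events.put("ok_clicked")
events.put("quit")

# The event loop: run until a quit event arrives.
while True:
    event = events.get()         # block until an event occurs
    if event == "quit":
        break
    target = targets.get(event)
    if target is not None:
        target.handle(event)     # dispatch to the right object
```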
Friday, July 15, 2011
Integrated Team Effort
Lighting
Most notably, all three supervisors come from photographic backgrounds, and their knowledge of cameras and lighting is reflected in each of the nominated films. “The technology is now here to do incredibly accurate lighting,” says Fink, a previous Oscar nominee for Batman Returns and this year's Achievement in Visual Effects winner for The Golden Compass. “That's where the art is.”
Integrating effects within a film's photographic style is their key concern. Knoll, whose work on the Pirates franchise has earned him three Oscar nominations and last year's statuette, says he believes that lighting is crucial to making effects blend with the look created by a DP. “We pick up where the DP leaves off,” says Knoll, who began his ILM career as a cameraman. Although he's computer savvy (he co-authored the original Adobe Photoshop), Knoll sees advantages in having a live-action background. “Experiences on set, like holding a light meter and figuring out how to do an exposure split, teach you where the tradeoffs are. It helps you avoid mistakes like having interiors at the same brightness as exteriors,” he says. Supervisors having only CG backgrounds can be vulnerable to such errors, Knoll says. “For example, the standard cameras in animation systems rotate around their nodal centers. And that almost never happens in the real world.”
Transformers provides a striking example of live-action approaches to illuminating visual effects. Farrar, an Oscar winner for Cocoon, approached Transformers by imagining how a DP would have photographed 20ft.-tall robots if they really could stride through a shot. “If they were really on set, they'd be lit separately when they're in closer view,” he says. “The DP would knock out the sunlight for the close-ups and have HMIs [Hydrargyrum medium-arc iodide lamps] and cutters to cause shadows. You wouldn't just be worried about rendering the background lighting. Of course you need to reflect the background, and we recorded the robots' environments with reflective spheres — that's a given. But we totally relit the robots to look great wherever they were.” To accomplish this, ILM created virtual versions of cutters, flags, bounce cards, and shiny boards. “The robots could move in and out of shadows, which generally isn't done much in computer-graphics lighting,” Farrar says.
Camerawork
This year's nominees also demonstrated that locked-off visual-effects shots are increasingly rare. “We try to conform stylistically to the rest of the movie,” Knoll says. “Pirates was never shot from static cameras, so we didn't want the visual effects to take on a different look. You don't want to feel like you've entered a part of the movie that was heavily storyboarded. … The state of the art has advanced to the point where you can kind of not worry about the camera. Tracking software has become more sophisticated, and crews have developed very good methodologies for solving complex camera motion. We don't think twice about Steadicam or handheld shots. That makes for better shots, because it's impressive when you see CG happening in the background of a Steadicam shot.”
That is not to say that current tracking methods have become standardized, however. For The Golden Compass, shots were shared by several facilities — including Framestore CFC, Rhythm & Hues, Cinesite, Digital Domain, Tippett Studio, and Rainmaker. “Everyone does it differently,” Fink says. “When a background would go to another facility, it would be re-tracked. There's very little sharing of motion data. Of course, these facilities were usually tracking different parts of the frame.”
In Transformers, where the camera moved rapidly through cityscapes, Farrar employed photogrammetry to create backgrounds racing by. “It's a photoreal movie, so we photographed as many real buildings as possible. When we used computer models, the textural information was actual photography. We'd hang a camera over a street and photograph it in all directions. There were shots where a camera was flying down the street, and the shot was assembled from still photos. We'd relight it and add interesting shadows and maybe even change the colors of buildings to make it more appealing,” Farrar says.
source: http://digitalcontentproducer.com/mil/features/video_integrated_team_effort/index.html
Monday, June 20, 2011
10,000 B.C
"We did an extensive photoshoot of the miniature, and used Isis, a tool we wrote at MPC, for photogrammetry and geometry reconstructions from multiple pictures and a LIDAR scan for the miniature. With this, we were able to rebuild the entire environment. It looks good enough to pass in a few shots that were entirely CG."
Paul
<Original plate>
<Final composite>
Double Negative used survey data from sets and camera positions captured with a Leica Total Station. Some locations and the interior of the RV were LIDAR scanned. “An HDRI lighting pass was also taken for every set-up,” adds Couzens. “On other occasions witness cameras were used to record more complex moving lighting situations. And the mysterious Winklefix (an angle measuring tool) made many appearances while shooting inside the RV.”
Almost all of the scenes inside the RV were shot on a greenscreen stage. “The size and number of windows and reflective surfaces in the RV and the unrepeatable nature of road footage required us to capture large arcs of the driving background in one go,” says Couzens. “For this, the grips and camera department designed and built a purpose-specific six-camera mount on a camera vehicle. It comprised three Libra heads, each mounting two cameras. Remarkably, the camera car was able to drive fast enough in forward or reverse gear, sparing any need to change the camera mounts between set-ups. This also enabled a near-180-degree overlapping field of view to be captured in one take.”
“These interlocking frames,” continues Couzens, “needed to be synced with a bloop-light, de-lensed, stabilized and stitched together in post before being projected onto a sphere or planar surface, re-lensed, graded and composited into the greenscreen of the RV plates. Dust and dirt were added where needed. It was a considerable and necessary amount of work across 400-plus shots that hopefully the viewer never really notices!”
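Double Negative's actual tools aren't public, but the projection step Couzens describes (stitched plate projected onto a sphere, re-rendered through the shot camera, then comped behind the RV plate) might be wired up roughly like this in Nuke's Python API. The node graph, input indices, and file paths below are all assumptions, and the keying step is omitted:

```python
# A rough sketch only; run inside Nuke's script editor.
import nuke

stitched = nuke.nodes.Read(file="/plates/drive_stitched.%04d.exr")
proj_cam = nuke.nodes.Camera2()            # camera the plate is projected from
shot_cam = nuke.nodes.Camera2()            # matchmoved shot camera

project = nuke.nodes.Project3D()
project.setInput(0, stitched)              # image to project
project.setInput(1, proj_cam)              # projection camera

sphere = nuke.nodes.Sphere()
sphere.setInput(0, project)                # projection acts as the sphere's material

render = nuke.nodes.ScanlineRender()
render.setInput(1, sphere)                 # geometry input
render.setInput(2, shot_cam)               # render through the shot camera

graded = nuke.nodes.Grade()                # grade the background to match the plate
graded.setInput(0, render)

fg = nuke.nodes.Read(file="/plates/rv_greenscreen.%04d.exr")
comp = nuke.nodes.Merge2(operation="over") # RV plate over the re-rendered background
comp.setInput(0, graded)                   # B input: background
comp.setInput(1, fg)                       # A input: foreground (keying omitted)
```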
G.I. Joe: The Rise of Cobra
<G.I. Joe>