Nokia recently released a smartphone that shoots 41-megapixel images. What exactly can we videographers do with a phone whose photos hold roughly 20 times the pixel count of a high-definition frame? Paul Trillo shows us, with a video that creates an “infinite zoom” effect by photographing one image per block for 41 New York City blocks with the Nokia Lumia 1020. Check out the video below, and an interview with Paul where he explains to NoFilmSchool how he made it.
To read the whole explanation, go to NoFilmSchool.
The SlyPhone Camera Periscope is an inconspicuous, surreptitious way to take a photo of something or someone without them noticing you snapping away.
Designed by James George and Alexander Porter, the mirror fits onto any iPhone 4 or 5 and acts as a periscope, similar to designs employed by submarines. Instead of making you lift the camera and aim it straight at your subject, the SlyPhone Camera Periscope is intended to “reverse our culture of surveillance” and turn every person with a phone into a documentarian. Priced at $19, the periscope fits over the front-facing camera and reflects a picture of what’s in front of you without you having to make direct eye contact with the subject.
As such, the SlyPhone Camera Periscope is intended as a tool of “social critique,” while letting you return to your game of Candy Crush quickly should the subject notice anything fishy.
Recently, the Society of Motion Picture and Television Engineers (SMPTE) organized a meeting to review the standardization for Ultra High-Definition Television (UHDTV). The need for standards is especially important since shipments of ultra high-def TV sets are expected to reach four million units by 2017.
Two standards are actually being developed. Simply called UHDTV1 and UHDTV2, the easiest way to distinguish them is by their frame dimensions: UHDTV1 would have a 4K resolution of 3,840 x 2,160 pixels, whereas UHDTV2 would have a whopping 8K resolution of 7,680 x 4,320 pixels. The standards would support 10- and 12-bit depth, with chroma subsampling options of 4:4:4, 4:2:2 and 4:2:0; 8-bit, as well as interlacing and fractional framerates, would be discarded. The likely base framerate would be 120 frames per second, due in part to the fact that 120 is evenly divisible by popular framerates such as 24, 30, and 60. At such a high framerate the display sits well above the “flicker fusion threshold” — the rate beyond which the eye perceives a flashing image as continuous — so visible flicker would be greatly reduced.
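To get a feel for what those numbers imply, here is a back-of-the-envelope sketch of the uncompressed video bitrates the two proposed formats would generate at the 120 fps base rate. This is purely illustrative arithmetic based on the figures above, not anything from the SMPTE documents; real signals also carry blanking, audio and ancillary data.

```python
# Rough uncompressed bitrates for the proposed UHDTV formats.
# Active video only -- real signals add blanking and ancillary data.

def uncompressed_gbps(width, height, fps, bit_depth, subsampling):
    """Approximate uncompressed video bitrate in Gbit/s.

    subsampling: samples per 4-pixel group, e.g. (4, 4, 4), (4, 2, 2), (4, 2, 0).
    """
    y, cb, cr = subsampling
    samples_per_pixel = (y + cb + cr) / 4  # 3.0 for 4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0
    bits_per_frame = width * height * samples_per_pixel * bit_depth
    return bits_per_frame * fps / 1e9

# UHDTV1 (3840x2160) and UHDTV2 (7680x4320) at the proposed 120 fps base rate
for name, w, h in [("UHDTV1", 3840, 2160), ("UHDTV2", 7680, 4320)]:
    for depth in (10, 12):
        rate = uncompressed_gbps(w, h, 120, depth, (4, 2, 2))
        print(f"{name} 120p {depth}-bit 4:2:2: {rate:.1f} Gbit/s")
```

Even the smaller UHDTV1 format at 10-bit 4:2:2 comes out near 20 Gbit/s uncompressed, which is why the cabling discussion below matters so much.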
UHD is about more than spatial resolution. The areas where [the Standards Community is] looking to push the image are dynamic range, peak luminance, wider color gamut, temporal resolution meaning framerate, and spatial resolution.
To Pliny, the most important of these is dynamic range. Increasing only the resolution would do nothing to improve the image without also increasing the other aspects of the image, a detail consumer TV and camera manufacturers often seem to forget. Pliny goes on:
If you want to display more colors, there are certain colors you can’t hit unless you have a higher peak brightness. If you have higher peak brightness overall, the flicker fusion threshold actually changes. So an image that looks constantly illuminated when you are at 100 nits [a unit of measure for luminance], if you crank it up high enough, suddenly that same image looks flickery. Now you have to increase your refresh rate just to maintain the status quo of appearing constantly illuminated. If you have wider dynamic range on the display you’re going to need more bits to cover it to not get banding in things like skies and gradients. So all these things need to move in unison.
Besides considerations relating to image quality, other issues pertain to the physical cabling that carries the signals. As of now, a single 6G-SDI cable is unable to transport a 4K video signal running at 60 frames per second at 12 bit in 4:4:4 color space. Even two of them together can’t do it. As a stopgap, more cables would need to be added into the pipeline, something that SMPTE board member Bill Miller considers unsustainable.
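A quick calculation makes the cabling problem concrete. The sketch below compares the active-video payload of the 4K/60p, 12-bit, 4:4:4 signal mentioned above against the nominal 6G-SDI line rate (approximately 5.94 Gbit/s); the numbers are illustrative, and real SDI links lose additional capacity to blanking and ancillary data.

```python
import math

# How many 6G-SDI links would a 4K/60p 12-bit 4:4:4 signal need?
# Active-video payload only; overhead makes the real picture worse.

SDI_6G_GBPS = 5.94  # nominal 6G-SDI line rate in Gbit/s

def required_gbps(width, height, fps, bit_depth, samples_per_pixel=3):
    """Active-video bitrate in Gbit/s (samples_per_pixel=3 means 4:4:4)."""
    return width * height * samples_per_pixel * bit_depth * fps / 1e9

needed = required_gbps(3840, 2160, 60, 12)   # ~17.9 Gbit/s of active video
links = math.ceil(needed / SDI_6G_GBPS)      # 4 links, before any overhead
print(f"~{needed:.1f} Gbit/s needed -> at least {links} x 6G-SDI links")
```

At roughly 17.9 Gbit/s of payload, even three 6G-SDI links fall just short, which bears out Miller’s point that stacking cables is not a sustainable path.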
During his presentation at the SMPTE meeting, he delves into further detail to clarify some of the points in the report, and states we need new SDI technology that is capable of more data throughput, or an improvement in the image compression technology. Higher framerates are necessary, he says, and illustrates this visually with a high-motion image shot at 100 frames per second and the same subject at 50 frames per second.
The 100 frames per second image is crisp. There’s even text in the image that can be read, thanks to the video running at a higher shutter speed. The 50 frames per second image looks like the same motion-blurred image we’re accustomed to seeing in a movie clip. Miller’s point: if increasing the resolution doesn’t ultimately give us a crisper image, what’s the point?
More frames mean more data, and with 8K cameras shooting up to 72 gigabits per second, managing that data quickly becomes a serious challenge. The pressure is on for countries like Japan, which wants to broadcast the 2020 Olympics in 8K.
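To put that 72 Gbit/s capture figure in storage terms, here is a rough conversion; this is a sketch built only on the number quoted above, and actual recorders compress the signal long before it hits disk.

```python
# The 72 Gbit/s figure quoted for 8K capture, converted to storage per hour.

CAPTURE_GBPS = 72  # gigabits per second, as quoted for 8K cameras

bytes_per_second = CAPTURE_GBPS * 1e9 / 8            # 9 GB every second
terabytes_per_hour = bytes_per_second * 3600 / 1e12  # decimal terabytes
print(f"{terabytes_per_hour:.1f} TB per hour of uncompressed capture")
```

That works out to more than 32 TB per hour of footage, which is why broadcasters see data management as one of the central obstacles to 8K production.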
As insurmountable as these issues seem, it’s prudent to consider them now to establish solid standards before hardware is developed and built. It will help ensure that technology is properly implemented and systems are integrated with a complete production pipeline that ends with a greatly enhanced viewing experience.
A super interesting time-lapse film from China.
I think the cinematography in this particular short film from Lana Del Rey is spectacular and I’d like to share it. I hope you’ll enjoy it as much as I did:
Congratulations to Emmanuel Lubezki AMC ASC for his award at the 39th Annual LA Film Critics Association Awards.
Congratulations also to Alfonso Cuarón and all those involved with the film Gravity for the Best Picture, Best Director and Best Editing awards!
SOURCE and Credit: FRAMESTORE
Alfonso Cuarón’s remarkable blockbuster Gravity has enjoyed fantastic critical success, collecting enough stars from film reviewers to fill the galaxy it so devotedly depicts. But how were those stunning images made? By taking a film crew up 372 miles above the earth? In fact those mesmerising images were planned and created here in Soho, London. It’s a Hollywood blockbuster made in Britain, from pre-production, through filming, to its extensive time in post production.
“I first heard about Gravity at the beginning of 2010,” says Visual Effects Supervisor Tim Webber, a long-time collaborator of Cuarón’s and the man he approached to help realise a film no one knew how to make, “Alfonso came in and talked us through the movie for 45 minutes and it was gripping. We all came out really excited having heard it.” At that point it was unclear to what extent visual effects (VFX) and Webber’s team at Framestore would be needed.
“There was a stage initially where it was going to be made with actors in real space suits,” Webber continues, “they would have been hung up on wires on partial sets and we would have extended it and put in the background.” In the end considerably more of it is CGI than first discussed, and in fact considerably more of it is computer generated than real. In the majority of shots the only elements captured with a camera are the faces. The vastness of space, the Earth, the stars (all 30 million of them), the space shuttles, Hubble Telescope, the International Space Station (ISS), the copious and equally villainous fragments of debris, even the space suits: they were all made by visual effects artists at Framestore.
It was no simple process – everything had to be fastidiously planned. The first step down the three-and-a-half-year road to making Gravity’s visuals was the pre-vis, a basic animation process common to all films that blocks out the action in each scene before filming starts. But for Gravity it was anything but basic – it was meticulous. “We spent about a year planning it before we shot it. By the time we turned up on set the film was pretty much locked” says VFX Producer Charles Howell.
Webber elaborates on the need for such a detailed plan: “It needed to be heavily pre-vised for a number of reasons, obviously technically as we needed to work out the camera moves but also because when you’ve got a 12 minute continuous shot and it’s set in space where you’ve got a camera that can roam absolutely anywhere and you’ve got people that can roam absolutely anywhere too – upside down, underneath, over the top, everywhere – the degrees of freedom are much greater. Therefore to design the shot creatively so that it worked without the benefit of editing [Cuarón is famous for his long shots] takes a lot of work, so pre-vis was a big process.” It was lit by cinematographer Emmanuel ‘Chivo’ Lubezki during this period – an unheard of move for live action, but a necessary one ahead of a very complicated shoot at Shepperton studios. Even the use of Stereo 3D, so often criticised as an afterthought, was planned during pre-vis with Stereoscopic Supervisor Chris Parks. The original script did read ‘Gravity: A Space Adventure In 3D’ after all.
With the pre-vis complete, Webber and his accomplices set about creating the techniques that would help simulate microgravity – cutting out the need for hundreds of trips in ‘the vomit comet’, the specially designed aircraft that NASA uses to provide a few seconds of near weightlessness. Both Tim and Alfonso had been up in it themselves for research, but they instead chose a combination of motion-controlled cameras and light rigs. Collaborating with Bot & Dolly Motion Control and the on-set special effects team (the masters of physical, in-camera effects, as opposed to the computer-based world of visual effects), cameras were strapped to huge robotic arms and George Clooney and Sandra Bullock were put in a variety of different rigs, many newly developed for the film. Then the solution of the lightbox was hit upon, and the cramped LED box known on set as ‘Sandy’s cage’ took over a large part of the filming responsibility. In its standard configuration the box would be a 10m cube, with huge LED panels containing almost two million lights making up its walls.
The use of LEDs allowed Chivo to light the actors with much greater flexibility than traditional film lights – the different colours reflecting off the Earth, moonlight, sunlight and starlight could all be replicated. Bullock would be strapped to a rig in the centre of the box as the camera moved around her, achieving the illusion that it is her that moves. The camera could zoom in and out from any position, and it would race towards her and stop dead, just centimetres from her face. It was a highly unusual, VFX-led filming process. “When I was on set with the lightbox and the robots I thought ‘I’ve never seen a set-up like this’,” says Framestore VFX Supervisor Rich McBride. “I’d just never heard of anyone doing anything like it. I knew this film was going to be groundbreaking.”
“We were providing motion control moves for the rigs, but also generating a full immersive digital environment on-set using LED screens” explains CG Supervisor Chris Lawrence. “Having to control that in real time was an interesting challenge! I don’t think there’s great precedent of that being done before on a movie. At the time we did it I don’t think anyone had done it the way we had with a box that completely surrounds an actor and having to bring live CG elements in.” Cuarón himself has said that he sees the technique as the next step in cinematography because of the amazing complexity of colour that LED lights can give, so we may see it become more common. After six months at Shepperton the film was shot, but it certainly wasn’t finished. Cuarón had captured the human performances – the raw emotions of Bullock’s character Ryan as she tries to stay alive in an inhospitable place – now it was time to head back to Framestore to create the universe around her.
For Cuarón accuracy was paramount. He wanted the film to feel like a space documentary gone wrong and for everything to be rooted in reality wherever it could be. “There was an awful lot of research to be done in the way things look and the way things work in space, the way things move” says Webber. “We had to retrain the animators to an extent as they are so used to portraying weight. It’s one of the hardest things to portray and our animators have it in their blood. Then suddenly there is no weight. The physics of outer space are completely different, it’s not just the zero gravity, it’s the zero air resistance, so once something starts moving it will keep moving and it won’t ever slow down. Things like that. We had little physics lessons with a whiteboard and discussed the implications of the physics with Alfonso.”
“He is a stickler for reality up to the point where there is no other way. We went to every length to be real and to desperately find ways to fit in with the story in a way that was possible in space. But every now and then you have to break it. At one point we were talking to an astronaut about how a shuttle disengages and he told us that one initial process of it takes four minutes. Obviously we weren’t going to sit there for four minutes while something happened! That would make for a dull film.”
Building the galaxy’s biggest junk pile
One of the most difficult tasks was building everything. Just as they would be on a traditional set, every element had to be made in CG. “Building the space suits, the space shuttles, the Hubble Telescope, the ISS and everything else was a huge challenge because people know what they look like” says Howell. “The interior sets, which are all CG inside the ISS, were phenomenally detailed too, and every bit of that had to be modelled by someone. It took over a year to build everything. We never really stopped – we were constantly adding detail.”
Leading this digital construction team was Ben Lambert, who is proud of the lengths they went to make the models as accurate as possible. “With the ISS in particular it’s made up of around 50 modules, each sent up at a different time over the last 25 years. It’s the galaxy’s biggest jigsaw, but also its biggest junk pile – there’s actually a lot of redundant technology up there. So we couldn’t just throw a great big sci-fi kit all over it, make it look cool and put shiny chrome aerials on there. We had to source photographs really carefully. You could probably look at one of our interior shots and a photo of the ISS and work out what module the scene is in, it’s that accurate.”
Another demanding process was getting the faces shot at Shepperton to line up exactly inside their CG space suits – a difficult task even when the camera has been programmed. “You could plan an entire shot, but you couldn’t plan exactly what Sandra was going to do once she was in the box” says Compositing Supervisor Anthony Smith. “It’s like we made three films – we pre-vised a film, we shot a film and then we made another that was based on what we shot. Everything had to be fitted to what happened on set.”
When it comes to the animation some might assume much of the movement was achieved through motion capture, but as it was impossible to observe movement happening in zero gravity, much of what could be captured wasn’t relevant, despite being very useful as reference. The vast majority of the animation was actually painstakingly key-framed by hand. It wasn’t just the two actors that needed animating either: much like the Earth looming in the background, Gravity’s camera is in many ways another character. “We spent as much time animating it as we did the astronauts” says Animation Supervisor Max Solomon. “It’s used to disorientate the audience and to try and break the sense of there being an up and a down. We kept it shifting constantly.”
To render Gravity on just one machine you would need to start before the dawn of Egyptian civilisation.
Alfonso Cuarón’s characteristic long shots made the whole process more difficult. A common remark from the VFX team is that there was nowhere to hide, no quick ways of establishing a shot – everything they created was on full display, maybe for ten minutes at a time. Their work had to stand up to intense scrutiny. “The amount of planning and additional work that came about because of the long shots was enormous, it shouldn’t be underestimated” says Chris Lawrence. After hitting a button the team would often have to wait more than two days to see if a particular simulation had worked.
It wasn’t just the long shots: the whole process took a very long time and an awful lot of computer power. To render Gravity on a machine with a single processor core and have it ready for 2013, you would have needed to start before the dawn of Egyptian civilisation. Renders rarely look right the first time, and comments need to be given and addressed – typing into a program called Shotgun, Gravity’s VFX Co-ordinators wrote the equivalent of four copies of War and Peace while taking notes during feedback.
One sequence that had people tearing their hair out is ironically one of the film’s most calming, when we see Bullock’s character Ryan curled up in the foetal position, floating in the relative safety of the ISS. It was filmed with one of Bullock’s legs strapped to a stool, with three robots – one for the camera, one to control a spotlight behind the ISS porthole and the other to move the porthole – all revolving around her. In the finished shot it is her that spins around, both legs free, removing a space suit which never really existed. “It was one of the hardest shots” says Rigging Supervisor Nico Scapel. “We’d already built the suit, but now we had to take it off. We were really worried about it for a while.” Entire sections of her had to be made in CG, including the leg that had been strapped to the seat on set, and there are countless techniques used at every point. “It’s always difficult when you have interaction with a live actor and CG dynamics because you need to match the movement with something that has been shot” says Simulation Supervisor Sylvain Degrotte. “I’d like to see what the audience thinks is CG and what is live.”
“There are always bits people will assume are CG, like the wide shots of the shuttle, because they know it can’t be filmed” says Webber. “But there are bits that people just assume have been filmed, for instance a mid-range shot when she’s working on the Hubble [image above]. Lots of people have seen it and asked us what we did there – they had no idea that it’s basically all CG apart from her face. It’s like on Children of Men, [for which Tim supervised the delivery of a CG baby, and was nominated for a VFX Bafta] when I last worked with Alfonso, when someone came out of the cinema saying something along the lines of ‘I can’t believe they got that woman to give birth on screen’ and then you just think ‘yes!’”
That’s the aim for Gravity: that those years of extremely hard work by over 400 people went unnoticed and people walked out of the cinema wondering how they got a film crew up into space. That’s when you know you’ve done a good job.
- Production Company: Warner Bros. Pictures
- Director: Alfonso Cuarón
- Producer: David Heyman
- Executive Producers: Chris DeFaria, Nikki Penny, Stephen Jones
- Director of Photography: Emmanuel Lubezki
- Music: Steven Price
- VFX Supervisors: Tim Webber (overall), Richard McBride (Framestore)
- VFX Producer: Charles Howell
- CG Supervisor: Chris Lawrence
- CG Sequence Supervisor: Stuart Penn
- Compositing Supervisors: Anthony Smith, Kyle McCulloch, Mark Bakowski
- Animation Supervisors: Max Solomon, David Shirk
- FX Supervisor: Alexis Wajsbrot
- Simulation Supervisor: Sylvain Degrotte
- Modelling Supervisor: Ben Lambert
- Rigging Supervisor: Nico Scapel
I think this is a really interesting video on the history of neon lights and how they’re made.
To read the original post, go to PetaPixel.
To read the full list, go to TheBlackandBlue.
1. Don’t Get Distracted with Technique
“Operating the wheels needs to become second nature as it can be a disaster if the technique of operating distracts from the relationship that an operator has with the subject.
When I was starting I practiced doing figures of eight with the wheels and progressed to signing my name with them. I don’t feel the need to practice anymore but I do reassure myself that I can still sign my name each time I start a new film, if I am using a gear head.
A gear head is not everyone’s choice and I don’t always carry one but it does have distinct advantages on certain set ups and on certain films.”
2. You Must Discover Your Own Style
“I am very wary of showing too much in the way of plans and diagrams. Not because I am secretive and I don’t want to give away something that is personal. Not at all!
I just remember that when I began as a film maker and a cinematographer I never watched another cinematographer at work. The closest I ever got to seeing ‘how it was done’ was by shooting some documentary footage of Doug Slocombe at work on ‘Pirates of Penzance’. I loved seeing him work but it had absolutely no influence on the way my work evolved.
Our styles could not be more different. That’s my point really. You can’t learn your craft by copying me or anyone else. I hope what I do can in some way inspire others, but I would be appalled if I thought my work was being studied as ‘the right way to do the job’.
My way is just one of an infinite number of ways to do the job.”
3. Compromise is Sometimes Needed for a Better Film
“Sometimes, as with the death row scenes on ‘Dead Man Walking’, it is better to compromise composition, lighting and perhaps even sound a little and shoot with two cameras in order to help an actor get their performance. Sometimes it is better to go wider to include a prop in frame than break an actor’s concentration.
When an actor appears on set ready to do a take it may be too late to change anything. At that time if I see a bad shadow or an eyeline that is slightly off I might talk to the actor or I might not. Perhaps I might think it better to change things for take two. If not then I judge it my mistake and I must try not to let it happen next time.
In the end a film can look lousy but work because of a great performance but not the other way round. That’s something always worth remembering.”
5. Every Film is the Director’s Film
“I do have a problem with the ease with which you call what we do ‘art’. That is for someone else to conclude. To me it is a job, a creative job that I love to do but a job nonetheless.
The collaborative aspect of the job is very important but then so is the hierarchical nature of a film crew. Every film is the Director’s film and we must never lose sight of that.”
7. “Cinematography is More Than a Camera”
“Cinematography is more than a camera, whether that camera is a Red, an Alexa or a Bolex. There is a little more to it than resolution, colour depth, latitude, grain structure, lens aberration etc. etc. etc. The lenses used for ‘Citizen Kane’ were in no way as good as a Primo or a Master Prime and the grain structure in that film is, frankly, all over the place. But the cinematography? Well, you tell me.”
8. Aspect Ratio is Ultimately a Directorial Choice
“I usually do suggest one format over another for a particular film but the final decision belongs with the director, as with any other aspect of production. Like most of the decisions I make it is, for the most part, an instinctive one based on a sense of the film I get from reading the script.
Some films, like ‘The Assassination of Jesse James…’ or ‘Jarhead’, lend themselves more obviously to a wide screen format whereas I could never imagine ‘House of Sand and Fog’, “The Man Who Wasn’t There’ or even ‘Shawshank Redemption’ in a wide screen format.
I would say my preference is for a wide screen image shot in Super 35mm on spherical lenses but the majority of films I have shot have been standard 1:85.”
10. Camera Choice is a Personal Decision
“In the final analysis you can only judge picture quality by eye and make a personal decision as to what you like and what you don’t like. Perhaps some people really can not see a difference between a 2K scan and a 4K scan of the same negative and I am sure some people really do prefer the look of an image produced by the Red Camera to one shot on film.
The choice of a camera system is no different than the choice of a lens set, a camera position or where to put a lamp.”
14. Being Local Helps Your Chances for a Job
“I don’t know what other cinematographers do but my assistant hires our crew. We do sometimes take on a local PA but not often a trainee. Everything is done on a per-project basis and the budget has a big influence on who we hire and where they come from. I tend to do lower budget films and hence we hire at least the loader and the PA locally. Sometimes the 2nd AC also.”
16. Internships Are Scarce, Learn By Discovery
“Personally, when I am shooting a film I am totally focused on the job in hand and find even having a silent observer detrimental. There are many people who ask to be a part of my crew or to merely observe on a production that I might be shooting. Because of my hesitancy to accede to their requests perhaps my consequent feeling of guilt has led to the creation of this site.
For good or bad I never, as a student, had the luxury of observing another cinematographer at work on a set. It was only when I came to work in the US that I actually visited another set. I say this because I genuinely feel that cinematography, like photography in general, is not something that can be learned but, pretentious as it may sound, can only be discovered.”
17. Pulling Focus is a Tough Job for the AC and the Operator
“The 1st AC’s job is one of the most responsible on the whole crew. I know I could never do it and I have great admiration for someone who does the job well. I have worked with the same 1st AC for many years and we are very much in sync. I do think judging focus is very much intuitive but it is also the job of the operator to watch for image sharpness and for the timing of a pull etc.
Sometimes, as when I am making up the shot or on a particularly tight close up, I will work on a fluid head and have one hand on the focus knob just as if I were shooting a documentary. When you are working fast and without real rehearsals, as is becoming the norm, there is often no other choice.”
18. If You’re Going Handheld, Go with an Experienced AC
“The first thing I should say is that I work with a very special assistant and he rarely needs to work from marks. If I am shooting hand held, as I was in the boxing for ‘Hurricane’ or for pretty much all of ‘Jarhead’, my assistant will attach a remote focus to the camera or I will control the focus myself. I find this is the only way sometimes, especially if I am ‘creating’ shots as things unfold. I spent many years shooting documentaries where I always controlled the focus myself as the kind of films I was shooting demanded a very instinctive way of following the subject.
You could use a fast stock to get a greater depth of field but, in truth, it would give you relatively little advantage. You might need to build the light levels to an F8.0 to gain any real advantage from lens depth of field. I would suggest using an experienced assistant at the end of a remote focus system.”
20. Collaboration and Trust Between the DP and AC is Key
“My equipment list actually changes very little from film to film. Of course equipment has advanced and that has made for different choices but the basic idea of the package is the same.
I have worked with Andy for some time now and I rely on him to test the package before a shoot. We work together on concocting any special items such as the ‘helmet cam’ for shooting the game in ‘The Ladykillers’ and we usually spend a day shooting tests even if the film is quite straightforward.”
23. Feeling Intimidated is Normal
“I generally feel intimidated! One of my first films was with Richard Burton and I felt intimidated by his talent (‘The Spy Who Came in from the Cold’!!!) – at least I was until he gathered the crew, thanked us all for one of the most pleasant days he had ever experienced on a film, and then told us he had in fact felt totally intimidated by our youth!”
25. It’s Your Job to Find a Way to Work with Others
“As I have said before every director is different and may require something different from a cinematographer. The onus is on the cinematographer to find out how best to work for and with a director and with other members of the crew, for that matter.”
To read the full list go TheBlackandBlue
This is what’s new in the C300:
EOS C300 Cinema EOS Camera & EOS C300 PL Cinema EOS Camera
1. Ability to move the magnification viewing area around the LCD using the MAGN Function.
2. Support for a 1440×1080/35Mbps recording mode.
3. ISO up to 80,000 has been added.
4. Added functionality to support the optional Canon GPS Receiver GP-E1.
5. A Key Lock menu setting has been added which now makes it possible to lock all operations, including the START/STOP button.
6. Using the optional Canon WFT-E6 Wireless File Transmitter, the camera’s remote-control application allows two users to access the same unit via a Wi-Fi® link, providing simultaneous camera operation and control as well as metadata input.
7. [Lens Exchange] and [ND+/ND-] have been added as functions that can be allocated to any assignable button.
8. A new Wide DR gamma setting provides an expanded dynamic range of 800%.
9. Flicker Reduction has been improved.
EOS C300 Cinema EOS Camera only
10. Provides Push Auto Iris and One-Shot AF operation.
11. A new AE Shift function and the selection of various light-metering modes are now available when used with some Canon Cinema lenses (EF mount) and Canon EF Lenses.
12. Ability to assign the two control dials to operate either Iris or ISO sensitivity independently.
13. Peripheral Illumination Correction Data has been added for seven (7) Canon Cinema lenses (EF mount) and fifteen (15) Canon EF Lenses.
14. A function has been added to enable continuous focus and iris setting on a subject in the middle of the screen when one of the two EF STM lenses is attached.