D.M. Rosemark shares his thoughts on 4K in our exclusive interview: "I am a filmmaker, film theorist, and multiple Jerome Grantee; I have actually gone back to 35mm and 16mm filmmaking because of some tangentially related issues on this topic."

4K: Then and Now
What do you think about 4K production?
The first thing you have to understand about this HD/2K/4K/6K/8K race is that it is analogous to the "megapixel war" in still digital photography. In the 2000s every camera release was heralded by another jump in megapixels: first it was 10, then 15, then 25 megapixels, and so on. This is in large part a marketing tactic by the camera industry, because resolution is a very easy thing for consumers to understand, technically speaking. It is also a relatively easy engineering problem for camera makers to focus on, much easier than things like bit rates and chrominance sub-sampling. But eventually consumers had enough of the nonsense (I think it was around the same time Nokia came out with a 41-megapixel cell phone), and now still cameras are going through a transitional phase with things like mirrorless designs and photo/video hybrids. So in short, there is a lot of hype around the issue, manufactured by the industry and driven by the consumer.
Is it really worth all the hype, or do you think it is overrated?
This isn't to say that there are no applications for things like 4K, 8K, etc.; there are! Special effects benefit from this greatly, for example. Archiving is another awesome use for these technologies; the 8K/4K restoration of Lawrence of Arabia is magnificent. Movie theaters also benefit, because when you want to build a mega-screen that is 80 feet wide, you had better have a really high-resolution projector.

If you want to evaluate whether 4K/6K/8K is appropriate for you, it is important to think about how you are going to deliver your content to an audience. The answer mucks around in a very subjective soft science called "apparent resolution," which is basically trying to reconcile three things:

1) The resolution of an image we are seeing
2) The size of the screen we are watching
3) The distance we are from that screen

Right now I am typing on a 27" computer screen at about 3 feet, and my screen is roughly equivalent to 2K resolution; to me that is adequate and super sharp. Now if I were to type this in my living room on my 60" television, sitting 4 feet away at 2K, it would seem comparatively soft and might actually be hard to read. This is of course just an illustration of apparent resolution; video behaves differently than a computer interface.
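The trade-off described above can be put into rough numbers with the common "pixels per degree" approximation of apparent resolution. This is only an illustrative sketch; the specific panel resolutions (a 2560x1440 monitor and a 1080p TV) are my assumptions, not figures from the interview:

```python
import math

def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=16 / 9):
    """Horizontal pixels per degree of the viewer's field of view."""
    # Screen width derived from its diagonal, assuming a 16:9 panel.
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    # Horizontal field of view the screen subtends at this distance.
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_pixels / fov_deg

# 27" monitor (assumed 2560x1440) viewed at 3 ft = 36 in
monitor = pixels_per_degree(2560, 27, 36)   # ~71 px/deg: looks sharp
# 60" TV (assumed 1920x1080) viewed at 4 ft = 48 in
tv = pixels_per_degree(1920, 60, 48)        # ~34 px/deg: noticeably softer
```

Same nominal resolution class, very different apparent sharpness, because the TV fills a much larger slice of the viewer's field of view at that distance.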

If you are going to play your content on gigantic screens in movie theaters, then you need to shoot at the highest resolution available to you. If your image will be seen exclusively on cell phones, you are probably fine with HD. Few people will get the opportunity to project in a movie theater, and televisions can only get as large as will comfortably fit in someone's home. So eventually the industry and consumers will come to accept that we don't need 8K 40" televisions that we sit a few feet away from.
How soon should we expect a movement towards 6K, 8K, and so on?
This is precisely why I am working exclusively in 16mm/35mm celluloid filmmaking at the moment. As an independent filmmaker I CANNOT shell out $10K, $20K, $50K every few years for a new 'gold-standard' camera. The industry is in such a state of flux that I do not feel comfortable buying something which will be out like the tide next year. Jim Jannard's RED camera company is taking steps to address this issue by creating modular systems that can be upgraded later, but that is still a stopgap, and it is expensive. Celluloid is timeless, and I have made the conscious business decision to stay out of the digital cinema market until it plateaus the way still cameras have, which it will. I am saving money in the long term, and I am spending those savings on film stock, processing, and telecine. I am thinking in 3-5 years I will be in the market for a digital cinema camera.
Are we going to have at least a few years with 4K?
I don't know if the industry will plateau at 4K; it seems reasonable to me as an expert, but consumers are very irrational. Here is a story about that.

Last month a friend was eager to show off his new 4K television to my girlfriend and me. I told him, "You should see our 60-inch plasma... the blacks are like night itself!" He said, "Yeah, you should see the blacks on this thing... incredible." As soon as he turned on the TV I thought to myself, "No... the blacks are actually kind of gross looking." He went on to demo his TV, and I was generally unimpressed... he was in rapture with it, though, so I didn't want to rain on his parade. After my girlfriend and I left, the very first words out of her mouth were, "Yeah, our TV at home is way better than his," which were my thoughts exactly.

Increasing resolution is one of the most straightforward engineering challenges camera manufacturers face, so they are going to keep cranking out higher-resolution sensors until the consumer stops buying them.
Will something else be the hot ticket item next year?
There is always something new around the corner; you can't prevent that. For a while 24p was the hot item, then it was solid-state recording, and most recently it became about depth of field, with a big rush to DSLR filmmaking.

I remember having a conversation with someone about 3D a few years back. He had just seen Pina in Berlin and walked away convinced that 3D was the future of cinema and that in a few short years there would only be 3D cinema. I knew the history of 3D and was convinced it was nothing more than a fad that would die out just as it did twice before, in the '50s and '80s. That was about three years ago, and since then 3D has been on a steady decline.

What I want to see in cameras of the future is an increase in chrominance data, not luminance. Color is SUCH an important thing in cinema, but no one seems to be talking about it. That is because chrominance sub-sampling is NOT an easy thing for a consumer to understand... heck, I am an expert and I don't even know if I fully understand it. I would also like to see more native, uncompressed acquisition. I don't see much of a point in shooting 4K video if you have to compress the life out of it in order to work with it.
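Chroma subsampling is easier to grasp with a little arithmetic. In Y'CbCr video, 4:4:4 keeps full-resolution color, while 4:2:2 and 4:2:0 store the two chroma planes at reduced resolution. A minimal sketch of the resulting uncompressed frame sizes (the scheme factors are standard; the frame dimensions and 8-bit depth are just an example):

```python
def bytes_per_frame(width, height, scheme, bit_depth=8):
    """Uncompressed Y'CbCr frame size for common subsampling schemes."""
    # Each chroma plane's size relative to the full-resolution luma plane.
    chroma_factor = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[scheme]
    samples = width * height * (1 + 2 * chroma_factor)
    return int(samples * bit_depth / 8)

full = bytes_per_frame(3840, 2160, "4:4:4")  # every pixel keeps its own color
sub = bytes_per_frame(3840, 2160, "4:2:0")   # typical consumer delivery
# 4:2:0 carries half the data of 4:4:4 -- and the savings come
# entirely out of the color information.
```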

Here is the real bugger in this whole discussion: this rush to 4K/6K/8K is actually making the development of richer chrominance and uncompressed acquisition harder from an engineering standpoint. These issues are hard to deal with even at the SD/HD level, because we are talking about moving around large chunks of data very, very fast. Moving from HD to 4K roughly quadruples the pixel count, and 8K means sixteen times the data, so developing these technologies becomes that much harder.
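To put rough numbers on that throughput problem, here is a back-of-the-envelope sketch of uncompressed data rates; the 24 fps, 8-bit 4:4:4 parameters are illustrative assumptions, not figures from the interview:

```python
def throughput_mb_s(width, height, fps=24, bytes_per_pixel=3):
    """Uncompressed video data rate in megabytes per second.

    Assumes 8-bit 4:4:4 (3 bytes per pixel); higher bit depths and
    frame rates only make the problem worse.
    """
    return width * height * bytes_per_pixel * fps / 1e6

hd = throughput_mb_s(1920, 1080)       # ~149 MB/s
uhd = throughput_mb_s(3840, 2160)      # 4x the pixels of HD -> 4x the data
full_8k = throughput_mb_s(7680, 4320)  # 16x the data of HD
```

Pixel count (and therefore data rate) grows with the square of the linear resolution, which is why the engineering burden climbs so steeply as the marketing numbers go up.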