I will never buy or rent an HD movie from iTunes, so give me a Blu-ray player. I mean, some minivans have Blu-ray in them. When Steve is gone (a day I will probably cheer), I hope this is one of the first things they change.
 
If I've seen it once, I'm done. I never really understood the fascination with owning movies.

Depends on the movie. If a movie is good, I will watch it repeatedly. Not in the same day, likely not in the same week. But I will watch it repeatedly.

Take Back to the Future. I watched it in the theater, watched it on TV in both English and Spanish (the Spanish only so I could hear them cuss in Spanish :p), watched it on VHS, watched it on DVD, and now on Blu-ray. For me it never gets old, so I own it. I watch the Blu-ray every now and then, because I enjoy it.

Another movie, though, Lord of the Rings: I didn't like any of them all that much. So I rented them once, just to see what the hype was about, but never bought any of them.

I ripped my old DVD collection to an external drive, and now it lives on as my home media collection without the physical discs. That's another reason for me to purchase the DVD.

I've always considered Microsoft a company that takes great pains in supporting existing users for years,

http://www.zdnet.com/blog/hardware/...home-server-effectively-neuters-product/10543

Not overly fond of either Apple or Microsoft - equal opportunity hater here - but let's be fair.
 
late bloomers

Good to know that Apple is revamping QuickTime for Lion and will have an interim FCS release for Snow Leopard. Apple has a history of being a late bloomer but then seizing the market with its products. Hopefully that'll be the case with FCS.

Nevertheless, I think those who need native editing today should switch, although for teams of editors that will be an initial disaster, with a slowdown in production, not to mention significant expenditures. Otherwise, hang in there until 2012.
 
Good point. Let's just hope that in the smoke of this fire we will see a new and better product as its successor, and not a Mac Pro Server. An MP Server is good for a school's computer lab, but not for a major company.

You've got to be kidding. I work in education, and we don't want the useless, waste-of-space toytown server option that is the Mac Pro "server" either (and don't even get me started on the mini "server").

I can only conclude that Apple don't want me to bother maintaining the Mac workstations in our labs, because they've made it all but impossible to do so with the decision to drop the Xserve. They've dented their credibility with networking pros with that decision (and they never had that much saved up to start with).

I love my personal MBP, and the mini I use as a home media centre is awesome. Apple make great consumer and small-business products, but they also make some real boneheaded decisions at the higher end of things.
 
What I don't understand is why anybody in the enterprise trusts Apple to provide legacy users with an upgrade path. The handling of the Carbon discontinuation was a very large hint.

I've always considered Microsoft a company that takes great pains in supporting existing users for years, whereas Apple is a very nimble company that can change on a dime. I don't think that is a particularly astute observation.

Why take the risk? I don't get it. I'm not trying to defend Apple here. After all, they did screw people over in a major way. I just find it strange that anyone would trust them in the first place.

K12 and Higher Ed.

Which is where I somehow landed after a divorce and closing down my business. I wound up taking a short-term contract as a means to occupy my mind while I put together a new five-year plan, and ended up being pulled in full time.

Education, as I've come to understand, is a completely different animal from business (as it should be). The problem is that the back-end needs are exactly the same. Often, the people picking a platform by intended use and acquiring the funding (via grant or budget allocation) are not savvy in the server room.

I walked into a school district that had close to a thousand Macs spread across 30+ locations (a drop in the bucket compared to their fourteen-thousand-plus Dells) and was asked to build a network deployment mechanism and write a support policy for those Macs. Why? Because they existed entirely outside of the support department. In the past two years, that number has grown closer to fifteen hundred, plus our iOS devices. I was asked to stay on permanently, and now I've inherited all those systems full time (maintaining them from a sysadmin standpoint and training the Windows techs to support them in the field).

I'm not trying to sidetrack you with a personal story; I'm only looking to answer your question with a clear-cut example. Write what you know, right?

The point is: I completely agree with you. But that doesn't matter. Regardless of whether or not I trust Apple, I have a lot of Macs in the field that must be maintained. They are integrated into lesson plans, teachers rely upon them, and students use them on a daily basis. They are part of a system that existed before me, and my job is to devise a method of continued support.

I'm a Linux guy, myself. I've made a living off of finding solutions to problems, which is probably why I've stayed on as long as I have. It's been a string of interesting challenges. Now I've begun to hit a wall: reality. Can solutions be created? Yes, but they'll likely be one-off solutions due to Apple's evolving business model.

Interesting times, to say the least.
 
I will never buy or rent an HD movie from iTunes, so give me a Blu-ray player. I mean, some minivans have Blu-ray in them. When Steve is gone (a day I will probably cheer), I hope this is one of the first things they change.

Hell, I wouldn't be surprised if Steve Jobs has a BR player in his car. Anyway, he is the largest single shareholder of Disney, which is pushing BR titles like crazy. So it's good that Disney is full-on BR, but for Apple it's a "bag of hurt". :confused:
 
Hell, I wouldn't be surprised if Steve Jobs has a BR player in his car. Anyway, he is the largest single shareholder of Disney, which is pushing BR titles like crazy. So it's good that Disney is full-on BR, but for Apple it's a "bag of hurt". :confused:

Of course, when Apple say "bag of hurt", they frequently mean "bag of not putting all the money on the table into our pocket, where it belongs".
 
Wow, this thread cracks me up.

It's amazing that people are okay with "FairPlay" DRM for video playback, but not blu-ray disc? Which format has the more restrictive DRM? Let's see here.

"FairPlay" (iTunes) content requires HDCP for the monitor, display output, and connection. The video file can only be played on a certain number of computers, and it can only be played on Apple portable devices. You cannot lend the file to a friend nor can you sell it. You also can't take it over to a friends house unless you lug Apple hardware with you and all the appropriate adapters to connect the hardware.

Blu-ray disc requires HDCP the same as iTunes for HD content. The disc can be played on any device that has a blu-ray reader: everything from game consoles to PC blu-ray drives to any set-top blu-ray player. It can literally be played anywhere. Plus you can lend it to a friend, take just the disc (and not all the hardware) to a friend's house, and sell the disc if you ever choose to.

As for all the other nonsense you hear about blu-ray playback, it's just that, nonsense. I have absolutely NEVER had to update the firmware in my standalone player to play a new movie. Never. Not once. And I only ever had to update the software player in Windows ONE time to play a new movie.

Blu-ray disc is truly plug and play, while you have to jump through all kinds of hoops and use proprietary hardware as well as proprietary adapters to get your iTunes content onto your TV.

Then it comes down to quality. Blu-ray disc video is encoded at up to 1920x1080, up to 45Mbps H.264 or VC-1 video (only a handful of very old first generation discs used MPEG-2), with lossless or sometimes uncompressed audio. iTunes "HD" content? Encoded at 1280x720, slightly less than half the resolution, 4-5Mbps H.264 video (yes, about 1/10th the max bit-rate), and sub DVD quality 384Kbps Dolby Digital audio.
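If you want to check that math yourself, here's the bit-budget arithmetic as a quick Python sketch. The bit rates are the ones claimed above, so treat them as rough ceilings rather than measured values:

    # Quick bit-budget comparison using the numbers claimed above.
    MB_PER_HOUR = 3600 / 8                 # 1 Mbps sustained = 450 MB per hour

    bluray_mbps, itunes_mbps = 45.0, 4.5   # claimed peak video bit rates
    print(f"Blu-ray:   {bluray_mbps * MB_PER_HOUR / 1000:.1f} GB per hour")  # ~20.3 GB
    print(f"iTunes HD: {itunes_mbps * MB_PER_HOUR / 1000:.1f} GB per hour")  # ~2.0 GB

    # Pixel counts back up the "slightly less than half" resolution claim:
    print(1280 * 720 / (1920 * 1080))      # 0.444...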

Now let's talk about price.

RedBox does individual blu-ray rentals for $1.50 per night. iTunes is $4.99. And given the file size, even on my FiOS connection, it's still faster for me to go to the local RedBox and get the blu-ray disc I want. Plus I can take that $1.50 blu-ray disc over to a friend's house and watch it there, and not have to lug my Mac, adapters, and cables along with me.

For purchases, iTunes charges around $20. Target, Walmart, Best Buy, Fry's, Amazon, etc. have a ton of blu-ray discs under $20. In fact, Target and Walmart both have a section dedicated to $10 blu-ray discs. These aren't the same movies you'll find in the $5 DVD bin, but recent releases.

As for sales numbers, I don't see why people keep saying "DVD is outselling blu-ray disc!". For one, that's not always true. Two, people seem to forget that DVD was on the market for seven full years before it FINALLY overtook VHS, in 2003. DVD's success was NOT overnight. Blu-ray is being adopted faster than DVD was, at about twice the rate. Blu-ray disc sales set records nearly every quarter. Blu-ray disc will kill DVD at some point in the future.

But online streaming? People seem to think that "On Demand" and online streaming services will kill blu-ray disc. Not a chance. For a few reasons. One, On Demand is expensive. Again, $5 or sometimes even $6 for a high definition movie. Same movie at RedBox with better quality is $1.50. Or if you watch a lot of movies, a Blockbuster or Netflix subscription costs the same as 3 On Demand movies. If people try to use On Demand services the way they currently watch blu-ray discs or DVDs, they'll be in for a major case of sticker shock at the end of the month when their TV bill comes and it has an extra $40-$50 tacked on.
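To see where that $40-$50 figure comes from, here's a tiny sketch; the per-rental prices are the ones quoted above, while the nine-movies-a-month pace is just an assumed figure for a regular disc renter:

    # Hypothetical monthly bill if a disc-renting habit moves to On Demand.
    movies_per_month = 9        # assumed pace; not a figure from this post
    on_demand_price = 5.00      # $5 (sometimes $6) per HD rental, per above
    redbox_price = 1.50         # Blu-ray for a night at RedBox

    print(f"On Demand: ${movies_per_month * on_demand_price:.2f} per month")  # $45.00
    print(f"RedBox:    ${movies_per_month * redbox_price:.2f} per month")     # $13.50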

Online streaming won't kill blu-ray disc or DVD any time soon either. Why? A couple of reasons. One is quality. As myself and others have pointed out already, blu-ray disc runs at twice the resolution with up to 10x the video bit-rate of iTunes "HD" content, not to mention the better audio. Even the standard-definition stuff doesn't match up, because there's no good way of scaling H.264 SD video to higher-quality displays, while a good upscaling DVD player can easily make a DVD look better than iTunes "HD" video (thanks to the lack of compression artifacting).

The second biggest reason online streaming won't kill optical media is bandwidth and metered billing. Believe it or not, in most places around the world, metered billing (bandwidth caps) is a way of life. ISPs here in the US drool over the thought of metered billing, and several have already imposed caps. Smaller cable operators in small towns already have extremely low caps. We have yet to see how things will play out, but we'll probably see metered billing with high overages before we see online video overtake optical media. Then you'd have to factor in the cost to download plus the cost of the video. Online video would die pretty quickly if you had to pay $5 for the file and then another $2 for bandwidth.

Plus there's actual speed. Some of us are lucky enough to live in areas that have FiOS or other high-speed connections, but the recent FCC survey showed that most people are on sub-3Mbps connections. That means it would take longer to download that iTunes movie than it would to watch it. With the average blu-ray file being encoded at around 25-30Mbps, it could take up to ten times the movie's runtime to download it. I don't know about you, but if I didn't have my FiOS connection, I sure wouldn't wait several hours just to download a movie. Not when I could go to RedBox and get it for a fraction of the price, or have a Blockbuster by Mail subscription that includes in-store exchanges.

And the situation regarding internet service in this country isn't going to get any better, since the Democrats have no spine and won't stand up for consumers stuck with a duopoly, and the "Party of No" have enough people in Congress to stop progress.
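The download-time claim is easy to sanity-check with a minimal Python sketch; the 25 Mbps stream and 3 Mbps line are the figures above, while the 2-hour runtime is an assumption:

    # Rough download-time check; all inputs are assumptions, not measurements.
    def download_hours(runtime_hours, stream_mbps, line_mbps):
        # Megabits in the file divided by line rate, converted back to hours.
        return runtime_hours * stream_mbps / line_mbps

    # A hypothetical 2-hour movie at a Blu-ray-like 25 Mbps over a 3 Mbps line:
    print(download_hours(2, 25, 3))    # ~16.7 hours, more than 8x the runtime

    # The same movie at an iTunes-HD-like 4.5 Mbps:
    print(download_hours(2, 4.5, 3))   # 3.0 hours, still longer than the film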

So let's summarize here:

All of this nonsense about blu-ray DRM is a myth. OS X already supports HDCP. AACS and BD+ are the "encryption" schemes used by blu-ray disc, the same way CSS and various other forms of disc protection are used on DVD. This would be supported by the player, the same way CSS and other DVD protections have to be supported and updated in the player. Blu-ray is absolutely 100% a plug-and-play technology on every single standalone blu-ray player and the vast majority of computers shipped over the last couple of years. Every shipping Mac supports HDCP, as has every Mac with a Mini DisplayPort since 2008.

And most importantly, all of this nonsense regarding blu-ray licensing is also 100% fiction. Shortly after Jobs made that short-sighted "bag of hurt" comment, the BDA made changes to the licensing fees. Now it's a one-stop shop with one fee. Just like DVD.

So: Macs support HDCP, the encryption DRM needs to be supported in the player exactly the same way as with DVD, and licensing is now no different from DVD's. That means there is absolutely no reason whatsoever for Macs not to support blu-ray disc in OS X. Especially considering I can reboot into Windows on my 2008 unibody MacBook and play a blu-ray disc without issues.

Just to add another point to what you said.

What about people in parts of the world where the stores don't have the movies and TV series? How are they supposed to see movies and TV series if they take Steve Jobs' word for it?
 
No Blu-ray is embarrassing

What's seriously embarrassing is that Macs can't even play Blu-ray discs. Can it be that difficult to create a small player programme?
Whether Apple like it or not, Blu-ray is a widespread medium, and they need to accept reality rather than stick their heads in the sand.
 

Hi all, we are talking about content creation and pro machines here. Think Apple ProRes, HD-SDI, 1080p60 4:4:4, 4K, etc.

Everyone talks about BR for renting movies.
This is a consumer thing. Pros have a nice THX-certified 4K theatre where they can consume BR video if they want to see low-quality video sometimes...

If we're talking pro users, we have 100Mb+ connections. I have 100Mb fiber at home for $30 (Sweden), and I live outside town in a small village!

At work, a 10G+ connection if I need it ;-)
 
Apple never made compelling professional server software to distinguish the Xserve from other servers, so IT managers were consistently stuck trying to justify spending more money on hardware that had sketchy support, running pretty much the same software, in the same identical way, that has been available on "mainstream" solutions. I'm happy to hear that at least there's discussion about the server space from within Apple.
 
Final Cut

It's a four-letter word; it starts with an A and ends with a D: "Avid". This is the answer to all your problems!

Good Luck
 
It's a four-letter word; it starts with an A and ends with a D: "Avid". This is the answer to all your problems!

Good Luck

Well, many have invested a lot of time and money in FC, and it's not like you can change software overnight.
It takes time to master an application like FC or Avid.
 
Everyone talks about BR for renting movies.
This is a consumer thing. Pros have a nice THX-certified 4K theatre where they can consume BR video if they want to see low-quality video sometimes.

I don't buy it. At least not for the majority of things.

Play back a Blu-ray in a digitally projected theater vs the original 2k file and the difference is not going to smack you in the face. (The difference IS there, but calling BR "low quality" is not accurate.)

Maybe some film is scanned at 4k (usually effects), but really, most film is delivered to cinemas in 2k. Not even IMAX reaches 4K (a limitation of the system and optics). This will change eventually, but right now, 2k is the absolute majority.

Even if the color is 4:4:4 for that 2k projected file, vs a properly produced Blu-ray it's going to be closer than you allude. (And of course, the audio is usually identical: lossless in both cases.)

The real advancement for film should be in pumping up the frame rate. Then we can finally have some truly satisfying action scenes and not have to worry about seeing flicker when we crank up the brightness. :)
 
I don't buy it. At least not for the majority of things.

Play back a Blu-ray in a digitally projected theater vs the original 2k file and the difference is not going to smack you in the face. (The difference IS there, but calling BR "low quality" is not accurate.)

Maybe some film is scanned at 4k (usually effects), but really, most film is delivered to cinemas in 2k. Not even IMAX reaches 4K (a limitation of the system and optics). This will change eventually, but right now, 2k is the absolute majority.

Even if the color is 4:4:4 for that 2k projected file, vs a properly produced Blu-ray it's going to be closer than you allude. (And of course, the audio is usually identical: lossless in both cases.)

The real advancement for film should be in pumping up the frame rate. Then we can finally have some truly satisfying action scenes and not have to worry about seeing flicker when we crank up the brightness. :)

Boosting the frame rate makes movies not look like movies anymore, unless you're talking about what they do with the good 120Hz televisions, where they replay the same frame five times and you don't have to worry about 3:2 pulldown.
 
I don't buy it. At least not for the majority of things.

Play back a Blu-ray in a digitally projected theater vs the original 2k file and the difference is not going to smack you in the face. (The difference IS there, but calling BR "low quality" is not accurate.)

Maybe some film is scanned at 4k (usually effects), but really, most film is delivered to cinemas in 2k. Not even IMAX reaches 4K (a limitation of the system and optics). This will change eventually, but right now, 2k is the absolute majority.

Even if the color is 4:4:4 for that 2k projected file, vs a properly produced Blu-ray it's going to be closer than you allude. (And of course, the audio is usually identical: lossless in both cases.)

The real advancement for film should be in pumping up the frame rate. Then we can finally have some truly satisfying action scenes and not have to worry about seeing flicker when we crank up the brightness. :)

Or film is delivered to cinemas as film, which is about 8K at best. And since when has IMAX been 4K? That stuff can reach as high as 28K, buddy. I think it's safe to say that the pros use 35mm, not 2K or 4K.

Bumping up the frame rate has to be one of the stupidest things Cameron has ever said. Smooth action just doesn't look right. Maybe for video games, but not for film.
 
Or film is delivered to cinemas as film, which is about 8K at best. And since when has IMAX been 4K? That stuff can reach as high as 28K, buddy. I think it's safe to say that the pros use 35mm, not 2K or 4K.

Bumping up the frame rate has to be one of the stupidest things Cameron has ever said. Smooth action just doesn't look right. Maybe for video games, but not for film.
Not to derail this thread, but 35mm, from what I've read, is generally considered to be around 6k, and anything beyond that is just capturing the grain structure of the film, not any additional image detail. 2k is the most common res for scanning film in for post-production, although for FX shots they'll bump it up to 4k. The release prints, which are shown in the theaters, are around 2k, but the quality quickly degrades over time as the film is handled and/or shown. IMAX I can't really comment on, but they did just announce a 4k digital camera for IMAX (which seems too low-res, but I haven't tried to follow up on it to learn more).

Most pros, if they are digital, are shooting HD. Star Wars Ep. II and III, Zodiac, Avatar, Miami Vice, The Curious Case of Benjamin Button, Apocalypto, Public Enemies, etc., were all shot 1080p. Red is the only general-purpose d-cinema camera that shoots above 2k, and there are only 2-3 that shoot 2k (e.g., Arri Alexa, SI-2k).


Lethal
 
Or film is delivered to cinemas as film, which is about 8K at best. And since when has IMAX been 4K? That stuff can reach as high as 28K, buddy. I think it's safe to say that the pros use 35mm, not 2K or 4K.

Bumping up the frame rate has to be one of the stupidest things Cameron has ever said. Smooth action just doesn't look right. Maybe for video games, but not for film.

I don't think bumping the frame rate is stupid at all. Using 24fps was a financial decision, not an aesthetic one.

Read Roger Ebert's take on it when he discusses 3D:

"What Hollywood needs is a “premium” experience that is obviously, dramatically better than anything at home, suitable for films aimed at all ages, and worth a surcharge. For years I’ve been praising a process invented by Dean Goodhill called MaxiVision48, which uses existing film technology but shoots at 48 frames per second and provides smooth projection that is absolutely jiggle-free. Modern film is projected at 24 frames per second (fps) because that is the lowest speed that would carry analog sound in the first days of the talkies. Analog sound has largely been replaced by digital sound. MaxiVision48 projects at 48fps, which doubles image quality. The result is dramatically better than existing 2-D. In terms of standard measurements used in the industry, it’s 400 percent better. That is not a misprint. Those who haven’t seen it have no idea how good it is. I’ve seen it, and also a system of some years ago, Douglas Trumbull’s Showscan. These systems are so good that the screen functions like a window into three dimensions. If moviegoers could see it, they would simply forget about 3-D."

Full link:
http://www.y2neil.com/blog/2010/05/01/robert-ebert-on-3d/
 
I don't think bumping the frame rate is stupid at all. Using 24fps was a financial decision, not an aesthetic one.
Even though it was selected because it was "good enough", it still created an aesthetic that, in part, has become the look and feel generations of people associate with cinema. Upping the frame rate might provide a clearer picture from a technical standpoint, but that isn't necessarily a better picture from an audience's or filmmaker's perspective.


Lethal
 
Take Back to the Future. I watched it in the theater, watched it on TV in both English and Spanish (the Spanish only so I could hear them cuss in Spanish :p), watched it on VHS, watched it on DVD, and now on Blu-ray. For me it never gets old, so I own it. I watch the Blu-ray every now and then, because I enjoy it.

Just rewatched most of the entire trilogy. Had fond memories of all of 'em but, I don't know, they just don't hold up. A lot of it is just downright creepy (hey, let's hire Biff, because without him we wouldn't have met. Yeah, without him trying to RAPE YOU!). And Marty is just a whiny little d-bag who would have grown up to be a loser with loser kids without the help of that guy from Taxi.
 
I don't think bumping the frame rate is stupid at all. Using 24fps was a financial decision, not an aesthetic one.

Read Roger Ebert's take on it when he discusses 3D:

"What Hollywood needs is a “premium” experience that is obviously, dramatically better than anything at home, suitable for films aimed at all ages, and worth a surcharge. For years I’ve been praising a process invented by Dean Goodhill called MaxiVision48, which uses existing film technology but shoots at 48 frames per second and provides smooth projection that is absolutely jiggle-free. Modern film is projected at 24 frames per second (fps) because that is the lowest speed that would carry analog sound in the first days of the talkies. Analog sound has largely been replaced by digital sound. MaxiVision48 projects at 48fps, which doubles image quality. The result is dramatically better than existing 2-D. In terms of standard measurements used in the industry, it’s 400 percent better. That is not a misprint. Those who haven’t seen it have no idea how good it is. I’ve seen it, and also a system of some years ago, Douglas Trumbull’s Showscan. These systems are so good that the screen functions like a window into three dimensions. If moviegoers could see it, they would simply forget about 3-D."

Full link:
http://www.y2neil.com/blog/2010/05/01/robert-ebert-on-3d/

OK, I do admit that projecting 35mm at 48fps would dramatically increase the quality of the projection. This is because part of film's detail comes from the fact that film is a random assortment of grain. Our brains combine detail from adjacent frames (detail that might be present in one frame but not in another) to form a higher-resolution image. When you start increasing the rate at which frames are captured and exhibited, you're basically increasing this psychological effect, further increasing the perceived resolution beyond what 35mm film can produce today.

What I'm saying is that digital projection (which is what Cameron is suggesting) doesn't have this luxury. If you increase the frame rate, you are only increasing "motion resolution"; because digital media uses a set matrix of detail, you cannot increase spatial resolution by increasing the frame rate.

However, in my opinion, 24fps is absolutely perfect for what film is trying to accomplish. It is slow enough for our brains to accept the image as a memory (vs reality), which is beneficial because of the emotional connection to the film, but fast enough that we can still perceive motion. For fiction films, 24fps is great; however, I can totally see how visually strong documentaries, IMAX or otherwise, could really benefit from this.
 
Even though it was selected because it was "good enough", it still created an aesthetic that, in part, has become the look and feel generations of people associate with cinema. Upping the frame rate might provide a clearer picture from a technical standpoint, but that isn't necessarily a better picture from an audience's or filmmaker's perspective.


Lethal

That's very true, and it's why almost every video production I've shot in the last couple of years has been shot in 25p, not 50i.
It is a look we are accustomed to. I haven't seen any 48fps footage projected, so I can't really comment, but to knock it like vatakarnic33 did is a bit premature, I think.
 
That's very true, and it's why almost every video production I've shot in the last couple of years has been shot in 25p, not 50i.
It is a look we are accustomed to. I haven't seen any 48fps footage projected, so I can't really comment, but to knock it like vatakarnic33 did is a bit premature, I think.

Read my post above yours. I'm not knocking 48fps; I'm knocking the idea of choosing it for a technical reason instead of an aesthetic one. A lot of filmmakers these days don't seem to understand the concept of appropriate aesthetics... One of the biggest offenders, in my opinion, is James Cameron.
 
Or film is delivered to cinemas as film, which is about 8K at best. And since when has IMAX been 4K? That stuff can reach as high as 28K, buddy. I think it's safe to say that the pros use 35mm, not 2K or 4K.

Bumping up the frame rate has to be one of the stupidest things Cameron has ever said. Smooth action just doesn't look right. Maybe for video games, but not for film.

I think we're referring to two different things here. You're talking theoretical limits of the media; I'm talking about what ends up on screen, as in what you would project vs. a Blu-ray.

The film rolls delivered to cinemas are 8K? Not a chance.

In a theoretical BEST CHANCE, BEST CASE, FIRST GENERATION 35mm film roll, you'd have a chance of getting 6k. Maybe. That's generous. With analog you can debate away a few thousand lines here and there pretty easily.

I don't know many 35mm/Super 35/etc films that have a 6k pipeline from beginning to end. If it's perfectly transferred and played back digitally, most films probably hit 2k. If it's 35mm projected, it's far worse. Most film will see about 1,000 lines of vertical resolution or less when it's actually in a theater.

"At this point, the typical audience cannot see the difference between HD and 35mm. Even professionals have a hard time telling them apart. We go through this all the time at NYU ("Was this shot on film or video?")."

http://www.filmschooldirect.com/sample_lessons/sample_lesson_HD_vs_35mm.htm

As for IMAX, again, you're talking pie-in-the-sky numbers.

From John Galt...

"The 4K system that most people know is IMAX -- and it doesn't quite make 4K, which is a surprise to people. “How can that possibly be?,” you say. “It's an enormous big frame.” Well, because of what I was talking about earlier: the physics of optics. When you take the entire system into account – from the lens of the camera, to the the movement of the light through the projector, all slightly reducing resolution -- you wind up with less than the full resolution you started with."

http://magazine.creativecow.net/article/the-truth-about-2k-4k-the-future-of-pixels

Maybe I'm way off base, but that's what I've seen, read, and been told. Anything you have to share to the contrary would obviously be an education for me, as I put my knowledge squarely at lower-middle. ^_^

BTW, IMAX trumps Blu-ray, no question. Huge difference. Ditto to a lesser extent for 65 and 70mm.
 
Read my post above yours. I'm not knocking 48fps; I'm knocking the idea of choosing it for a technical reason instead of an aesthetic one. A lot of filmmakers these days don't seem to understand the concept of appropriate aesthetics... One of the biggest offenders, in my opinion, is James Cameron.

Don't forget, James is an action director. He probably says "can't do that" mentally a lot due to our "learned" preference for a lower frame rate.

A higher frame rate SHOULD be an option for filmmakers, just like a lack of color is still an option for filmmakers. It doesn't have to be an all-or-nothing proposition, just another tool in the toolbox. You use the term aesthetics when referring to higher frame rates, but don't forget: widescreen, sound, and color were also added after the fact, not because the silent B&W films that preceded them were inferior. I don't see how a higher frame rate doesn't land in the same category.

I don't look at higher frame rates the same way as the motion-smoothing technology on TVs. The tearing, smoothing, and general artifacts that come along for the ride are why I don't care for that option...
 