I’m waiting for the Beastcage to be available; I’m on their wait list. I’ve been doing some shooting with my 15 Max, a Kingston SSD, a small hub (from Amazon, which I’ve had for a while), and a Rode mic, which I plugged into the hub as well. It all works just fine using log and some LUTs with the Blackmagic app, and editing in FCP. I haven’t tried it yet in Resolve, which has a few other options.

It does look great. Frankly, I expected it to be better than when shooting with last year’s 14 Max because of log. But Apple has made some other improvements in the pipeline that help as well. Looking forward now to getting some lenses for it.
 
Such a pointless exercise. The rest of the gear is $$$, so why skimp with a $1k phone? Spend more on a proper cinema camera, or just use that $1k on a more consumer-level camera that will still be leaps ahead of an iPhone.
You obviously don’t get it. If you want to show off the ultimate quality possible with a camera, you use it with the best equipment possible. Remember that they didn’t use external lenses. Really, what we’re talking about is just what a good production would use on a large scale. If cameras like this force a serious rethink at expensive video camera makers such as Arri, the most-used camera for movie work, it will be a good thing. An Arri runs around $100,000 for the base camera, sans lens; lenses can easily add a minimum of $10,000 to that price, and over $100,000 at the high end.

No, the iPhone isn’t going to replace an Arri, or even a RED. Not yet. But some serious filmmakers have said that the difference in footage isn’t that great.
 
Honestly, I can still hardly tell the difference between my iPhone 12 and iPhone 15 cameras. By that I mean the differences I do see seem rather software-related.
If you can’t see the difference between a 48MP sensor and a 12MP one, then this shouldn’t matter to you.
 
Apple has indeed showcased the impressive video capabilities of the iPhone 15, enhanced by sophisticated AI image processing algorithms. However, while highly advanced, the iPhone’s camera system still doesn’t quite match up to professional cameras in some aspects. Professional camera setups generally have larger sensors, which are pivotal for better performance in a range of lighting conditions. They also offer greater aperture control and higher bit rates which reduce compression, thereby preserving more details in the footage. Though the differences can be subtle, a discerning eye might notice that iPhone footage can sometimes exhibit a ‘rubbery’ texture or display noise in low-light scenarios.
 
Very impressive video last night shot on a small bit of tech that fits in people's pockets.

The odd thing about the whole show, in trying to show appreciation for this capability, was how FCPX seemed left out of the equation. Adobe got explicit references, as did Blackmagic, etc., but where was FCPX? In the same way one could showcase the impressive video capability of "our" phone hardware, there was an equal opportunity to showcase the impressive video-editing capability of "our" FCPX vs. Adobe and Blackmagic.

I presume the professionals involved in making it were just more accustomed to using Adobe, Blackmagic, etc. instead of FCPX. But it still seemed a bit odd to me not to get some FCPX love in there somewhere. Apple Motion could easily have got a few callouts too.
Apple has done this a lot in the past too, showing a lot of MS office instead of iWork for example. I think it’s mainly advertising how much the big name third party developers are onboard. Most people already know Apple apps will work very well with Apple machines.
 
Sadly, this does not surprise me. These are the same fun folks who decided to dump professional photographers from the equation by suddenly dropping support for Aperture. They didn't kill it outright; they just stopped supplying RAW profile updates, meaning files from newer cameras could not be imported without a workaround. Aperture was a brilliant application: it handled DAM, editing, and tethering; it was clean, powerful, and intuitive, and they just killed it. Right around the time the first iPad Pro was announced, too, and Adobe wasted no time in cobbling together a Photoshop for iPad. So Apple ceded all that screen real estate to Adobe, when they could have had an Aperture for iPad.

inexcusable!

Apple realized early on that Aperture could never compete with Lightroom and Lightroom's rate of progress. That was true back when Apple killed it, and it's more true today, seeing how Lightroom has evolved (especially over the last year).

The above is not surprising. Apple is an outstanding computer/phone/tablet company. That's Apple's focus.

Adobe is an outstanding image- and video-editing, processing, and color-science company that's immersed in producing superb software tools. That's Adobe's focus.

That's why early on years ago after evaluating both Aperture and Lightroom, especially with respect to non-destructive RAW editing capabilities, I chose Lightroom and never looked back.

That turned out to be a very good decision.
 
Very impressive video last night shot on a small bit of tech that fits in people's pockets.
And this is the point many negative commenters are missing. It looked pretty good overall, and considering the size of the iPhone cameras and the vast number of things an iPhone does every day outside of being a camera, the fact that it can still capture the raw data to produce a professional event is really impressive.
 
I wonder if not releasing 27-inch iMacs was a ploy to make 'Shot on iPhone' the real star of the show. It feels like that much of a setup/slap in the face.
 
Apple has done this a lot in the past too, showing a lot of MS office instead of iWork for example. I think it’s mainly advertising how much the big name third party developers are onboard. Most people already know Apple apps will work very well with Apple machines.
The problem isn't showing off competitors, as they do with Office or Photoshop. The problem is their lack of seriousness with the program and five years of stagnant updates. The competition is so far ahead now, and they aren't even hiding the fact. Last night they mentioned Adobe and DaVinci three times as often as FCP. It's been demoted to a half-baked iPad app.
 
Such a pointless exercise. The rest of the gear is $$$, so why skimp with a $1k phone? Spend more on a proper cinema camera, or just use that $1k on a more consumer-level camera that will still be leaps ahead of an iPhone.
Well, yes, but Apple is selling a phone with a decent camera, so of course they are promoting that.

We are long past the camera even mattering anymore. Even The Creator, shot on the FX3, isn't really that big of a deal, and not a reason for people to dump their cameras and rush out to get an FX3. They chose the FX3 because it got out of their way and allowed them to get simple image data they could then work with. A camera really is just a means to convert light to pixels. It's everything else that makes it look great.

That's kind of the point of the articles about the FX3 and the iPhone. People need to stop obsessing over cameras and just use them. Learn the other crafts of the trade and you can make even an iPhone look good.

The iPhone of course has a ton of limitations, and this type of production is well planned out and controlled. They lit every single shot for the optimal exposure on the iPhone camera and played to the strengths of the 24mm lens by using shots that made sense for such a wide angle: drones, dolly shots, and so forth. It's a production designed to work well with the camera, versus a camera that can work well in any condition.

This is an important distinction for others to realize.
 
But why does it look so much more “artificial” than cinema? Although I have to say that some modern TV shows also seem to go for that look. I’m probably just getting old 😆
It looks artificial because it IS artificial. There’s a lot going on in the image, but a big issue is the lights. With modern LEDs you can set the brightness level and the color of the lights. In the past we took non-dimmable halogens on location shoots and made do with diffusing the lights.

Consider the depth in the images. In reality, we see light that is far away as dimmer than light that is close up. With LED lighting, a designer can set the lights in the background brighter to achieve a more even look across the image. It will make the background pop and look really even across the frame, but it will also look unnatural.

Add in post-production, such as color grading and other effects, and you get an image that is very unnatural when it’s all said and done. Gone are the days of film, where what you shot was what you got. (That’s how I learned.) Digital video combined with LED lighting tends to give everything a slightly ethereal look, IMO, rather than the stark, realistic look of film.

No doubt there’s much more to this than my brief explanation. The lenses of the phone play a role, compressing the image changes the video, and so does software interpolation. Film essentially exposed stock to light, and voilà, an image. There’s a lot more between real life and the image in modern video than there used to be, especially with an iPhone.
 