Comparison with CAE

As an experienced user of CAE (3D modelling) software, I am very familiar with what happens when software must be rewritten completely. After a while the software is so old that you need to rewrite some of its core parts; in CAE, that means the solid modelling kernel.

So of course you will have a problem going from surface models to solid modelling: you can't represent a surface model as a solid model. What you can do is create a tool that translates most of the geometry, but often you end up with only the final result, and it's difficult to go back and edit the model.

CATIA had this problem going from V4 to V5, Pro/ENGINEER when moving to Windchill, I-DEAS when moving to NX, etc.

Of course, all the big companies kept the old software in house, started new projects on the new systems, and translated only the important carry-overs.

Some people always say that the old way of doing it was the best way, that we miss some features in this new paradigm: why should we have solid modelling when we could do everything with surface modelling? People shouted that their favorite feature had been removed (creating fillets, surface meshes, etc.). Within a year or two the software is much more stable, faster, and has more features, and everything becomes incremental development until the next evolutionary step!


Even the standards for archiving and exchange evolve. IGES only supported surface models, which can be tricky to translate to solids. STEP translates solid models but without the construction history, like translating only the final render...
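
If you've never looked inside one, a STEP file carries only the evaluated geometry. A heavily simplified fragment looks something like this (the entity numbers and the body itself are made up for illustration):

    ISO-10303-21;
    DATA;
    #10 = CARTESIAN_POINT('', (0.0, 0.0, 0.0));
    #11 = MANIFOLD_SOLID_BREP('body', #12);
    ...
    ENDSEC;
    END-ISO-10303-21;

There is no record of "sketch, extrude, fillet" in there, only the final faces and points, which is exactly why the translated model is so hard to edit afterwards.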

This is going to be a problem in video editing too. Before, you used tapes as the original/master format and could always re-import from them if you had the timecodes in a file/EDL; transitions and other stuff are always difficult to translate, and you may not get the right results.
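
For reference, a cuts-only event in a classic CMX3600-style EDL is just an event number, a reel name, and source/record timecode pairs, something like this (reel and timecodes made up):

    001  TAPE01  V  C        01:00:10:00 01:00:20:00 00:00:00:00 00:00:10:00

As long as the master tapes exist, an edit in that form can always be reconformed; it's the transitions and effects layered on top that don't come across cleanly.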

Now we have a transition in Final Cut, and people are in an uproar.

I think it's best for Apple to concentrate on the core program. Of course, plugins for exporting to tape etc. should be written by AJA/Blackmagic; why should Apple write the support for new capture cards? Let AJA and the others write the plugins for their own hardware.

Someday Adobe/Avid will need to take the next step and rewrite from the bottom up, and there will be an uproar again ;-)

Just wait and see ;-)
 
Someday Adobe/Avid will need to take the next step and rewrite from the bottom up, and there will be an uproar again ;-)

Just wait and see ;-)
Dude... Apple killed Color.

If there was any doubt left in my mind that Apple was not interested in the pro market, that little announcement did away with it.
 
This is going to be a problem in video editing too. Before, you used tapes as the original/master format and could always re-import from them if you had the timecodes in a file/EDL; transitions and other stuff are always difficult to translate, and you may not get the right results.

Now we have a transition in Final Cut, and people are in an uproar.

I think it's best for Apple to concentrate on the core program. Of course, plugins for exporting to tape etc. should be written by AJA/Blackmagic; why should Apple write the support for new capture cards? Let AJA and the others write the plugins for their own hardware.
As I understand it, FCP 10 doesn't offer the API 'hooks' for third-party hardware to tap into, so the best Matrox, AJA and Blackmagic can do is offer separate apps for tape I/O and an extended desktop mode to get FCP 10's Viewer onto a broadcast monitor for preview purposes only. I can see those guys doing their best to ramp up Adobe and Avid support real quick, because you don't really need a card like that if FCP 10 is not able to send out a baseband video signal.

Someday Adobe/Avid will need to take the next step and rewrite from the bottom up, and there will be an uproar again ;-)

Just wait and see ;-)
Adobe already did a ground-up rewrite of Premiere 5 or 6 years ago and, as I understand it, Avid has been rewriting Media Composer from 32-bit to 64-bit in chunks. The idea that the only way for FCP 10 to move forward was to jettison the functionality and flexibility it had developed over the past decade is inaccurate. Over the past couple of years Adobe and Avid have incorporated things like native support for new codecs, GPU acceleration, etc., so it's not like FCP 10 is leading the pack in regard to 'new media' workflows. At best it has finally caught up to Adobe and Avid, and at worst it is still lagging a bit behind.

Maybe things will change after a few versions and FCP 10 will become a worthy successor to its namesake, but potential doesn't pay the bills.


Lethal
 
As I understand it, FCP 10 doesn't offer the API 'hooks' for third-party hardware to tap into, so the best Matrox, AJA and Blackmagic can do is offer separate apps for tape I/O and an extended desktop mode to get FCP 10's Viewer onto a broadcast monitor for preview purposes only.

The way I understood it, and please correct me if I am wrong, is that Apple has not yet released their APIs to developers but intends to do so. They hadn't gotten all of the standards for their new XML schema finished up, so they were delaying the APIs in light of that. (A good indication that even they are still working on the functionality of the app.)
 
The way I understood it, and please correct me if I am wrong, is that Apple has not yet released their APIs to developers but intends to do so. They hadn't gotten all of the standards for their new XML schema finished up, so they were delaying the APIs in light of that. (A good indication that even they are still working on the functionality of the app.)

If that is true, things are bad. If a company is really serious about 3rd-party support, the first thing to do is to get the APIs right and share them with those companies so that they can start working on their solutions.

Imagine if Intel pushed out their new chips first and only then let Apple know the specs.

I really like Apple's hardware, and OS X is a nice system, but their handling of FCP X is terribly amateurish.
 
Imagine if Intel pushed out their new chips first and only then let Apple know the specs.

See, the problem with that analogy is that software needs to be written to the hardware. The lack of the features bigger houses desire does not affect the core functionality of the app. FCP X isn't broken; it is just inaccessible to multi-app post houses without those features. And while it would have been nice for more 3rd-party add-ons to have been available at release time, this does give Apple the chance to work out any core bugs in the software before adding the bulk of extra features. Because of that, I use it for personal stuff and for inconsequential side projects for work that won't take me long to piece back together in 7 if I hit a roadblock in X. That way I can start to see how it is to use the core editing functions, find bugs, and know for sure what I really miss, so that I can provide feedback to Apple for whatever it is worth.

Now, I'm not saying that I think Apple handled this gracefully by any means, but the way it is going is not out of character for them. They don't communicate much about anything, and the only products they eventually publicly push are their consumer products. I mean, they didn't even release a video or the keynote for the Las Vegas conference where they unveiled it; we had to gather what we could from YouTube leaks.

Apple needed to get this out because it was so long in the making and so far behind their other apps, which had already been ported to 64-bit. Yes, a turnkey solution from the get-go would have been nice, but this just doesn't seem that unreasonable. Time will tell, though.
 
The way I understood it, and please correct me if I am wrong, is that Apple has not yet released their APIs to developers but intends to do so. They hadn't gotten all of the standards for their new XML schema finished up, so they were delaying the APIs in light of that. (A good indication that even they are still working on the functionality of the app.)
Only Apple knows Apple's plans, but from what I've read from people who know more about programming than I do, FCP X doesn't show signs of having the ability to get a baseband video signal into or out of it. These same people are also taking the fact that Matrox and AJA (AJA has a very close relationship with Apple) have both released pressers talking about the 'new way' their products interact with FCP X (which is basically to say that they don't) as indirect confirmation to this end.

Who knows, maybe when Lion comes out we'll get a pleasant surprise.


Lethal
 
See, the problem with that analogy is that software needs to be written to the hardware.

Well, I am a software developer and I have absolutely no idea what you are talking about. Please enlighten me.

Apple cannot share the APIs of their FCP X plug-in architecture early with plug-in developers because...
 
Well, I am a software developer and I have absolutely no idea what you are talking about. Please enlighten me.

Apple cannot share the APIs of their FCP X plug-in architecture early with plug-in developers because...

Try installing OS X on an AMD machine... it won't work. Well, not without altering it yourself...

I did not say they couldn't, I said they haven't.

Can Final Cut Pro X export XML?
Not yet, but we know how important XML export is to our developers and our users, and we expect to add this functionality to Final Cut Pro X. We will release a set of APIs in the next few weeks so that third-party developers can access the next-generation XML in Final Cut Pro X.

That is from the Apple FAQ about FCP X. That tells me that they just aren't ready. Maybe it's because they want the APIs to be more wholly based on Lion's architecture, or maybe it is just because they want to piss you off. :rolleyes:
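
Just to make concrete what's at stake: an XML interchange layer describes the timeline, not the media, so a project file might take roughly this shape. This is purely hypothetical; Apple hasn't published the actual FCP X schema, so every element and attribute name below is my own invention:

    <project name="My Cut">
      <sequence frameRate="25">
        <clip src="file:///Volumes/Media/shot1.mov" in="00:00:05:00" out="00:00:12:00"/>
        <transition type="cross-dissolve" duration="00:00:01:00"/>
        <clip src="file:///Volumes/Media/shot2.mov" in="00:01:00:00" out="00:01:08:00"/>
      </sequence>
    </project>

Tools like Automatic Duck live or die on being able to read and write that kind of structure, which is why the delayed APIs sting.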
 
Well, I am a software developer and I have absolutely no idea what you are talking about. Please enlighten me.

Apple cannot share the APIs of their FCP X plug-in architecture early with plug-in developers because...

Isn't it the firmware, which sits between the application and the hardware, that is actually operating the hardware?
 
I know that there is a lot of media hype in all this, but Apple really did screw things up on this one. They really flung some mud into the eyes of some pro people who use Pro Apps on a regular basis.
 
??? And this has to do with sharing APIs how?

I was just clarifying why I felt that your hardware-based analogy was inappropriate, since it seemed unclear to you.

I'm not defending Apple on the merits of their handling of this release. My original post was simply offering up my understanding of the state of the APIs for FCP X.
 
Only Apple knows Apple's plans, but from what I've read from people who know more about programming than I do, FCP X doesn't show signs of having the ability to get a baseband video signal into or out of it. These same people are also taking the fact that Matrox and AJA (AJA has a very close relationship with Apple) have both released pressers talking about the 'new way' their products interact with FCP X (which is basically to say that they don't) as indirect confirmation to this end.

Who knows, maybe when Lion comes out we'll get a pleasant surprise.


Lethal

I went and dug into what you were talking about here, and right now that seems to be true. It seems that Apple is really trying to deal with color accuracy via ColorSync, and not relying on monitors with embedded color spaces. This actually seems kind of cool to me in theory, because a nice wide-gamut monitor like an Eizo or NEC, which can display damn near 100% of the Adobe RGB (1998) color space, could have multiple profiles written for it so that an editor or colorist could change color spaces on the fly depending on what the final output of their current project will be. Much like 'soft proofing' an image for print in Photoshop.
 
I was just clarifying why I felt that your hardware-based analogy was inappropriate, since it seemed unclear to you.

I did not write a hardware-based analogy. I mentioned Intel because Apple needs to get specs early from Intel in (roughly) the same way that FCP plug-in writers need to get them early from Apple.

So no, I do not think that I misunderstood my own statements. Thank you very much for your concern.
 
I did not write a hardware-based analogy. I mentioned Intel because Apple needs to get specs early from Intel in (roughly) the same way that FCP plug-in writers need to get them early from Apple.

So no, I do not think that I misunderstood my own statements. Thank you very much for your concern.

I'm sorry, you mentioned Intel chips... I must have mistaken that for a hardware reference. My mistake.

And I don't see how the plug-in writers 'need' them early from Apple. If Apple was concerned with having a host of plug-ins ready by or near launch, then they would have released them sooner. I think Apple wants the plug-ins that get written for FCP X to be ported to the new technologies in Lion, and with FCP X's release being pre-Lion, it wouldn't have done anyone any good if plug-in writers were all sitting on software that wouldn't operate properly on the current, albeit outgoing, OS.

That being my hypothesis, I'm not too sure why they didn't just wait for Lion's release. I don't disagree with the idea of releasing the APIs early, but it seems to me that the best way for that to have happened would have been for Apple to release the APIs instead of launching the software itself. Then they could have released FCP in tandem with Lion, all while 3rd-party developers would have had a few weeks' head start.

Seeing as that's not how it happened, though, an early release of the APIs really wouldn't have helped any users at the release time they chose, because there would be no real point for developers to write plug-ins for an outgoing OS just to have to rewrite them for the new one.
 
I went and dug into what you were talking about here, and right now that seems to be true. It seems that Apple is really trying to deal with color accuracy via ColorSync, and not relying on monitors with embedded color spaces. This actually seems kind of cool to me in theory, because a nice wide-gamut monitor like an Eizo or NEC, which can display damn near 100% of the Adobe RGB (1998) color space, could have multiple profiles written for it so that an editor or colorist could change color spaces on the fly depending on what the final output of their current project will be. Much like 'soft proofing' an image for print in Photoshop.

You can do that already with higher-end displays like the NEC PA-271W.

You can create multiple profiles and switch between them depending on your use, e.g. from photography to web design.

I mean, you can do it in the USA. European models do not support hardware calibration.
 
I went and dug into what you were talking about here, and right now that seems to be true. It seems that Apple is really trying to deal with color accuracy via ColorSync, and not relying on monitors with embedded color spaces. This actually seems kind of cool to me in theory, because a nice wide-gamut monitor like an Eizo or NEC, which can display damn near 100% of the Adobe RGB (1998) color space, could have multiple profiles written for it so that an editor or colorist could change color spaces on the fly depending on what the final output of their current project will be. Much like 'soft proofing' an image for print in Photoshop.
In April when I heard that things were going to be managed by ColorSync I was happy, because right now if you open a video in QT 7, FCP 7, DVD SP, and Color you could very likely get four different variations of the same clip, because there is no global uniformity to how video gets displayed. Unfortunately, color accuracy is only one part of the process.

Computer display systems (GFX card + monitor) are designed to handle progressive images at whole frame rates (e.g. 60.00Hz), but a broadcast video signal can be interlaced and almost always runs at fractional frame rates (23.976, 29.97, 59.94), at least in NTSC/60Hz countries. In PAL/50Hz countries the video signals run at whole frame rates (25.00 or 50.00 fps). So even if the color space is right, what about proper handling of the frame rate and interlacing? For what it's worth, currently both AJA and Matrox describe the video signal from FCP X, which is basically running in extended desktop mode, as preview quality only.

Due to hardware and software limitations it is nearly impossible to get an accurate, broadcast-quality signal from a computer's GFX card, or to use a computer display as a broadcast-accurate monitor. The only exceptions I know about are the Matrox MXO, which can be hooked up to certain GFX cards and successfully "extract" a broadcast-quality stream (as well as be used with a 23" ACD as a low-budget broadcast monitor), and the HP DreamColor monitor, which, using very precise methods, can be used as a broadcast-accurate monitor.


Oh, and in a somewhat related statement, Apple had to have released some APIs early, because a few companies, like AJA and Automatic Duck, released FCP X products the same day it came out.


Lethal
 
You can do that already with higher-end displays like the NEC PA-271W.

You can create multiple profiles and switch between them depending on your use, e.g. from photography to web design.

I mean, you can do it in the USA. European models do not support hardware calibration.

Well, the beauty of ColorSync is software calibration. With something as simple as a Spyder or ColorMunki, any display can be profiled, and separate profiles can be built for all kinds of color spaces, whether NTSC or PAL or even sRGB for products designed for internet distribution. It could really open the doors for editors and colorists of all walks to have a very capable solution for a very reasonable amount of money.
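
For the curious, here is a minimal sketch of the kind of colorimetric conversion ColorSync performs under the hood, written against the CoreGraphics C API. The stand-in target space and the color values are mine, and I'm assuming an OS X version where CGColorCreateCopyByMatchingToColorSpace is available; a real workflow would load the ICC profile your Spyder/ColorMunki actually measured:

    /* Build and run: clang proof.c -framework CoreGraphics */
    #include <CoreGraphics/CoreGraphics.h>
    #include <stdio.h>

    int main(void) {
        /* Source space: sRGB. Target: a built-in space standing in
           for a measured monitor/output profile. */
        CGColorSpaceRef srgb   = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
        CGColorSpaceRef target = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGBLinear);

        CGFloat comps[] = { 1.0, 0.2, 0.1, 1.0 };   /* a saturated red, RGBA */
        CGColorRef src = CGColorCreate(srgb, comps);

        /* The "soft proof": re-render the color in the target space. */
        CGColorRef proofed = CGColorCreateCopyByMatchingToColorSpace(
            target, kCGRenderingIntentRelativeColorimetric, src, NULL);

        if (proofed) {
            const CGFloat *out = CGColorGetComponents(proofed);
            for (size_t i = 0; i < CGColorGetNumberOfComponents(proofed); i++)
                printf("%.4f ", (double)out[i]);
            printf("\n");
            CGColorRelease(proofed);
        }
        CGColorRelease(src);
        CGColorSpaceRelease(target);
        CGColorSpaceRelease(srgb);
        return 0;
    }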

Now, all of this is really nice in theory, but it'll be up to the actual execution to make it worthwhile. That, and this is all just that: theory.
 
So even if the color space is right, what about proper handling of the frame rate and interlacing?

My guess on interlacing is that they probably see it as a thing of the past. Whether or not they jumped the gun on that is iffy, but with digital broadcast standards it seems like it should be phasing out before too terribly long.

As to the frame rate issue, I am at a loss. The only thing that seems viable currently with X would be to export at the target frame rate (and field dominance, for that matter) and test it in a player that can output the broadcast signal... That is definitely a stop-gap, and an unfortunate one at that, for those who must deal with this issue.

Though many new TVs can display a variety of frame rates within their region standards, it might be getting to be time for the fractionals to fall off too. Again, there is no telling when that might happen, but it seems a reasonable thing to foresee. Thoughts?

Oh, and in a somewhat related statement, Apple had to have released some APIs early, because a few companies, like AJA and Automatic Duck, released FCP X products the same day it came out.

Excellent point here too, though it would seem that the release was of a very limited nature. I really want some competition on the XML side of things, as Automatic Duck seems a bit pricey. I don't mind paying out the nose for something, so long as 'out the nose' is a fair/reasonable price. I have big hopes for new flexibility with XML-based output, especially on the audio side (which I've laid out my thoughts on in an earlier post).
 
I believe the lack of communication, from the creation of FCPX through now... IS the communication. Translation: Apple is done with the pro market.

They are and they aren't. They made it clear a few years back that they aren't going to make the professional world their touchstone for all decisions. They are going to make consumer/prosumer products that can be used by pros: sometimes just as they are, sometimes with the help of plug-ins, etc.
 
Well, the beauty of ColorSync is software calibration. With something as simple as a Spyder or ColorMunki, any display can be profiled, and separate profiles can be built for all kinds of color spaces, whether NTSC or PAL or even sRGB for products designed for internet distribution. It could really open the doors for editors and colorists of all walks to have a very capable solution for a very reasonable amount of money.

Now, all of this is really nice in theory, but it'll be up to the actual execution to make it worthwhile. That, and this is all just that: theory.

Of course it's a great thing. (It should have been done much earlier.)

I was just pointing out that good color management is also possible today.
 
My guess on interlacing is that they probably see it as a thing of the past. Whether or not they jumped the gun on that is iffy, but with digital broadcast standards it seems like it should be phasing out before too terribly long.

As to the frame rate issue, I am at a loss. The only thing that seems viable currently with X would be to export at the target frame rate (and field dominance, for that matter) and test it in a player that can output the broadcast signal... That is definitely a stop-gap, and an unfortunate one at that, for those who must deal with this issue.

Though many new TVs can display a variety of frame rates within their region standards, it might be getting to be time for the fractionals to fall off too. Again, there is no telling when that might happen, but it seems a reasonable thing to foresee. Thoughts?
Interlacing is very much alive and well in digital broadcasting (channels are typically either 720p60 or 1080i60), and I don't see the fractional frame rates going away anytime soon either. Originally the fractional frame rate came about with the switchover from B&W to color in the US, and I assume it's still in use today for legacy reasons. There is a lot of old, expensive gear (like communication satellites) that can't readily be upgraded to work with new technology, so the new technology has to be made to work with the old.
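
To put numbers on the legacy bit: the NTSC "fractional" rates are exact rationals, nominal * 1000/1001, and that 0.1% slowdown is also exactly why drop-frame timecode exists. A quick sanity check in plain C:

    #include <stdio.h>

    int main(void) {
        /* NTSC fractional frame rates are exactly nominal * 1000/1001. */
        const int nominal[] = { 24, 30, 60 };
        for (int i = 0; i < 3; i++)
            printf("%2d fps nominal -> %.3f fps actual\n",
                   nominal[i], nominal[i] * 1000.0 / 1001.0);

        /* Why drop-frame timecode exists: a non-drop counter labels
           30 * 3600 = 108000 frames as "one hour", but at 29.97 fps
           those frames take 108000 * 1001/30000 seconds to play out. */
        double wallclock = 108000.0 * 1001.0 / 30000.0;
        printf("108000 frames at 29.97 fps = %.1f s (%.1f s over an hour)\n",
               wallclock, wallclock - 3600.0);
        return 0;
    }

That works out to 3.6 seconds of drift per hour, which is precisely the 108 frame numbers per hour that drop-frame timecode skips.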


Lethal
 
Agree. A modular approach is more modern than an all-in-one solution.

Especially for software with a thriving plug-in community, just like Photoshop, WordPress...

Yep. Apple's about making money, same as the studios that haven't made any statements bitching about the new software. So creating something that at its core can be used by prosumers, but can also be enhanced by plug-ins etc. to be used by the big boys, is the way to make that money.
 
I know that there is a lot of media hype in all this, but Apple really did screw things up on this one. They really flung some mud into the eyes of some pro people who use Pro Apps on a regular basis.

A lot of folks have been saying that, but we don't really have all the facts. Such as the total number of copies bought, and how many folks were buying intending to use it for actual editing right away versus buying it to learn the interface and be ready to use it in 4-5 months. How many negative opinions there are versus positive out of the whole buying group. And what those folks actually do: it's a lot different if there are 1,000 negative reviews out of 10k buys versus 1k out of 1 million buys. And if it turns out that 900 of those negatives are folks who have never actually done even prosumer editing, how much weight should we give their opinions? And so on.
 