Minimum Used Mac Pro/Macbook for editing?

Discussion in 'Digital Video' started by jtara, Sep 28, 2011.

  1. jtara macrumors 65816

    Mar 23, 2009
    What is the minimum model of used Intel Mac Pro or Macbook that you would consider adequate for editing?

    I have a friend who is a film student. He has a G5 Mac Pro (which is basically worthless, right?) and an old, old copy of Adobe Premiere. He should be able to get the latest Premiere at a really good price since he is a student.

    I assume the G5 and old Premiere is going to be useless for HD editing.

    He's on a budget, so it makes sense for him to buy used.

    What models should he consider, and what prices might he expect to pay?

    I'm not too concerned about RAM/hard disk as these are inexpensive third-party upgrades.

    I'm a software engineer with a lot of hardware background, so I can help him out with "creative" solutions. Are Hackintoshes dead? Are there issues with installing Adobe software on one? I mean, geez, I could put together a pretty decent system with a high-end workstation motherboard, and I've got a lot of spare parts that could be used - e.g. rackmount case, power supply, etc.
  2. MisterMe macrumors G4


    Jul 17, 2002
    An awful lot of video has been edited on G5 and lesser Macintoshes. While those old machines cannot be used to edit HD video, for lesser video they are hardly worthless.

    As for Hackintoshes, they are not just one thing. A Hackintosh is a non-Apple Intel-based computer running MacOS X. They run the gamut from netbooks to workstation-class desktop computers. Their ability to edit video depends on their power.
  3. cgbier macrumors 6502a

    Jun 6, 2011
    The problem with Hackintoshes is that you only have a limited range of hardware (mostly motherboards and GPUs) you can use. I have been toying with the thought of building one myself, but the more I researched, the more I figured out that it can be a big PITA - even with supported hardware.

    What kind of HD is your buddy doing? I have successfully edited HDV on my old C2D MacBook (FCE) and a G5 (FCS). The editing wasn't so much a problem, it was the rendering that went like molasses.
    Right now I'm using a 2010 MBP with FCS2 and FCP X. No problem at all. Not sure if I wanted to go with a Mac Pro anymore since Thunderbolt exists.

    If he has to work with AVCHD, he can't use the G5. He'll need an Intel for it.
  4. Sweetfeld28 macrumors 65816


    Feb 10, 2003
    Buckeye Country, O-H
    I used G5s in college for video production; they were great for what they were at the time.

    However, most newer software, e.g. Final Cut Pro X, requires a 64-bit processor. If you click the link to the requirements for FCP X in the App Store, it takes you to the system requirements page.

    Personally, this is why I sold my quad G5 for my current MP. Too many of the apps I needed for work were rapidly becoming outdated as developers dropped support for the PowerPC G5s in favor of the newer Intel processors.

    Another reason I jumped was the video cards: some software (Flash, Photoshop, FCP, and others) can take advantage of the GPU to accelerate rendering.

  5. alust2013 macrumors 601


    Feb 6, 2010
    On the fence
    The G5 quad is probably still fairly useful, but otherwise, you definitely need to look into Intel. I'm not sure what the budget is, but a mini wouldn't be a terrible idea. They start at $599 new, so you can find used ones pretty cheap. Any of the Sandy Bridge models actually has more power than the original Mac Pro's quad Xeons. It won't be the fastest for rendering, but an Intel mini (at least a Core 2 Duo) can cope with it.
  6. CaptainChunk, Sep 28, 2011
    Last edited: Sep 29, 2011

    CaptainChunk macrumors 68020


    Apr 16, 2008
    Phoenix, AZ
    Well, that isn't entirely true. Codecs like DVCPRO-HD cut very smoothly on G5s. Even HDV video isn't a huge chore for a G5. But AVCHD, RED, Alexa, etc. footage? Yes, out of the question.


    I think you may have to ask your buddy some probing questions as to what type of footage he'll be working with often (he may not know the answer to that yet - he should check with his school for that) and how deep he intends on getting into editing in general. Also, it would be a good idea to know what platform(s) the school is editing on. FCP (prior to Version X) and Avid MC are the most prominent.

    And BTW, just to make a minor correction in nomenclature, the "Mac Pro" designation refers to the Intel-based Mac towers, introduced in 2006. G3/G4/G5 machines are PowerPC-based and are called "Power Macs". Though the G5 towers are dressed in a similar "cheese grater" aluminum box, the Mac Pros are radically different beasts, hence the name distinction. ;)

    Many film schools award degrees (or perhaps minors) in several different specializations (e.g. production, screenwriting, editing, etc.). So say for example, he was a screenwriting student. While he may have to take an editing class or two as part of his curriculum, he's probably not going to be too concerned with editing the 2-hour epic with advanced compositing and special effects. When I was in film school (it was a while ago...hah), we still shot our student projects on 16mm film and our footage got telecined to DV tape for editing. You can do that stuff on G5s, G4s and even on G3s. For students that fit this category, I'd say there's absolutely nothing wrong with a 15" MacBook Pro built in the last couple of years. Even the older Core 2 models are reasonably powerful and you can take them wherever you go.

    If your friend thinks post production may be his calling, then yes, I'd consider more powerful options, like a Mac Pro or a quad-core iMac. And if he's learning, I wouldn't necessarily recommend buying new (unless he's really adamant about it). For instance, an older Mac Pro in good condition would likely get him through college and save him some considerable cash if he's not all too worried about having the latest and greatest.

    Or he may go the laptop (MBP) route and use lab computers for the heavy lifting. Practically all good film schools have fairly recent machines in their editing labs.

    Personally, I'm not a huge fan of the Mac Mini, but the more recent models can be good alternatives, too.

    To answer your question about Hackintoshes... No, these wouldn't necessarily be "dead" and out of the question (I know several that went this route), though there are some things you'd seriously want to consider:

    One would be that you'll be limited to using hardware that is similar to what gets put into other Macs (i.e. Intel CPUs, graphics cards that are supported by the operating system, etc.). This does require a fair bit of research to be sure you're not buying certain components OS X won't be able to see and use. For the most part, device drivers are part of the OS and cannot be obtained via separate downloads. Secondly, on Hackintoshes, things can break with OS updates. For example, Apple may decide to remove a driver from the OS you need, or maybe update a graphics driver to one that doesn't play nice with your aftermarket card.

    So with a Hackintosh, you have to determine whether these drawbacks will seriously impact your friend's (or your own, since you just involuntarily became his tech support) ability to finish his school work reliably. Granted, this may be a non-issue for a tech-savvy person. But if there's any doubt, save the headaches and get a real Mac.
  7. jtara, Sep 29, 2011
    Last edited: Sep 29, 2011

    jtara thread starter macrumors 65816

    Mar 23, 2009
    Thanks for all the answers!

    Currently, my friend is in community college, just graduating. He wants to go on to film school, and will probably wind up going to L.A. He definitely wants to be able to do stuff on his own, outside of any school lab.

    It sounds to me like the current Mac Mini with an i7 would give him the most bang for the buck. Maybe it makes sense to install editing software on my 2008 aluminum Macbook and see how it goes. If it's acceptable, he could get a used Mini with a Core2 Duo. Otherwise, go with the current model.

    He does have a particular interest in learning stuff like After Effects. So, that might push him into a Pro.

    (Off the subject, he seems really hung-up on classroom training/certification/whatever on specific software packages. Is that really important? Sounds too much like what I consider largely-worthless IT certification training... I'm urging him to go for a more general education that will emphasize theory rather than current tools that will be obsolete in a few years. But different people have different learning styles. I'm a "just do it" kinda guy when it comes to specific tools.)

    No, he doesn't know yet what format he'll be shooting. I know he wants to get a low-end professional camera, and I know too much money is going to go into the camera, because he has a professor who's really pushing large-format sensors, and that's emerging technology that is going to cost. If he goes that route, then it's probably going to be an edit-friendly format.

    That is, unless he goes with a DSLR as an inexpensive stop-gap measure. (He's really got the small depth-of-field bug...) Then it's probably going to be a bitch to edit.

    BTW, when I asked if Hackintosh was dead, I didn't mean to characterize "hackintosh" as a single product. I was just wondering if the whole notion of Mac clones is dead, has Apple made things even more difficult or impossible, etc. I did a little research, and my conclusion is that it takes a lot of research and fiddling, as different hardware requires different solutions. I think I can throw out that option.
  8. CaptainChunk, Sep 29, 2011
    Last edited: Sep 29, 2011

    CaptainChunk macrumors 68020


    Apr 16, 2008
    Phoenix, AZ
    I can understand that because I was kind of the same way when I was in college. At the time, I used a 15" PowerBook G4 (Titanium) and edited on both Avid and FCP (I had to learn both in college). A laptop was the best all-around option for me because I couldn't afford to have both. But there were times when I needed the horsepower of the hardware-enhanced Avid workstations, so I would use the lab in those situations. But times have changed a bit and laptops have gotten a lot more powerful, so this may not be an issue anymore.

    And regarding LA (and any major film market, for that matter), make sure that your friend knows that it isn't easy to establish yourself. You'll do a lot of interning and eat a lot of ramen starting out. And there are no guarantees. There are over 50,000 people in post production in LA alone. You're competing with ALL of that. But don't think I'm trying to scare people away. It's a fun (but often stressful) career and I couldn't imagine doing anything else. :)

    What I will say about the new version of After Effects (CS5/5.5) is that it needs a 64-bit CPU and OS to run. So in that case, you'd want to look at a newer Mac that meets that criteria. And though it's pretty scalable with different hardware configurations, it has a love affair with RAM (especially in HD and 4K projects). I have 16GB in my Mac Pro and AE would certainly use more if I had it.

    I agree wholeheartedly. There are great editors all over the world who lack software certifications. Not being "certified" in Final Cut Pro certainly doesn't prevent competent editors from getting work. It hasn't prevented me... In major film markets (LA, NYC, Chicago, Toronto), employers put a lot more emphasis on quality of work and efficiency than anything else. Nobody even cares where you went to school. They care about what you can do. To give an example, Quentin Tarantino was a high school drop-out. James Cameron dropped out of college and drove trucks before getting his big break.

    I would assert that software certifications matter a lot more for educators. All a certification tells someone is that "yeah, he/she can push the buttons and make things work". But filmmaking in general is really about storytelling, not just technical prowess.

    By "large format", his professor is probably referring to DSLR. There are pros and cons to shooting DSLR and personally, I only use them for shooting interviews for a number of technical reasons. There were a number of cameras introduced (like the Panasonic AF-100) that use 4/3" sensors and combine the good qualities of DSLR with the good qualities of a proper video camera. But it would be important for your friend to budget accordingly. For example, the AF-100 costs $5,000 without lenses. It records in AVCHD/AVCCAM, which means he'll need a more powerful machine to edit its footage efficiently. A good story can be shot on an older pro camera, like the Panasonic DVX100. It's DV, but has great color reproduction, can be found used pretty cheap (under $1,000 in some cases) and DV can be cut by practically any computer made in the last decade. Just a consideration...

    Any good instructor will know that story comes first, over everything else. Student films shot on "lowly" formats such as DV get accepted to the large festivals all the time and some even win awards. This is not to say that you shouldn't make every attempt to make your project look and sound its best, but you get the idea. You don't need the latest/greatest gear to win over an audience.

    It seems that EVERY new film student has the DoF bug. To me, it's the most overused shooting technique on the planet. There are surely times when it "fits" to use it, but when it's all the time, it's cheesy. The biggest problem you'll deal with in shallow DoF shooting situations is not necessarily during editing, but during production. On a professional camera rig, the operator typically has a 1st assistant working as his focus puller. It's significantly harder to keep moving subjects in focus using shallow DoF.

    Well, the thing is that there's always going to be some smart guy somewhere that will figure out how to make things work on a Hackintosh again. This is sort of akin to open source software development, only this time you're dealing with hardware. But the same principles apply. Hobbyists are typically willing to deal with things that break and actually enjoy tinkering with stuff anyway. But people who rely on tools to get their work done day in and day out probably won't have that same level of tolerance. I'm a pretty tech-savvy person and I could build a Hackintosh if I REALLY wanted to, but because I run a video business (that keeps the lights on and my stomach full), I can't afford the downtime should something go wrong. In many cases, I'd lose more money than I actually saved by building that Hackintosh in the first place.
  9. jtara thread starter macrumors 65816

    Mar 23, 2009
    I clarified that. I was wrong. He isn't concerned about the piece of paper. It's just that he doesn't feel he learns well from a book or manual. I asked him if he found free resources - videos, recorded lectures, etc. on the Internet would that be OK? He said yes - but he needs somebody to show him - to walk him through the steps. He doesn't feel he can get it from a book.

    No, and yes.

    Actually, he's referring to recently-announced products (AF100, and I think there's a Sony too). I believe there are a couple in the $5000 category and there are rumors of a Canon announcement coming next month. But he also was advocating DSLR if he can't afford that.

    I did agree that it made sense to get a DSLR with a large-format sensor anyway (i.e. in addition to a video camera). It would be good for developing an innate sense of the relationships between aperture, shutter, DOF, etc. I got that in high school with a Leica. But now the public shoots with cameras that have infinite DOF and automatic everything, and you just don't develop that sense.
  10. CaptainChunk macrumors 68020


    Apr 16, 2008
    Phoenix, AZ
    If your friend is more of a visual learner, he might find some value in registering with an online training website; these sites offer classes/seminars for many popular creative and business applications. And if he's going to film school, there's also a pretty good chance that he would have to take a basic editing class anyway, where he would learn the ins and outs of a non-linear editing package.

    For a lot of these reasons, I still advocate learning to shoot movies on film prior to doing video. Prosumer and professional video cameras will often have full auto modes that you really never want to get in the habit of using all the time. A lot of good film programs still teach how to shoot on old-fashioned film (often on 16mm, given 35mm's prohibitive cost for most students).

    Now regarding DSLR bodies, I'll elaborate a bit more on why I don't really like using them for video in most cases:

    1. They use big sensors. This statement might puzzle you, but I'll explain. Even the smallest of DSLR imagers have a significantly larger frame than the industry-standard Super35 found on high-end motion picture cameras. This is great for shooting high-res photos with tons of detail, but not so great when you learn about how these imagers have to work when shooting 1080p video. A 1920x1080 frame has about 2 million pixels. A still photo taken with that same sensor in its native mode may have over 12 million pixels. This is where "line-skipping" comes in. In video mode, the camera will tell the imager to "skip" an appropriate number of lines so that it's recording the 2 million pixels present in HD video instead of the 12 million the still photo records. This process has a couple of nasty side effects: rolling shutter (the "jello" effect), where objects will appear to "wobble" during fast pans; and moire artifacts when shooting subjects (or even other objects in the frame) that have lots of fine detail. Several examples of these phenomena can be found online.

    2. DSLRs are designed to be operated like, well, still cameras. So, it's easy to see how they can be potentially cumbersome when being used as video cameras. A lot of manufacturers have released rail kits, follow focuses, and similar accessories over the last couple of years to compensate for this. But these are all additional costs many don't think of when they expect to use a DSLR as a video camera replacement.

    3. Many of the less-expensive DSLRs use cropped (not full-frame) sensors, which makes lens shopping a little more difficult. While these cameras can use any lens available in their system, the more abundant full-frame lenses will have a "crop factor" adjustment to consider, where a 100mm lens designed for full-frame will behave like a 160mm lens on a cropped-frame camera whose crop factor is 1.6.

    4. The built-in mics on DSLR cameras are useless. While many of these cameras do have mic inputs, the actual recording quality is only so-so, forcing many to use external recorders instead and sync to picture in post.
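    The arithmetic behind points 1 and 3 can be sketched in a few lines of Python. The 12-megapixel sensor and 1.6x crop factor here are just the illustrative numbers from the list above, not any particular camera's specs:

    ```python
    # Back-of-the-envelope math for DSLR video: how oversized the sensor is
    # relative to a 1080p frame (why line-skipping happens), and what a
    # full-frame lens "becomes" on a cropped sensor. Example figures only.

    def video_pixels(width=1920, height=1080):
        """Pixels actually needed for one 1080p video frame."""
        return width * height

    def sensor_to_video_ratio(sensor_megapixels, width=1920, height=1080):
        """Roughly how many sensor pixels exist per recorded video pixel."""
        return (sensor_megapixels * 1_000_000) / video_pixels(width, height)

    def effective_focal_length(focal_mm, crop_factor):
        """Full-frame-equivalent focal length on a cropped sensor."""
        return focal_mm * crop_factor

    print(video_pixels())                          # 2073600 -- "about 2 million"
    print(round(sensor_to_video_ratio(12), 1))     # 5.8 sensor pixels per video pixel
    print(f"{effective_focal_length(100, 1.6):g} mm")  # 160 mm, per the example above
    ```

    So a 12 MP sensor has nearly six times the pixels a 1080p frame needs, which is the surplus the camera discards by skipping lines.
    
    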

    Now, none of this is to say that I wouldn't give DSLR credit where it's due. It is possible with a bit of work and planning to get quality footage out of a DSLR that rivals video cameras that cost significantly more.

    Cameras like the AF-100 apply what manufacturers have learned from DSLRs to a traditional video camera. The AF-100 in particular uses a 4/3" imager that is designed for nothing but HD video. It has the proper 1920x1080 pixel arrangement (no line-skipping necessary), meaning that it doesn't suffer from the visual anomalies apparent in DSLR video nearly as much. Additionally, it has the proper features (like XLR audio inputs) expected in a professional video camera.

    But I do think it's good that you're suggesting he own a DSLR for stills (for learning basic photography principles that even apply to video) and perhaps a separate video camera. I guess it'll really come down to how much he'll realistically get to spend on these items.
