
pakeha

macrumors newbie
Aug 8, 2009
9
0
New Zealand
Perhaps the reason for the delay in releasing new Mac Pros is display related. It's a better than 50% chance that ATI's 5870 Eyefinity GPU will be an option, with a lower-end 5xxx card standard (one that has a max power draw that comes in under the PCI slot's 75W limit).
If they are releasing a new thin-bezel monitor (for multiple-display setups) that uses a 2560x1440 panel, then there may be QA issues similar to what happened with the 27" iMacs.
An update to the display lineup must surely be due by now.
 

jmull

macrumors regular
Sep 16, 2009
190
0
Not happening

I just pray for the day that OS X will run on any PC without hacks or Hackintoshes. Imagine the possibilities.

If Mac OSX ran on any hardware, why buy expensive hardware from Apple? I'm surprised it runs on some Netbooks.
 

PeterQVenkman

macrumors 68020
Mar 4, 2005
2,023
0
If Mac OSX ran on any hardware, why buy expensive hardware from Apple? I'm surprised it runs on some Netbooks.

Perhaps by then Apple will be a gadget company that sells iPods, iPads, iPhones, and... an operating system? Even I think I'm reaching there.

Oh well, as long as I'm dreaming, I'd like a couple million dollars...

;)
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,900
You are mistakenly assuming everyone runs Cadence or Avanti. Hell, even at AMD we ran mostly in-house software for everything except routing and spice.

Avanti --> Synopsys at this point. If folks are running around with Unix command-line simulators and/or X-Windows GUI tools (like the Cadence employees bolting to Avanti with IP, or more legal moves), it is easier/cheaper to port those to Mac OS X than it would be to Windows, presuming you have source code dragging along from start-up to start-up. Just keep it X-Windows for what you already have. You don't really need Aqua eye candy if it just works and you're a broke start-up. If you're getting good answers out of the simulator, that is good enough. Likewise, having X-Windows come standard makes it nice to Xterm to a Unix box (with single-fixed-license-key software) too.


You could port to Linux, but if you're a small company and folks need to do MS Office docs in addition to the real work, the Mac is a better-supported non-custom software environment than Linux/Unix would be.


A port of the original SPICE codeline is on the Mac: http://www.macspice.com/
 

deconstruct60

macrumors G5
Mar 10, 2009
12,309
3,900
Perhaps the reason for the delay in releasing new Mac Pros is display related. It's a better than 50% chance that ATI's 5870 Eyefinity GPU will be an option,

The standard 4870 is a 150W draw whereas the 5870 is a 188W one.

http://en.wikipedia.org/wiki/Compar...ocessing_Units#Evergreen_.28HD_5xxx.29_series

You can't put two 5970s in the box if the max wattage across the PCI slots stays at 300W (http://www.apple.com/macpro/specs.html).

Two 4870s max out the expansion slots. Seems more likely to be something closer to a 5770 (at 108W; better than a 4870, and released a while ago, so the development could have been done by this point, since a special card is needed for Mac Pros and EFI booting). With two of those, you could still conceptually put a modest 50-60W card (e.g., storage, etc.) in there as well.
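A quick sanity check on that power math, using the wattage figures quoted in this thread and Apple's published 300W combined PCIe budget (the card wattages are rated maximums, so treat this as a rough sketch):

```python
# Rough power-budget check for the Mac Pro's PCI Express slots.
# The 300 W combined budget is from Apple's spec page; card wattages
# are the rated maximums quoted in this thread (real draw varies).
PCIE_BUDGET_W = 300
CARD_TDP_W = {"HD 4870": 150, "HD 5870": 188, "HD 5770": 108}

def fits(cards, budget=PCIE_BUDGET_W):
    """Return (total draw, whether the combo fits the slot budget)."""
    total = sum(CARD_TDP_W[c] for c in cards)
    return total, total <= budget

print(fits(["HD 4870", "HD 4870"]))  # (300, True)  - exactly maxes the budget
print(fits(["HD 5870", "HD 5870"]))  # (376, False) - two 5870s blow the budget
print(fits(["HD 5770", "HD 5770"]))  # (216, True)  - leaves ~84 W of headroom
```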


with a lower-end 5xxx card standard (one that has a max power draw that comes in under the PCI slot's 75W limit)

The stuff that ATI/AMD released in the last couple of months (Jan - March) is too soon. All development for those would be focused on the substantially larger Windows/Linux market. The EFI-boot cards come after the regular cards have launched.



If they are releasing a new thin-bezel monitor (for multiple-display setups) that uses a 2560x1440 panel, then there may be QA issues similar to what happened with the 27" iMacs.

The canonical Apple display setup is one of the "built in" models plus the 24" Cinema Display. Doubt the bezel on one of those is going to get any thinner with just a new Cinema Display release.



An update to the display lineup must surely be due by now.

It has been 3 years since Apple even 'speed bumped' the display stats, let alone done a new design. The only real change is that the 20" and 23" got dropped last year. ( https://www.macrumors.com/2009/02/19/apple-discontinues-20-cinema-display-product-refreshes-coming/ ) They had already effectively been replaced in the lineup by the 24" offering.


The Mac Pro doesn't require an Apple display to work. There is no sense in jamming up its release with a product it doesn't require. (Indeed, for folks swapping out an old Mac Pro or equivalent (3, 4, 5+ years old), many likely already have a display they're going to drag into the future, or have already replaced it in the last couple of years.)


Surprised just how much of a "new Xeons, new Mac Pro?" thread gets consumed with "a MacBook and/or display has gotta be coming" comments. The Xeon part is such a critical component of those products (not).

It would make far more sense if the Mac Pro and Xserve updates were hooked together. Both would share the updated Xeons as a component. The Xserves were updated in April 2009. Late March / early April would make it about a year for both.
 

MorphingDragon

macrumors 603
Mar 27, 2009
5,160
6
The World Inbetween
The stuff that ATI/AMD released in the last couple of months (Jan - March) is too soon. All development for those would be focused on the substantially larger Windows/Linux market. The EFI-boot cards come after the regular cards have launched.

Technically no; EFI-capable graphics cards only take small modifications to the ROM.

Netkas has made card drivers/ROMs before within a couple of months. It wouldn't take Apple long with a development team several times his size.
 

pakeha

macrumors newbie
Aug 8, 2009
9
0
New Zealand
I mentioned the ATI Eyefinity card mainly because a picture of one I saw had 6 Mini DisplayPorts on it. All it would need is some drivers and possibly a PSU upgrade (more 6/8-pin GPU plugs), and the drivers would be similar to those for the standard-option graphics card.
Samsung is supposed to be working on thin-bezel monitors; does Apple deal with them at all?
The other reason was that I could see Apple boasting about the possibility of connecting up to 6 or maybe 12 monitors to one Mac.
To get things back on track, i.e. Xeon related: the press release didn't have much detail on the single-processor Xeons (W36xx). Only one seemed to be mentioned; all the rest were the dual-CPU models.
 

MorphingDragon

macrumors 603
Mar 27, 2009
5,160
6
The World Inbetween
[...] Samsung is supposed to be working on thin-bezel monitors; does Apple deal with them at all? [...]

Apple deals with LG/Philips mainly.
 

AskMeIfICare

macrumors member
I'm sorry I have to ask this question, but do you even have the ability to read for content? Did you even read the questions asked by the person who originally posted them? It is hard for me to grasp the idea that you have the ability to log on yet can't keep a line on the context of the discussion.

The question posed concerned the usability of twelve-core machines and was not directed specifically at professional users. In any event, I don't really give a damn about Logic; it is only a minor consideration in justifying multi-core machines. Frankly, the market for Logic is so small it would never justify keeping the Mac Pro line around. If you want multi-core machines from Apple, you really have to understand the larger market for the machines. If you don't, then the machines won't be around long.

In any event, the point of my response to the original poster is that multi-core machines do have uses that encompass everyone from the desktop user to the bio-engineer.


So? Look at it this way: the machine did work, didn't it? You did make some bread with it, didn't you? The simple fact is there has not been a computer made without bugs in it. Did it take Apple too long to fix it? Maybe, but I don't have the internal information to say positively. In the meantime they fixed a hell of a lot of more important bugs.

So yeah, it is not a perfect world, but at least Apple had the wisdom to address real security and reliability issues first. It is the only rational thing for them to do, because all of the whining about machines that were otherwise working is just that when you are working through a list of critical bugs.

I don't want to undersell the issue of bugs, because when they are causing you problems they basically suck. However, I do know that with each rev of Mac OS X that I've installed, my machine has gotten better, not worse.

It probably would make sense for me to reflect this question back to you. The reason is simple: your needs are so narrow and minor that Apple can't address them directly. There is no way that they can let the needs of a small minority of users dictate long-term plans for the Mac Pro.

As to why I'm on this thread: well, it is pretty simple really; it is to address the people making irrational demands for rapid Mac Pro releases and to provide balanced answers to questions being posed here.

In other words I'm hoping to provide a perspective that I hope is a little more grounded.


If I'm in the discussion, it concerns me. Further, the discussion can't concern you, because you don't seem to grasp the technology involved, nor the lack of value in making random knee-jerk updates to the Mac Pro. Just because Intel has released chips over the last few months that might plug into a Mac Pro doesn't imply that they are of value to Apple or that they even fit into Apple's Mac Pro plans.

It is very easy for people here to point to the latest ATI or Nvidia video card or the latest Intel CPU and then swear at Apple for not offering said chips immediately. But really, what good does that do? Even if Apple delivered said Intel improvement, would it offer a significant performance boost? In case you are wondering: it wouldn't.

Oh, dismiss my MacBook Pro if you want, but the comments were important to the question I was responding to. Besides, it isn't the only machine I have at home, just my favorite. At work there are a lot more, but they run Windows, so I avoided bringing them into the discussion. The point remains, though: the utility of multi-core processors depends upon what you are doing with them.


Dave


Davey boy, we need to have a little lesson here. (I'm a very busy person and I don't have the kind of time to play "tit for tat" that you seem to have.)

First and foremost, 11 months of overheating on the Mac Pro when ANY audio is played is unacceptable. 11 months to fix a problem that didn't exist in the MacPro3,1 and prior. This was a problem that didn't exist even on a Hackintosh with the same socket 1366 motherboard. "Security and reliability?" Don't you think that overheating is a serious reliability issue that should have been addressed immediately... not 11 MONTHS later? What were the other "security" issues? The way the cultish types around here strut, you'd think that a Mac could stop a nuclear warhead, let alone an intruder or virus. I realize that you don't care about Logic, but it IS the crown jewel of Apple software right now. Nearly every major producer is using it or contemplating switching from Pro Tools. It's a small niche, but take $500 per copy and multiply that by over 1,000,000 users and climbing; it isn't as "niche" as you might think in terms of market value. Onwards:

1) I understand the topic. The topic is about whether or not the 6 core processor is appropriate for the Mac Pro. That includes, but is not limited to, those users of said possible future Mac Pro and current models of the Mac Pro. I am one of those individuals.

2) The Mac Pro is important to Apple just as big-ticket, low-volume automobiles are to companies like Chrysler that buy into sports-car companies. There are others, but that's the most recent example. If you do not understand why this is or is not the case, then you are not in this class of buyer. Ironically, even when I first started out and had nothing, I still understood this concept even if nobody else I knew did.

3) The problem with your argument only BEGAN when you cited GarageBand. Honestly, I have no idea if it uses all of the cores or not. It probably does not, and it would not need to, since it's a toy app. I'm sure that a talented producer could use it, but they'd be foolish to do so when time constraints and deadlines become an issue. This is where 64-bit Logic and all of its monster plugins enter the picture (they are optimized for multi-core also). But once again you miss the mark entirely with your ramblings. You see, Dave, the difference between you and me is that I LOVE IT when I'm WRONG. Because then I can LEARN something new. But I'm not wrong in this case. Take your time to study Compressor and the many 3rd-party plugins that it uses. The biggest movie studios use these same apps even though they have proprietary software. Sony uses the same compressors that are used by podcast or YouTube producers. Said software uses ALL of the cores. It has to; you would be sitting there for DAYS if it didn't, even on a Mac Pro.

Compressor is only the beginning. Obviously there is Logic. Take your time and research and find out the rest. You might find it fairly interesting. The next major one will be Adobe CS5 and probably Final Cut Studio 4. About F'n time.

4) You speak of the lack of hardware and how it "doesn't exist". This is where your argument not only falls down, it crashes and burns. The new Xeon 5600 chips work on the SAME socket 1366 Intel board that the 2009 Mac Pro uses! NO RETOOLING REQUIRED. Did you know that another Intel-based motherboard company, EVGA, just released its EVGA Classified SR-2? It is designed for the 5600. If EVGA had it done before the end of 2009 and released it recently at an SRP of $599, then why can't Apple do the same?

So again, this is a case of what is and what is not. Check out the compressors. Look at the motherboard companies that have updated their socket 1366 motherboard BIOSes to be compatible with the Gulftown 6-core processors. If those chips worked in Hackintosh systems well before the official release, in the form of Engineering Samples, then what is stopping Apple? Nothing... unless they want the Hackintosh community to build the drivers for USB 3.0 hardware for them. That and SATA III support would be nice.

I'll agree with you on one thing. The non-use of higher thread counts is really getting on my nerves. Sometimes I get lazy and play with iWeb for websites I help with for my computer-illiterate friends who have business endeavors. That bloated piece of crap bogs down my Mac Pro more than Compressor in full render mode using all 16 virtual cores.

So again, I know exactly where you are coming from, but your blanket statement and our "whinings" are not covered by said statement. The fact that you've shown that you don't know anything about my industry, and its importance to Apple, is why I broke my usual silence on polarized forums such as these. So if you don't want to do your homework with respect to what I've offered you now, then there's no more to say.
 

MorphingDragon

macrumors 603
Mar 27, 2009
5,160
6
The World Inbetween
[...] 4) You speak of the lack of hardware and how it "doesn't exist". This is where your argument not only falls down, it crashes and burns. The new Xeon 5600 chips work on the SAME socket 1366 Intel board that the 2009 Mac Pro uses! NO RETOOLING REQUIRED. [...]

I think the only thing needed for EFI to support newer processors is updating the processor IDs in the ROM.
 

German

macrumors regular
Jul 3, 2007
198
0
As far as I know, OS X is not NUMA-aware.
-> Systems with two CPUs (Mac Pro and Xserve) are losing performance
(compared to Linux and Windows -> up to 30% of memory performance: http://www.heise.de/newsticker/meld...o-und-Xserve-weiterhin-mangelhaft-755713.html ).

"Non-Uniform Memory Access or Non-Uniform Memory Architecture (NUMA) is a computer memory design used in multiprocessors, where the memory access time depends on the memory location relative to a processor. Under NUMA, a processor can access its own local memory faster than non-local memory, that is, memory local to another processor or memory shared between processors."
http://en.wikipedia.org/wiki/Non-Uniform_Memory_Access


If the OS is not aware of hyperthreading, it could easily happen that a "virtual" CPU core (hyperthreading) is used instead of an unused physical CPU core.
Does OS X have something like SMT parking (e.g. Windows 7 uses SMT parking)?
No? Then it is more likely that you could be losing performance with hyperthreading.

The basic notion behind SMT parking is that the Windows scheduler will attempt to schedule threads so that all physical cores are occupied before any core gets two threads scheduled on its two front-ends (or logical cores). Since Hyper-Threading involves some cache partitioning and other forms of resource sharing, this is a potentially important feature. We've seen scheduler quirks cause poor and oddly unpredictable performance on Core i7 processors in the past.
http://209.85.129.132/search?q=cach...&hl=de&ct=clnk&gl=de&lr=lang_en&client=safari
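The placement policy that quote describes can be sketched as a toy model: give every physical core one thread before any hyperthreaded sibling gets used. This is purely illustrative (it assumes logical CPU i and i + n_phys share a physical core, which is a common but not universal enumeration), not how Windows or any real scheduler is implemented:

```python
# Toy "SMT parking" placement: fill all physical cores before
# scheduling onto their hyperthreaded (logical) siblings.
# Assumption: logical CPU i and i + n_phys share a physical core.
def place_threads(n_threads, n_phys):
    physical = list(range(n_phys))             # one slot per physical core
    siblings = [c + n_phys for c in physical]  # their HT logical siblings
    order = physical + siblings                # physical cores first
    return order[:n_threads]

# With 4 physical cores (8 logical CPUs):
print(place_threads(3, 4))  # [0, 1, 2] - the HT siblings stay "parked"
print(place_threads(6, 4))  # [0, 1, 2, 3, 4, 5] - siblings used only once cores fill
```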


Got my new MP 2.26 octo yesterday.
Standard 6 gig ram

OS X 10.6.2, Logic 9.0.2
Presonus Firestudio Lightpipe

51 tracks with HT ON
71 tracks with HT OFF


My MBP 2007 2.2 with 4 gigs of RAM will play 16-18 tracks depending on its mood at the moment. Sometimes it will go no further than 12.

I have the dual quad 2.93 Nehalem Mac Pro, and I got 97 tracks with hyperthreading turned off. But when I had hyperthreading on, I got just 68 tracks. A really big difference of 29 tracks!
http://www.gearspace.com/board/music-computers/371545-logic-pro-multicore-benchmarktest-10.html
 

MorphingDragon

macrumors 603
Mar 27, 2009
5,160
6
The World Inbetween
As far as I know, OS X is not NUMA-aware.
-> Systems with two CPUs (Mac Pro and Xserve) are losing performance
(compared to Linux and Windows -> up to 30% of memory performance). [...]

If the OS is not aware of hyperthreading, it could easily happen that a "virtual" CPU core (hyperthreading) is used instead of an unused physical CPU core. [...]

NUMA is a memory controller technology. Mac OS X doesn't need to be aware of anything.

A lot of programs lose performance with hyperthreading, depending on how the program was coded.
 

cmaier

Suspended
Jul 25, 2007
25,405
33,471
California
NUMA is a memory controller technology. Mac OS X doesn't need to be aware of anything.

A lot of programs lose performance with hyperthreading, depending on how the program was coded.

Yes, it does. All "cores" are not equal: some are virtual, of course, and not all cores have equal access to all memory. In advanced multiprocessing schedulers, the scheduler decides the processor affinity for each process. Only the OS, not the memory controller, has any idea which processes have what priority and which processes have what memory-access patterns. For example, two processes that have to communicate with each other via shared memory should be set with affinity to the same processor core, if possible, in a NUMA architecture. Similarly, since some processes need more RAM than others, the affinity should take into account which cores have local access to which globs of physical memory.
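That affinity decision can be reduced to a toy model: co-locate processes that share memory, spread the rest. A minimal sketch, purely illustrative (a real NUMA-aware scheduler also weighs priorities, access patterns, and per-node free memory, and the process names here are made up):

```python
# Toy NUMA placement: processes that communicate via shared memory
# get affinity to the same node; everything else goes to the
# least-loaded node. Illustration only - real schedulers weigh
# priorities, memory-access patterns, and per-node free RAM too.
def assign_nodes(procs, shared_pairs, n_nodes):
    node_of, load = {}, [0] * n_nodes
    def least_loaded():
        return min(range(n_nodes), key=load.__getitem__)
    for a, b in shared_pairs:          # co-locate communicating pairs
        node = least_loaded()
        for p in (a, b):
            node_of[p] = node
            load[node] += 1
    for p in procs:                    # spread the remaining processes
        if p not in node_of:
            node = least_loaded()
            node_of[p] = node
            load[node] += 1
    return node_of

placement = assign_nodes(["ui", "db", "cache", "batch"],
                         shared_pairs=[("db", "cache")], n_nodes=2)
print(placement["db"] == placement["cache"])  # True - shared-memory pair co-located
```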
 

MorphingDragon

macrumors 603
Mar 27, 2009
5,160
6
The World Inbetween
Yes, it does. All "cores" are not equal: some are virtual, of course, and not all cores have equal access to all memory. [...]

That's what I get for looking at Wikipedia. :eek:
 

German

macrumors regular
Jul 3, 2007
198
0
NUMA is a memory controller technology. Mac OSX doesn't need to be aware of anything.
:rolleyes:

"On a NUMA system however the distance of the memory to the executing process matters. Performance can sink dramatically if memory references are made too frequently to pages on remote nodes. Local memory is special to the execution context because it has minimal latency and optimal bandwidth characteristics.
The kernel has the task to assign memory to a process in the most optimal way so that the process can execute with the highest performance. For that purpose, the kernel has been equipped with various mechanisms that determine node locality and insure that memory is allocated in such a way that the NUMA distances are minimized."

http://kernel.org/pub/linux/kernel/people/christoph/pmig/numamemory.pdf

The German c't magazin figured out that OS X is losing up to 30 percent of performance compared to Linux and Windows on similar hardware, because of the lack of NUMA support. ;)

- c't (13/2009, 8.6.2009, page 156ff) Apple Xserve 3.1 VS. Fujitsu RX200 S5
- http://www.heise.de/newsticker/meld...o-und-Xserve-weiterhin-mangelhaft-755713.html -> 10.6

A lot of programs lose performance with hyperthreading, depending on how the program was coded.

It also (or mainly) depends on the scheduler in the OS (Windows 7 has SMT parking).

"When you combine Idle Core and Quantum End with SMT parking, you've got a recipe for better power consumption, as parts of the chip are powered down when they aren't required. SMT parking works by 'parking' the HT cores until the physical cores are busy enough for the HT cores to be needed. Intel also talked about Core parking as well, but that only relates to server CPUs."
http://www.bit-tech.net/hardware/cpus/2009/09/25/idf-day-3-32nm-westmere-performance/1
 

Bister Mungle

macrumors newbie
Apr 29, 2010
1
0
I just swapped out my 920 for the 980 on a Hackintosh, my backup system to my Mac Pro. I'd had it on backorder with a distributor out of Singapore for 2 months. I could have gotten an Engineering Sample (ES), but I don't trust those based on what I've seen before.

@ AskMeIfICare:

I've been dying for a build guide, components list, etc. for an Intel 980x Hackintosh build. Can you post any more information about that, or your system specifically - e.g., your components and what build guide you may have followed?

I've heard from a few sources that it's not possible yet, and from others that it should be no problem. I've built several PC systems but this would be my first Hackintosh, and if I'm going to spend $2500 on components (which I want to do ASAP!), I want to be sure that a 980x-based system is possible.

Thanks for any help!
 