Apple is such a big monopoly in the music industry, they cannot even force the record companies to give Apple a license to sell DRM-free music. A monopoly is in a position to say "You do as you're told or else...". Apple is nowhere near that position.

If you follow that path then you would be saying that MS was not a monopoly.

I think you're getting hung up on the definition of a monopoly. Remember, it is perfectly legal to be a monopoly; it's how you wield that power that brings the Justice Department knocking. No matter, though; my point is more about those "companies" that come out of the woodwork when they research their patents and look for angles to go after Apple. Some may even attempt going through the government if they feel Apple is mishandling their position, but that wasn't my point to begin with.
 
There are lots of people (like me) who have no need for anything better than integrated graphics. It handles everything fine that I need it for. Others have different needs, but I don't want to pay more for their needs.
Options would be good. :)

However, Apple is planning further ahead anyway, and Apple is really best buddies with Intel. Intel has a new chip on its way that is at once a powerful engine for a graphics card and perfectly useful for high-performance computing. And Apple is right now developing the software both to use many cores easily and to use graphics hardware. I could see a laptop chip containing a single Core 2 core plus two or four Larrabee cores, which would give you a combination that is reasonably low power, faster than current laptops for normal tasks, has good graphics power, and beats a current Mac Pro with specialised software.
Perfectly stated. I would REALLY LIKE that!
 
Yep, you are totally correct. I agree with you and the others here that this rumor doesn't make any sense. At this point, full H.264/VC-1 decoding is done by all relatively new discrete GPUs, and now finally on the Intel "Montevina" X4500 integrated graphics chipset.

Where has it been said that the Montevina paired with the basic X4500 can do full decoding of H.264 ?
"The difference between the GMA X4500 and the GMA X4500HD is that the GMA X4500HD is capable of "full 1080p high-definition video playback, including Blu-ray disc movies", the GMA X4500 however does not have that capability."
I somehow doubt Apple will be using the X4500HD.
 
Not this again

Really? We're going all "PITCH A TENT" over this Blu-ray nonsense again? Currently a mid-range player is in the $500 range. Sure, there are a few cheaper models, and they may work 10% of the time. Unless the cost becomes worth it and the market grows, Blu-ray will not be in Apple's lineup. Besides, why would you want Blu-ray on anything less than a 20-24" screen? Additionally, the cost of the discs is still fairly high, unless you have Netflix... ya know what? Never mind. Demand the Blu-ray, the solid gold case, and then bitch about the weight and cost. Honestly... yes, I'm cranky, due to a cold which I swear Death himself has created.
 
Sorry, but what's a dedicated encoder? Is that like a software update that I could install on my current MacBook, with performance enhancements to follow?
 
Apple's got over 20 BILLION in cash, not what one would consider "balanced" books. They need to be putting discrete graphics in laptops. Period. And they need to stop buying previous-generation graphics. - This was a quote. Not sure why it did not show up that way.

I think Apple is holding on to their cash as a reminder to save for a rainy day. If memory serves, the company almost closed its doors back in the late '90s for many reasons, including not enough cash on hand. With that said, they really should invest in a video company and bring their game up, so to speak.
 
Where has it been said that the Montevina paired with the basic X4500 can do full decoding of H.264 ?
"The difference between the GMA X4500 and the GMA X4500HD is that the GMA X4500HD is capable of "full 1080p high-definition video playback, including Blu-ray disc movies", the GMA X4500 however does not have that capability."
I somehow doubt Apple will be using the X4500HD.

The X4500HD is apparently only for desktop systems at the moment.
 
What if the encoder chip is for low-power processors?

Everybody keeps mentioning how the newer processors and GPUs have enough computational power to handle h.264 encoding themselves.

What about the newer, low-power Atom processors? I'm guessing that systems based on these chips will not have high-power GPUs and an h.264 encoding chip on the motherboard may make sense.
 
Depending on what this chip really does, it may allow Apple to support Blu-Ray movie playback without ruining the rest of Mac OS X.

One of the big problems with Windows Vista is that it supports software playback of HD content. Thanks to the requirements imposed by the movie studios, Microsoft has had to specify and develop (whether it works or not) massive amounts of DRM infrastructure, including codecs that keep the video encrypted during the decoding process, the ability to selectively detect and disable video cards that don't observe the rules, tilt bits to detect bus snoopers, etc.

I suspect Apple doesn't want to play this game. If I were them, I'd refuse to compromise a good multimedia architecture simply because some movie studios demand it.

One possible way around all this might be a dedicated HD-decode chip as part of the video card. Mac OS can then feed the raw, encrypted content from the disc straight to the chip, where the video goes straight to the video output without ever passing through main memory. Sort of like the overlay model used by video-playback cards, back before CPUs were powerful enough to play real-time video.

This way, the studios get their insane encryption requirements met and Mac OS doesn't get crippled by the attempt to enforce it.

And, of course, when you're not playing a BD movie, you've got a nice powerful auxiliary GPU that can be used for whatever else the system needs.
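
For what it's worth, here's a purely conceptual sketch of the data path being described, just to make the idea concrete. These class and function names are made up for illustration and don't correspond to any real Apple or driver API.

```python
# Purely conceptual sketch (hypothetical names throughout): still-encrypted
# blocks go from the disc reader straight to a decode chip on the video card,
# which decrypts, decodes, and scans frames out to the display itself, so
# decrypted video never passes through main memory or the OS.

class HypotheticalDecodeChip:
    """Stand-in for a dedicated HD-decode chip on the video card."""

    def feed(self, encrypted_block: bytes) -> None:
        # The real chip would decrypt and decode internally, then drive the
        # video output directly, much like an old overlay playback card.
        pass


def play_disc(read_block, chip: HypotheticalDecodeChip) -> None:
    """Shovel still-encrypted blocks from the drive to the chip.

    The OS only moves opaque encrypted data and never touches the pixels,
    so the rest of the graphics stack needs no extra DRM plumbing.
    """
    while True:
        block = read_block()
        if block is None:  # end of stream
            break
        chip.feed(block)
```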

This is the most plausible explanation I have heard so far.
 
Not going to happen. No way, no how. It makes no sense either economically or as a potential feature. Core 2 processors combined with modern, low-end GPUs can already handle HD content decoding, and there just isn't enough need to do HD encoding in real time, or anything even approaching real time, for the average user.

Besides that, with next year's Core i7 processors, embedded GPUs, OpenCL, and Grand Central, there won't be any need for this rumored, dedicated, hardware-based H.264 acceleration. Apple isn't going to introduce a completely new and costly hardware architecture that will be obsolete and/or unnecessary in less than a year.
 
I'm certain that if it is true (and I hope it is), then it is merely about including drivers for the video acceleration that is present in any recent ATI/AMD or NVIDIA graphics card or integrated graphics chipset.

Yes. I suspect very soon all Macs will use specialized hardware to accelerate video decompression. The best way to do this would be to use the GPU. I seriously doubt Apple will build in video decoders other than GPUs.
 
Apple's got over 20 BILLION in cash, not what one would consider "balanced" books. They need to be putting discrete graphics in laptops. Period. And they need to stop buying previous-generation graphics. - This was a quote. Not sure why it did not show up that way.

I think Apple is holding on to their cash as a reminder to save for a rainy day. If memory serves, the company almost closed its doors back in the late '90s for many reasons, including not enough cash on hand. With that said, they really should invest in a video company and bring their game up, so to speak.

No, Apple is holding onto their cash because they've accumulated it so quickly over the past, say, five years that they haven't had time to strategize how best to spend it. I don't think anyone could've predicted the explosion of the "Apple" brand that we have seen since the introduction of the iPod, and subsequently the iPhone.
 
QuickTime dedicated encoder/decoder chips? I don't think so. The GPU seems to be "the return of the co-processor" nowadays, but are GPUs capable of supporting encoding?

Personally I think that Apple needs to improve the software version of QT a lot first. Currently QT Pro still doesn't have a decent H.264 encoder that doesn't change the colorspace when not asked to. Not to mention correct handling of ITU-R BT.601 video levels. And what about support for all the H.264 profile features, like 4:2:2 chroma and 10-bit support?
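
To make the video-levels complaint concrete, here is a minimal sketch (my own illustration, not anything from QuickTime) of the 8-bit luma mapping between the ITU-R BT.601 studio range (16-235) and full range (0-255) that an encoder has to either preserve or convert deliberately, not by accident:

```python
# Minimal sketch of 8-bit luma level conversion between the BT.601 "studio"
# range (16-235) and full range (0-255). Results are clamped so out-of-range
# inputs don't wrap around.

def studio_to_full(y: int) -> int:
    """Expand studio-range luma (16-235) to full range (0-255)."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

def full_to_studio(y: int) -> int:
    """Compress full-range luma (0-255) to studio range (16-235)."""
    return max(16, min(235, round(y * 219 / 255 + 16)))

# Studio black (16) and white (235) should map to 0 and 255, and back again.
print(studio_to_full(16), studio_to_full(235))   # 0 255
print(full_to_studio(0), full_to_studio(255))    # 16 235
```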
 
I'll believe it when I see it. I know some GPUs offer h.264 decoding, so Apple would only have to make an encoder chip and maybe a decoder for non-h.264 QuickTime formats (if there are any). It would be interesting to see, if this happens, how they'd use Grand Central with it.

Also, this does kinda fit with the rumors of Apple leaving Intel as their chipset maker, as well as Apple buying P.A. Semi. Anyone else think this could also go towards iPods/iPhones? I'm sure they have hardware decoders, but maybe Apple's going to bring the designing of those chips in-house? We'll see if/when it happens.
 
Honestly, if Apple would use the power that the Mac Pros already have... we would not need any other chips.

But in a sneaky way this is a great way for Apple to make a unique motherboard that the hackers cannot hack... because OS X would need the specialty graphics hardware to run, and Apple could finally say "up yours" to the Psystar-like clone hawkers.
 
Honestly, if Apple would use the power that the Mac Pros already have... we would not need any other chips.

But in a sneaky way this is a great way for Apple to make a unique motherboard that the hackers cannot hack... because OS X would need the specialty graphics hardware to run, and Apple could finally say "up yours" to the Psystar-like clone hawkers.

I've been thinking about how Apple might do this, and it brings up an interesting scenario:

Let's say Apple comes up with a chipset that isn't hackable (or at least is more difficult to hack). OK, maybe it will allow the install of Snow Leopard, which I'm assuming will run on my MB as well. At what point do they cut us off and say that the next OS X won't install on "old" machines? Will it be Snow Leopard? Talk about wanting to wait for an update before buying. This would be a pretty big line in the sand, not that being stuck with Leopard or even Tiger would be the end of the world.
 
Personally I think that Apple needs to improve the software version of QT a lot first. Currently QT Pro still doesn't have a decent H.264 encoder that doesn't change the colorspace when not asked to. Not to mention correct handling of ITU-R BT.601 video levels. And what about support for all the H.264 profile features, like 4:2:2 chroma and 10-bit support?
Maybe QuickTime X will solve that.

Apple said:
Using media technology pioneered in OS X iPhone, Snow Leopard introduces QuickTime X, a streamlined, next-generation platform that advances modern media and Internet standards. QuickTime X features optimized support for modern codecs and more efficient media playback, making it ideal for any application that needs to play media content.
 
Real-time uncompressed playback of HD needs fast HD arrays and data throughput, rather than a GPU, which would do nothing to move the massive amount of raw data that HD video is.
Case in point: Many years ago, my employer was demonstrating some new network switching equipment by feeding raw uncompressed 1080i video through it (from camera through network to display, without any compressors). The stream consumed 1.3Gbps.

That is far higher than any consumer device can deliver. For comparison, it is 1.6x a FireWire 800 port's theoretical maximum, 2.2x the throughput of the fastest single hard drive, and 43% of the total bandwidth of a SATA 2.0 interface.

You can double that for a 1080p60 video stream.

A Blu-Ray disc's data rate is 36Mbps, for a 1x drive. So a 1080p movie is getting, at minimum 72:1 compression (2600Mbps/36Mbps = 72.2).
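
To put rough numbers behind the above, here is a back-of-the-envelope calculation. I'm assuming 4:2:2 chroma at 10 bits per sample (20 bits per pixel, active picture only), which comes out slightly under the 1.3 Gbps figure above, presumably because that stream also carried some transport overhead:

```python
# Back-of-the-envelope math for uncompressed HD bandwidth, assuming 4:2:2
# chroma at 10 bits per sample (20 bits per pixel on average), active
# picture only.

def uncompressed_mbps(width: int, height: int, fps: float, bpp: int = 20) -> float:
    """Raw video bitrate in megabits per second."""
    return width * height * fps * bpp / 1e6

hd_1080i30 = uncompressed_mbps(1920, 1080, 30)   # ~1244 Mbps
hd_1080p60 = uncompressed_mbps(1920, 1080, 60)   # ~2488 Mbps, double the interlaced stream

print(f"1080i30 raw: {hd_1080i30:.0f} Mbps")
print(f"1080p60 raw: {hd_1080p60:.0f} Mbps")

# Compare against common interfaces and Blu-ray's 1x data rate of 36 Mbps:
print(f"vs FireWire 800 (800 Mbps): {hd_1080i30 / 800:.1f}x")
print(f"vs SATA 2.0 (3000 Mbps):    {hd_1080i30 / 3000:.0%}")
print(f"Blu-ray compression, 1080p: {hd_1080p60 / 36:.0f}:1")
```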
... Comcast gives you (45KB up) you can't even stream 640x480 in full quality - ever see what you get on the other end?
That explains why, when I use iChat to talk with my uncle (who is on Comcast), he can see me fine, but his video is mostly frozen.

Fortunately, we're not all using Comcast. My FiOS connection has a 5Mbps up-channel. (Which is why my uncle sees me just fine.)
 
Where has it been said that the Montevina paired with the basic X4500 can do full decoding of H.264 ?
"The difference between the GMA X4500 and the GMA X4500HD is that the GMA X4500HD is capable of "full 1080p high-definition video playback, including Blu-ray disc movies", the GMA X4500 however does not have that capability."
I somehow doubt Apple will be using the X4500HD.
The "where" is very simple: just take a look at Intel's chipset product page.

http://www.intel.com/products/notebook/chipsets/gm45/gm45-overview.htm

The only mobile IGP 4 series chipset they have listed is the GM45 which includes the GMA X4500MHD. Which, if it isn't obvious from the name, is the mobile version of the desktop GMA X4500HD.

http://www.hkepc.com/?id=1525&page=3&fs=idn#view

And even the "regular" desktop GMA X4500 non-HD can do h.264 and Blu-ray acceleration just fine. The GMA X4500 uses 13% CPU utilization in h.264 compared to 10% in the GMA X4500HD and 86% in the GMA X3500. In Blu-ray, the GMA X4500 uses 27% CPU compared to 23% in the GMA X4500HD and 95% on the GMA X3500. And this is on early drivers with a 1.6GHz Celeron.
 
[snip]



Apple isn't going to use any more Intel onboard graphics in upcoming Macbooks/Mac Mini/Apple TV. That's one of the reasons they purchased their own chip company.

HDMI ports for every Mac.;)

Which graphics chip company did Apple purchase?
 
What is wrong with the sentence? It seems correct: "which" immediately followed the object being further described.

It is correct if the context is read correctly, and the context is H.264, not the object, "Blu-ray".

Let's take your example and extrapolate: "Incidentally, H.264 is one of the codecs used in Blu-Ray high definition video discs installed in computers and used in homes which Apple has yet to adopt." - So Apple has yet to adopt homes? The "which" follows "homes".

No, the context is presented first, and in this case it is H.264.
 
Not likely; GPUs have been encoding video for 3 years.

News flash! Apple has had such chips built in ever since they began shipping Macs with real video cards. ATI has been using their GPUs to encode video since 2005 (Avivo).

This article makes no sense. Why would Apple invest in OpenCL (to expose existing GPU resources), and then turn their back on it to develop yet another alternative?
 
Dell "mini" with slot load BD

http://www.dell.com/content/product...tudio-hybrid?c=us&cs=19&l=en&s=dhs&ref=homepg

Core 2 Duo (up to 2.6GHz)
up to 4 GiB RAM
up to 320 GB HDD
available BD-ROM, DVD RW
optical audio out
HDMI/DVI
GbE
1394

starts at $499, $749 with slot loading BD

[Studio Hybrid design photos]
 