Proper credit where it is due...

[Image: Apple Silicon Mac Pro concept]
 
For sure the new Mac Pro will not have a battery at all, since it's a desktop. It might be able to run on USB power, however. That would be pretty cool. Forget adding power-hungry expansion boards then, though.
Do you remember the graphic that Apple used to show how AS power consumption and speed compared to laptops and desktops? There was only one blue blob of Apple Silicon, not two. There will be no AS desktops that don’t have batteries and have to stay plugged in all the time. The thick power cord is going away, as is the screen cord.
 
I think you misinterpreted that chart. It’s just saying that no Apple Silicon (laptop or desktop) will require as much wattage to drive as current solutions. Even the highest-performance desktop ASi solutions will use far less power but achieve the same performance. There will still be solutions that plug into the wall; they’ll just draw less current.
 
This could be an ‘all ARM’ machine, meaning that MPX bays could be shrunk to fit Apple’s own cards.

That is to say, you wouldn’t be able to use existing PCIe components, only versions made by Apple.

I’m not entirely sure how they hope to compete with the likes of AMD’s 6900 and even the Pro Vegas, but if they’re going for an all-ARM future then they must have something up their sleeves...
But people want non-Apple storage / non-RAID 0 storage. Maybe if Apple can drop their storage prices down a lot.
 
I think you misinterpreted that chart. It’s just saying that no Apple Silicon (laptop or desktop) will require as much wattage to drive as current solutions. Even the highest-performance desktop ASi solutions will use far less power but achieve the same performance. There will still be solutions that plug into the wall; they’ll just draw less current.
The ASi chip and macOS already have battery management built in. The last update improved battery management for laptops that are left plugged in most of the time and used as desktops. (Unsuccessfully, as a second monitor causes a clamshell-mode machine to overheat and blast the fans.) A line-level power supply is big and hot, too. It’ll be smaller and cooler if it can run on battery-level wattage.
 
The Apple team's code name for this project is 'junior'. The objective is to use the previously invested R&D from the new Mac Pro to stretch ROI and monetize a market gap at a lower price point while the economy is uncertain.

Apple tries to fill as many price points for their products as possible, e.g. marketing prices for iPhones and iPads and across their computer lines, even if it takes a while to hit them all with evolving products: something for everybody! Mac Pro Jr. fits in here and can showcase ARM power in a pro box to boost the ARM cachet for lower-end products like the early ARM MacBooks, and later iMacs.

Also an attempted Hackintosh killer, converting a high proportion of the cash flow in that segment to a robust Mac product by offering some customization, time savings, and reliability. Got to have a machine for enthusiasts that not only a corporate budget can afford.

It won't replace the '19 Mac Pro, just complement the line. There's no way it will be like the trashcan, because that was such a disaster of design and a disgrace for Apple's workstation brand. They'll want to stay as far away from that form-over-function dead-end aesthetic in the Pro line as possible.

Seems like quite a broad market for Mac Pro Jr./Mac Pro mini based on the variety of comments on this thread so far!

edits: grammar and clarity,
and removed an inappropriate joke about the pandemic
 
The ASi chip and macOS already have battery management built in. The last update improved battery management for laptops that are left plugged in most of the time and used as desktops. (Unsuccessfully, as a second monitor causes a clamshell-mode machine to overheat and blast the fans.) A line-level power supply is big and hot, too. It’ll be smaller and cooler if it can run on battery-level wattage.

Initial versions of the Mac mini and also the G4 Cube used external power supplies and had "battery-level wattage".
It would be pretty cool to have just a USB-C cable for powering a desktop machine and use existing chargers as power supplies. This still doesn't mean that desktop and pro machines will have a battery, though.
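For anyone curious how far that idea stretches, here is a rough feasibility sketch in Python. The wattage ceilings are the USB-IF PD spec limits (100 W for classic PD at 20 V/5 A, 240 W for the extended-range PD 3.1 profile); the desktop power draws are made-up illustrative numbers, not Apple figures:

```python
# Rough check: which USB-C PD profiles could power a desktop?
# PD ceilings are from the USB-IF spec; the desktop draws below are
# hypothetical illustrative numbers, not measured or published figures.

PD_PROFILES_W = {
    "USB PD 3.0 (20 V x 5 A)": 100,
    "USB PD 3.1 EPR (48 V x 5 A)": 240,
}

HYPOTHETICAL_DESKTOPS_W = {
    "small ARM desktop, light load": 40,
    "small ARM desktop, full tilt": 150,
    "workstation with big discrete GPU": 500,
}

for name, draw_w in HYPOTHETICAL_DESKTOPS_W.items():
    fits = [profile for profile, cap_w in PD_PROFILES_W.items() if cap_w >= draw_w]
    verdict = f"OK via {fits[0]}" if fits else "needs a line-level PSU"
    print(f"{name} ({draw_w} W): {verdict}")
```

Under those assumptions a modest ARM desktop fits under a charger, but anything with power-hungry expansion boards blows past what one USB-C cable can carry, which matches the trade-off described above.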
 
Also an attempted Hackintosh killer, converting a high proportion of the cash flow in that segment to a robust Mac product
Piecing together the parts required to build your own one-of-a-kind system is a hacking challenge, and I think that’s why most folks are doing it. Providing a solution that you just “buy” defeats a lot of that.

However, I COULD see Apple doing something interesting in the “Raspberry Pi” space that WOULD scratch some hacking itches. A small, powerful solution that you could use to build a product around... they could even create a learning system around it à la Swift Playgrounds.

Just a few more days!
 
Do you remember the graphic that Apple used to show how AS power consumption and speed compared to laptops and desktops? There was only one blue blob of Apple Silicon, not two. There will be no AS desktops that don’t have batteries and have to stay plugged in all the time. The thick power cord is going away, as is the screen cord.
Yes, there will be. Batteries add size, cost, weight and hazardous materials to machines. There is absolutely no reason that they would put them in systems that are not designed to be portable.
 
The only battery going in non-portable Mac desktops is the little round CMOS battery.
The other good thing about adding a battery to the new form factor ASi iMac, besides it being a built-in UPS, is that it becomes portable. So laptops don’t have to be powerful.
 
From Wikipedia:

Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years. Moore's law is an observation and projection of a historical trend. Rather than a law of physics, it is an empirical relationship linked to gains from experience in production.

The observation is named after Gordon Moore, the co-founder of Fairchild Semiconductor and CEO and co-founder of Intel, who in 1965 posited a doubling every year in the number of components per integrated circuit,[a] and projected this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years, a compound annual growth rate (CAGR) of 40%. While Moore did not use empirical evidence in forecasting that the historical trend would continue, his prediction held since 1975 and has since become known as a "law."
Forgot about these other points in the body of the very same Wikipedia article, didn't we?
*"Moore posited a log-linear relationship between device complexity (higher circuit density at reduced cost) and time."
*"Moore wrote only about the density of components, "a component being a transistor, resistor, diode or capacitor" at minimum cost."

As demonstrated by the WWII article, the lede is not always accurate: "World War II (WWII or WW2), also known as the Second World War, was a global war that lasted from 1939 to 1945." That's despite the following, which prove otherwise:

* "While some historians argue that the war started on 18 September 1931 when Japan occupied Manchuria..." - Cheng, Chu-chueh (2010) The Margin Without Centre: Kazuo Ishiguro, Peter Lang, p. 116, which cites Werner Gruhl's (2007) Imperial Japan's World War Two, Transaction Publishers (the "Publisher of Record in International Social Science"), for the September 18, 1931 date
* LIFE, Sep 21, 1942, p. 6: a letter to the editors states "You think World War II began in 1933, by Hitler's seizing power, but the Chinese people shall insist that World War II began on Sept. 18, 1931 by Japan's invasion of Manchuria."
* Prelude to War, the United States government film: "remember that date: Sept 18, 1931, a date you should remember as well as Dec 7, 1941. For on that date in 1931 the war we are now fighting began."

* The United States Holocaust Memorial Museum's World War II: Timeline starts with September 18, 1931, though it notes July 7, 1937 as when WWII started in the Pacific

* "World War II began along a stretch of railroad track near the northeastern Chinese city of Mukden (now Shenyang). There, on Sept. 18, 1931,..." - Polmar, Norman; Allen, Thomas B. (1991) World War II: America at War, 1941-1945, ISBN 978-0394585307

* "He knew the story well, because it had been he who transmitted the orders for the Japanese troops to march that snowy September 18, 1931, which is actually the date when World War II started." - Lee, Clark (1943) They Call It Pacific

The point of all that is Wikipedia's ledes are sometimes POVed out the wazoo because sources weren't considered reliable by the clueless majority. Think about that: a film made by the United States government at the height of the conflict is considered unreliable for the lede by the consensus of the Wikipedia community. o_O :eek:

WTF is “investopedia” anyway?
Another Wikipedia-like database that "provides investing and finance education along with reviews, ratings, and comparisons of various financial products such as brokerage accounts. Investopedia currently reaches 17 million unique viewers in the US each month". Oh, and where did that come from? Wikipedia. :p
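Coming back to the numbers in that Wikipedia quote for a second: the 40% CAGR figure is just the arithmetic of doubling every two years. A quick check in Python:

```python
# "Doubling every two years" implies an annual growth rate of
# 2**(1/2) - 1, i.e. ~41.4% (which Wikipedia rounds to 40%).
doubling_period_years = 2
cagr = 2 ** (1 / doubling_period_years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # -> 41.4%
```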
 
Nothing you cited states that the law is other than what I said the law is. And the Wikipedia article says the law is what I said it was. All you’ve cited shows, at most, that he thought the *reason* the number of transistors would double every two years was increasing density. But the law does not require increasing density.

More sources that state that Moore’s Law refers to the number of transistors on a die, and not the size of the transistors or their density:


(I can do this all day)
 
"There are several formulations of “Moore's Law,” but roughly it says that the density of circuit components per unit area that it is most economically profitable for commercial..." -
Mody, Cyrus C. M. (2017) The Long Arm of Moore's Law, The MIT Press

I would say MIT Press (a university press, BTW) trumps any other reference you can come up with. And they are not the only high-quality publication that says this, either.

"Moore's lawbasically states that transistor density per integrated circuit area doubles every two years ( Chang , 2003 ) " (2007) Annual Review of Communications: Volume 59 - Page 16

"Moore's law for integration density in terms of equivalent number of elements per square micron of integrated photonics devices, showing a growth faster than the IC" - (2014) Monolithic Nanoscale Photonics-Electronics Integration Page 146

Also think about how leaving out the density part results in a totally absurd conclusion - you just double the area of a chip and presto, you keep to Moore's law even if you are building planet-sized computers using ordinary transistors. :oops:
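To make the distinction concrete, here is a minimal sketch: transistor count factors into density times area, so "the count doubled" by itself does not say which factor moved. The numbers are made up purely for illustration:

```python
# Transistor count = density x area, so a doubled count can come from a
# denser process, a bigger die, or a mix of both. Illustrative numbers only.

def transistor_count(density_per_mm2: float, area_mm2: float) -> float:
    return density_per_mm2 * area_mm2

base = transistor_count(density_per_mm2=1e6, area_mm2=100)

via_density = transistor_count(2e6, 100)  # process shrink, same die size
via_area = transistor_count(1e6, 200)     # same process, doubled die area

assert via_density == via_area == 2 * base
print("Same doubled count; only one path involved a denser process.")
```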
 
"There are several formulations of “Moore's Law,” but roughly it says that the density of circuit components per unit area that it is most economically profitable for commercial..." -
Mody, Cyrus C. M. (2017) The Long Arm of Moore's Law The MIT Press

I would say MIT Press (a university press BTW) trumps another other reference you can come up with. And they are not the only high quality publication that says this either.

"Moore's lawbasically states that transistor density per integrated circuit area doubles every two years ( Chang , 2003 ) " (2007) Annual Review of Communications: Volume 59 - Page 16

"Moore's law for integration density in terms of equivalent number of elements per square micron of integrated photonics devices, showing a growth faster than the IC" - (2014) Monolithic Nanoscale Photonics-Electronics Integration Page 146

Also think about how leaving out the density part results in a totally absurd conclusion - you just double the area of chip and presto you keep to Moore's law even if you are building planet sized computers using ordinary transistors. :oops:

It’s not at all an “absurd conclusion.” Do you have any idea how difficult it is to “just double the area of a chip”? Chip size has been going up consistently over the years because, as semiconductor technology improves, it allows that to happen. It’s an important component of what makes Moore’s Law work.

In case you are not aware, there are many reasons “doubling the area of the chip” is very difficult. To do it you have to cope with the fact that more area means lower yield. You have to come up with methods of lithography that cope with distortion at the edges of the reticle. You have to figure out how to architect the design so that time-of-flight (6 picoseconds per mm) allows decent performance even though things may be further apart. You have to take into account that more devices means more power consumption, which means you need a way to distribute it on the chip. You have to account for the fact that clock edges arrive at different times at different parts of the chip (clock skew).

The idea that Moore’s Law means “transistors shrink so there’s more density” is something only a person who’s never designed a CPU would say.
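A quick back-of-the-envelope on that time-of-flight point, using the 6 ps/mm figure above. The die size and clock speed are hypothetical round numbers, not any particular chip:

```python
# How much of a clock period does a cross-die signal eat?
# 6 ps/mm is the propagation figure from the post; the die size and
# clock frequency below are hypothetical round numbers.

PS_PER_MM = 6.0
die_edge_mm = 20.0  # hypothetical large square die
clock_ghz = 3.0

period_ps = 1000.0 / clock_ghz               # ~333 ps per cycle at 3 GHz
diagonal_mm = (2 * die_edge_mm ** 2) ** 0.5  # ~28.3 mm corner to corner
flight_ps = PS_PER_MM * diagonal_mm          # ~170 ps just in flight

print(f"Clock period:     {period_ps:6.1f} ps")
print(f"Corner-to-corner: {flight_ps:6.1f} ps ({flight_ps / period_ps:.0%} of a cycle)")
# Roughly half a cycle is gone before any logic switches, which is why
# bigger dies force extra pipeline stages and very careful clock trees.
```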
 
Got to love the naming convention, which implies the opposite. Which is smaller, 7 nm or 5 nm? :p For example, the old 8088 used a 3 µm process while the Comet Lake-S uses 14 nm. The average Joe will look at that and go "on what planet is 14 nm bigger than 3 µm?!" Of course there are things like that 8.5-inch chip, but that thing is not cheap.
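The unit mix-up is easy to untangle with a one-line conversion; trivial, but it makes the point:

```python
# 3 um = 3000 nm, so the "bigger looking" 3 um process is actually
# ~214x coarser than a 14 nm process in linear feature size.
process_8088_nm = 3 * 1000       # the 8088's 3 um process, in nm
process_comet_lake_nm = 14       # Comet Lake-S marketing node
print(process_8088_nm / process_comet_lake_nm)  # ~214.3
```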
 
It’s not at all an “absurd conclusion.” Do you have any idea how difficult it is to “just double the area of a chip”? Chip size has been going up consistently over the years because, as semiconductor technology improves, it allows that to happen. It’s an important component of what makes Moore’s Law work.

In case you are not aware, there are many reasons “doubling the area of the chip” is very difficult. To do it you have to cope with the fact that more area means lower yield. You have to come up with methods of lithography that cope with distortion at the edges of the reticle. You have to figure out how to architect the design so that time-of-flight (6 picoseconds per mm) allows decent performance even though things may be further apart. You have to take into account that more devices means more power consumption, which means you need a way to distribute it on the chip. You have to account for the fact that clock edges arrive at different times at different parts of the chip (clock skew).

The idea that Moore’s Law means “transistors shrink so there’s more density” is something only a person who’s never designed a CPU would say.
you can double the sockets.
 
I find it totally hysterical that MIT, with its "There are several formulations of “Moore's Law,” but roughly it says that the density of circuit components per unit area that it is most economically profitable for commercial..." (Mody, Cyrus C. M. (2017) The Long Arm of Moore's Law, The MIT Press), is supposedly saying what "a person who’s never designed a CPU would say", given how many CPUs they have designed, including that carbon nanotube one. :eek: 🤣

MIT never designed a CPU?! 🤣
 
Got to love the naming convention, which implies the opposite. Which is smaller, 7 nm or 5 nm? :p For example, the old 8088 used a 3 µm process while the Comet Lake-S uses 14 nm. The average Joe will look at that and go "on what planet is 14 nm bigger than 3 µm?!" Of course there are things like that 8.5-inch chip, but that thing is not cheap.
Given that people can actually see the physical die without a microscope, it's unlikely a lot of people think it‘s 7 nm in any dimension :)

8086 was 33mm^2 at 3.2 micron feature size
80286 was 49mm^2 at 1.5 microns
P55C was 140mm^2 at 0.28 microns
Coffee Lake was 149mm^2 at 14nm

etc.

Chips keep growing.
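Running the numbers on that list makes the split visible: area grew a few times over while a crude 1/feature² density proxy grew by orders of magnitude. Same figures as listed above; note that modern "nm" node names are marketing labels rather than literal dimensions, so treat the density column as a rough proxy only:

```python
# Die area growth vs. a crude density proxy (1 / feature_size^2),
# using the die sizes and feature sizes listed above.
chips = [
    # (name, die area in mm^2, feature size in nm)
    ("8086",        33, 3200),
    ("80286",       49, 1500),
    ("P55C",       140,  280),
    ("Coffee Lake", 149,   14),
]

_, base_area, base_feat = chips[0]
for name, area, feat in chips[1:]:
    area_x = area / base_area
    density_x = (base_feat / feat) ** 2
    print(f"{name:12s} area x{area_x:4.1f}   density proxy x{density_x:10,.0f}")
```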
 
I find it totally hysterical that MIT, with its "There are several formulations of “Moore's Law,” but roughly it says that the density of circuit components per unit area that it is most economically profitable for commercial..." (Mody, Cyrus C. M. (2017) The Long Arm of Moore's Law, The MIT Press), is supposedly saying what "a person who’s never designed a CPU would say", given how many CPUs they have designed, including that carbon nanotube one. :eek: 🤣

MIT never designed a CPU?! 🤣

How many MIT CPUs do you have in your house?

Decent chance you have, or have had, a CPU I designed in your house.
 
How many Intel CPUs do you have in your house? (I have at least one powering my iMac :) ) I ask because that is how Raja Koduri, Intel’s chief architect, effectively defined Moore's Law! He even said "We firmly believe there is a lot more transistor density to come.” Not transistor count, but density.

If density is not part of Moore's Law, then why do so many people define it that way?

"Moore's Law which predicts the doubling of transistor density about every 18 months" - A Department of Defense Perspective (2003)
"How was Gordon Moore able to make such an accurate prediction of the transistor density vs. time that has lasted for over 40 years..." - Learning Bio-Micro-Nanotechnology (2013)
"The fundamental reason for the increase in transistor density is described by what is called Moore's law ( Chang , 2003 ) . Moore's lawbasically states that transistor density per integrated circuit area doubles every two years ( Chang , 2003 )" - Annual Review of Communications: Volume 59
"[Moore's] 'law' has changed. It originally referred to the density of transistors on a piece of silicon ." - Higher National Computing Tutor Resource Pack CRC Press
"to increase exponentially as evidenced by the transistor density and the microprocessor clock frequency ( Moore's law )"- The Journal of the Korean Physical Society
"transistor density on a chip every 12–18 months — known as Moore's Law" - Journal of the Patent and Trademark Office Society 1992
"Moore's Law states that transistor density on chips will double every 18 months." - Maximum PC - Apr 2007 - Page 8

On that note, why does the Journal of Micro/Nanolithography define it as performance doubling?
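Worth noting that those citations do not even agree on the doubling period (18 months in some, two years in others), and the gap compounds fast. Quick arithmetic:

```python
# 18-month vs. 24-month doubling compounds to very different totals.
for years in (5, 10, 15):
    growth_18mo = 2 ** (years / 1.5)
    growth_24mo = 2 ** (years / 2.0)
    print(f"{years:2d} years: x{growth_18mo:8,.0f} (18 mo) vs x{growth_24mo:6,.0f} (24 mo)")
```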
 
Nice chat about Moore's law. So anyway, from what I saw today at Apple's OMT event, it seems *very* plausible that there could be an AS-based, half-sized Mac Pro before long.

Based on the existing performance per watt and incredible system-on-a-chip integration, there's huge potential for future chips and systems once Apple really gets rolling. I think AS is as transformative for PCs as the iPhone was for mobiles. What they showed today was really mind-blowing, both now and for what's to come.

Interestingly, they didn't balloon the prices for these huge performance gains. Now it makes me wonder if the Mac Pro mini/junior, with eGPU support for example, will in fact replace the 7,1 MP. At least the entire bottom half of the price/options range, to start. Until we get the M10 o_O
 
In case you are not aware, there are many reasons “doubling the area of the chip” is very difficult. To do it you have to cope with the fact that more area means lower yield. You have to come up with methods of lithography that cope with distortion at the edges of the reticle. You have to figure out how to architect the design so that time-of-flight (6 picoseconds per mm) allows decent performance even though things may be further apart. You have to take into account that more devices means more power consumption, which means you need a way to distribute it on the chip. You have to account for the fact that clock edges arrive at different times at different parts of the chip (clock skew).
(Late to the party, but still...)

And don't forget about Electronic Design Automation (EDA) software, and server capacity. The chip I'm designing right now... well, I can't say anything about it…
…but I _can_ say that the server that is set aside just for my block has 6TB of ECC RAM in it. Not local SSD space, RAM. (SIX. FRACKING. TERABYTES. OF… RAM!!!)

And I have that because my block _needs_ that to build & analyze the thing. Big chips are terrifying data management problems.
 