> For sure, the new Mac Pro will not have a battery at all, since it's a desktop. It might be able to run on USB power, however. That would be pretty cool. Forget adding power-hungry expansion boards then, though.

Do you remember the graphic that Apple used to show how Apple Silicon power consumption and speed compared to laptops and desktops? There was only one blue blob of Apple Silicon, not two. There will be no AS desktops that don't have batteries and have to stay plugged in all the time. The thick power cord is going away, as is the screen cord.
> Do you remember the graphic that Apple used to show how Apple Silicon power consumption and speed compared to laptops and desktops? […]

I think you misinterpreted that chart. It's just saying that no Apple Silicon machine (laptop or desktop) will require as much wattage to drive as current solutions. Even the highest-performance desktop ASi solutions will use far less power but achieve the same performance. There will still be solutions that plug into the wall; they'll just draw less current.
> PCIe still takes a lot of space, and TB3 is not a viable alternative yet.

TB3 needs PCIe, and four buses at x4 each is 16 lanes.
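For anyone checking the arithmetic, here is a minimal sketch of that lane budget, assuming each TB3 controller takes a PCIe 3.0 x4 uplink (8 GT/s per lane with 128b/130b encoding), which matches shipping TB3 controllers:

```python
# Back-of-the-envelope PCIe lane budget for Thunderbolt 3 controllers.
# Assumption: each TB3 controller sits on a PCIe 3.0 x4 uplink.

GT_PER_LANE = 8.0                 # PCIe 3.0 raw rate, gigatransfers/s per lane
ENCODING_EFFICIENCY = 128 / 130   # 128b/130b line coding overhead
LANES_PER_CONTROLLER = 4
CONTROLLERS = 4

total_lanes = CONTROLLERS * LANES_PER_CONTROLLER
gbps_per_controller = GT_PER_LANE * LANES_PER_CONTROLLER * ENCODING_EFFICIENCY

print(f"Total PCIe lanes consumed: {total_lanes}")                               # 16
print(f"Usable PCIe bandwidth per controller: {gbps_per_controller:.1f} Gb/s")   # ~31.5
```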
> This could be an 'all ARM' machine, meaning that MPX bays could be shrunk to fit Apple's own cards. That is to say, you wouldn't be able to use existing PCIe components, but versions made by Apple. I'm not entirely sure how they hope to compete with the likes of AMD's 6900 and even the Pro Vegas, but if they're going for an all-ARM future then they must have something up their sleeves...

But people want non-Apple storage / non-RAID 0 storage. Maybe, if Apple can drop their storage prices down a lot.
> I think you misinterpreted that chart. It's just saying that no Apple Silicon machine (laptop or desktop) will require as much wattage to drive as current solutions. […]

The ASi chip and macOS already have the battery management built in. The last update improved battery management for laptops that are left plugged in most of the time and used as desktops. (Unsuccessfully, as a second monitor causes a clamshell setup to overheat and blast the fans.) A line-level power supply is big and hot, too. It'll be smaller and cooler if it can run on battery-level wattage.
> Also an attempted Hackintosh killer, by converting a high proportion of the cash flow in that segment to a robust Mac product.

Piecing together the parts required to build your own one-of-a-kind system is a hacking challenge, but I think that's why most folks are doing it. Providing a solution that you just "buy" defeats a lot of that.
> Do you remember the graphic that Apple used to show how Apple Silicon power consumption and speed compared to laptops and desktops? […]

Yes, there will be. Batteries add size, cost, weight, and hazardous materials to machines. There is absolutely no reason that they would put them in systems that are not designed to be portable.
> The only battery going in non-portable Mac desktops is the little round CMOS battery.

The other good thing about adding a battery to the new-form-factor ASi iMac, besides it being a built-in UPS, is that it becomes portable. So laptops don't have to be powerful.
> From Wikipedia:
>
> Moore's law is the observation that the number of transistors in a dense integrated circuit (IC) doubles about every two years. Moore's law is an observation and projection of a historical trend. Rather than a law of physics, it is an empirical relationship linked to gains from experience in production.
>
> The observation is named after Gordon Moore, the co-founder of Fairchild Semiconductor and CEO and co-founder of Intel, who in 1965 posited a doubling every year in the number of components per integrated circuit,[a] and projected this rate of growth would continue for at least another decade. In 1975, looking forward to the next decade, he revised the forecast to doubling every two years, a compound annual growth rate (CAGR) of 41%. While Moore did not use empirical evidence in forecasting that the historical trend would continue, his prediction has held since 1975 and has since become known as a "law."

Forgot about these other points in the body of the very same Wikipedia article, didn't we?
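As a quick sanity check on the figures in the Wikipedia passage quoted above, doubling every two years corresponds to an annual growth factor of √2, i.e. roughly a 41% CAGR:

```python
# Sanity check: a doubling every two years implies an annual growth
# factor of 2**(1/2).

doubling_period_years = 2
annual_factor = 2 ** (1 / doubling_period_years)
cagr = (annual_factor - 1) * 100

print(f"Annual growth factor: {annual_factor:.3f}")  # 1.414
print(f"CAGR: {cagr:.1f}%")                          # ~41.4%

# Cumulative effect since Moore's 1975 revision (2020 chosen as an
# arbitrary "now" for illustration):
years = 2020 - 1975
print(f"Implied growth over {years} years: {annual_factor ** years:.2e}x")  # ~5.9e6x
```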
> WTF is "investopedia" anyway?

Another Wikipedia-like database that "provides investing and finance education along with reviews, ratings, and comparisons of various financial products such as brokerage accounts. Investopedia currently reaches 17 million unique viewers in the US each month." Oh, and where did that come from? Wikipedia.
> Forgot about these other points in the body of the very same Wikipedia article, didn't we?
>
> * "Moore posited a log-linear relationship between device complexity (higher circuit density at reduced cost) and time."
> * "Moore wrote only about the density of components, 'a component being a transistor, resistor, diode or capacitor' at minimum cost."

Nothing you cited states that the law is other than what I said the law is. And the Wikipedia article says the law is what I said it was. All you've cited shows, at most, is that he thought the *reason* the number of transistors would double every two years was because of increasing density. But the law does not require increasing density.
As demonstrated by the WWII article, the lede is not always accurate: "World War II (WWII or WW2), also known as the Second World War, was a global war that lasted from 1939 to 1945," despite the following sources that prove otherwise:
* "While some historians argue that the war started on 18 September 1931 when Japan occupied Manchuria..." Cheng, Chu-chueh (2010) The Margin Without Centre: Kazuo Ishiguro Peter Lang Page 116 - Wernar Ghuhl's (2007) Imperial Japan's World War Two Transaction Publishers the "Publisher of Record in International Social Science" for his September 18, 1931 date
* LIFE, Sep 21, 1942, page 6: a letter to the editors states, "You think World War II began in 1933, by Hitler's seizing power, but the Chinese people shall insist that World War II began on Sept. 18, 1931 by Japan's invasion of Manchuria."
* In Prelude to War, the United States government urges viewers to "remember that date: Sept 18, 1931, a date you should remember as well as Dec 7, 1941. For on that date in 1931 the war we are now fighting began."
* The United States Holocaust Memorial Museum's World War II timeline starts with September 18, 1931, though it notes July 7, 1937 as when WWII started in the Pacific
*"World War II began along a stretch of railroad track near the northeastern Chinese city of Mukden (now Shenyang). There, on Sept. 18, 1931,..." ( Polmar, Norman; Thomas B. Allen (1991) World War II: America at war, 1941-1945 ISBN-13: 978-0394585307
*"He knew the story well, because it had been he who transmitted the orders for the Japanese troops to march that snowy September 18, 1931, which is actually the date when World War II started." Lee, Clark (1943) They Call It Pacific
The point of all that is that Wikipedia's ledes are sometimes POVed out the wazoo because sources weren't considered reliable by the clueless majority. Think about that: a film made by the United States government at the height of the conflict is considered unreliable for the lede by the consensus of the Wikipedia community.
"There are several formulations of “Moore's Law,” but roughly it says that the density of circuit components per unit area that it is most economically profitable for commercial..." -Nothing you cited states that the law is other than what I said the law is. And the Wikipedia article says the law is what I said it was. All you’ve cited shows, at most, is that he thought the *reason* the number of transistors would double every two years was because of increasing density. But the law does not require increasing density.
"There are several formulations of “Moore's Law,” but roughly it says that the density of circuit components per unit area that it is most economically profitable for commercial..." -
Mody, Cyrus C. M. (2017) The Long Arm of Moore's Law The MIT Press
I would say MIT Press (a university press BTW) trumps another other reference you can come up with. And they are not the only high quality publication that says this either.
"Moore's lawbasically states that transistor density per integrated circuit area doubles every two years ( Chang , 2003 ) " (2007) Annual Review of Communications: Volume 59 - Page 16
"Moore's law for integration density in terms of equivalent number of elements per square micron of integrated photonics devices, showing a growth faster than the IC" - (2014) Monolithic Nanoscale Photonics-Electronics Integration Page 146
Also think about how leaving out the density part results in a totally absurd conclusion: you just double the area of the chip and presto, you keep to Moore's law, even if you are building planet-sized computers using ordinary transistors.
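To make the disagreement concrete, here is a toy sketch (with made-up density and area numbers) of why a count-only formulation can be satisfied by area alone, which is the loophole being objected to:

```python
# Transistor count is density times area, so a count-only reading of
# Moore's law can be satisfied two different ways. All numbers below
# are illustrative, not from any real process.

def transistor_count(density_per_mm2: float, area_mm2: float) -> float:
    return density_per_mm2 * area_mm2

baseline = transistor_count(density_per_mm2=1e8, area_mm2=100)

# Doubling density (the reading argued from the MIT Press source):
via_density = transistor_count(density_per_mm2=2e8, area_mm2=100)

# Doubling area with ordinary transistors (the "planet-sized computer"
# loophole a count-only formulation leaves open):
via_area = transistor_count(density_per_mm2=1e8, area_mm2=200)

assert via_density == via_area == 2 * baseline
```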
> It's not at all an "absurd conclusion." Do you have any idea how difficult it is to "just double the area of the chip"? Chip size has been going up consistently over the years because, as semiconductor technology improves, it allows that to happen. It's an important component of what makes Moore's Law work.

Got to love the naming convention, which implies the opposite. Which is smaller, 7 nm or 5 nm?
> It's not at all an "absurd conclusion." Do you have any idea how difficult it is to "just double the area of the chip"? […]

You can double the sockets.
In case you are not aware, there are many reasons “doubling the area of the chip” is very difficult. To do it you have to cope with the fact that more area means lower yield. You have to come up with methods of lithography that cope with distortion at the edges of the reticle. You have to figure out how to architect the design so that time-of-flight (6 picoseconds per mm) allows decent performance even though things may be further apart. You have to take into account that more devices means more power consumption, which means you need a way to distribute it on the chip. You have to account for the fact that clock edges arrive at different times at different parts of the chip (clock skew).
The idea that Moore’s Law means ”transistors shrink so there’s more density” is something only a person who’s never designed a CPU would say.
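A rough sketch of the time-of-flight arithmetic from the post above, using its 6 ps/mm figure with an assumed 3 GHz clock and illustrative die sizes:

```python
# At ~6 ps/mm of on-die signal propagation (the figure cited above),
# cross-chip wires eat a large fraction of a clock period. The clock
# rate and die dimensions are illustrative assumptions.

PS_PER_MM = 6.0

def flight_time_ps(distance_mm: float) -> float:
    return distance_mm * PS_PER_MM

clock_ghz = 3.0
period_ps = 1000.0 / clock_ghz   # ~333 ps per cycle at 3 GHz

for die_edge_mm in (10, 20, 30):
    t = flight_time_ps(die_edge_mm)
    print(f"{die_edge_mm} mm of wire: {t:.0f} ps "
          f"= {t / period_ps:.0%} of a {clock_ghz:g} GHz cycle")
```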
I find the idea that MIT, with its "There are several formulations of 'Moore's Law,' but roughly it says that the density of circuit components per unit area that it is most economically profitable for commercial..." (Mody, Cyrus C. M. (2017) The Long Arm of Moore's Law, The MIT Press), wrote what "a person who's never designed a CPU would say" totally hysterical, given how many CPUs they have designed, including that carbon nanotube one. 🤣
> Got to love the naming convention, which implies the opposite. Which is smaller, 7 nm or 5 nm? For example, the old 8088 used a 3 µm process, while Comet Lake-S uses 14 nm. The average Joe will look at that and go, "On what planet is 14 nm bigger than 3 µm?!" Of course, there are things like that 8.5-inch chip, but that thing is not cheap.

Given that people can actually see the physical die without a microscope, it's unlikely that a lot of people think it's 7 nm in any dimension.
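A small sketch of the unit conversion behind the node comparison above; the point is that process names describe feature scale, not die size:

```python
# Converting the two process nodes mentioned above to common units.

node_8088_nm = 3 * 1000      # 3 µm process -> 3000 nm
node_comet_lake_nm = 14      # Comet Lake-S process

shrink = node_8088_nm / node_comet_lake_nm
print(f"3 µm = {node_8088_nm} nm, so the feature scale shrank ~{shrink:.0f}x")  # ~214x

# Die area is a separate axis entirely: a chip built on a smaller node
# can still be physically larger, which is the "14 nm bigger than 3 µm"
# confusion described above.
```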
MIT never designed a CPU?! 🤣
> How many MIT CPUs do you have in your house?

How many Intel CPUs do you have in your house? (I have at least one powering my iMac.)
> In case you are not aware, there are many reasons "doubling the area of the chip" is very difficult. […]

(Late to the party, but still...)