By poaching Apple's engineers.
Isn't it illegal to take information from a past employer to a new employer?
This move will probably spell worse news for Intel than for Apple: another competitor in the market for them.
Apple sells computers, not chips, so it's a non-issue for them either way.
> I thought Apple bought Intel's modem IP?

They did. Apple will be on their own modems soon enough.
> Sinclair, one of my cats that likes to hang out around my Macs, says: "Ooh! Ooh! Hire me, I have been in close proximity to Apple devices! I can help you make world-beating processors! Really!"

Love it! More cat and computer quotes please, Captain.
> This move will probably spell worse news for Intel than for Apple: another competitor in the market for them.

As long as enterprise, government, and many others use and depend on x86 software, Intel does not need to be afraid of competitors.
> Love it! More cat and computer quotes please, Captain.

Sherlock, who joined Qualcomm as part of the recent acquisition of Nuvia that had ex-Apple architects.
Interesting. On a serious note, who will use whatever architecture Qualcomm is making? All the OSs and software are made for x86.
What this really means, IF Qualcomm takes the correct approach, is more competition for innovation that benefits users/consumers.
It is a given that Apple will develop their own chips to go end to end with the rest of their hardware.
Qualcomm should compete and not rest on their laurels by doing something similar.
I would prefer to see corporations actually take the time and make the investment in their own critical chips instead of buying up smaller entities. (Looking at you, Nvidia.)
Also, drop-kick outside shareholders, as THEY are the ones who stranglehold companies, bleeding them dry with quarterly expectations to feed never-ending dividends, when those funds could go toward better worker conditions, research, and investment in innovation.
> Unfortunately, the kind of applications that are either outright unavailable on the Mac or have less functionality on it (e.g. games, CAD, enterprise software) are also the least likely to be ported to Windows on Arm. And the reason those applications have not been ported to the Mac is often that they are huge apps full of Windows dependencies and/or spaghetti code going back decades, meaning that emulating these programs will either not work or take huge performance hits. I think Windows on Arm can get to the point where it is fine for most mainstream consumer software (if Microsoft and Qualcomm really are serious about it), but Macs (M1 or otherwise) are fine for that too. If you need Windows support for some niche application, Windows on Arm is probably not for you.

I do wonder if the Windows gaming market could flip to Arm more easily than expected, given Nvidia at the core of it all. Current GPUs are very power-hungry and generate a lot of heat, so a relatively cool, power-sipping Arm CPU could free up a lot of TDP for even more insatiable graphics cards. A number of games have been successfully ported to the Switch, which uses an old and pretty weak Arm chip. I think if the hardware starts appearing, game studios might embrace it quicker than you'd expect. Not to mention if the PS6 and Xbox nextnewnamingscheme also switch over...
> Interesting. On a serious note, who will use whatever architecture Qualcomm is making? All the OSs and software are made for x86.

Not to mention Android, iOS, macOS... y'know, just over 50% of the computing devices in use globally today...
Makes me wonder how my Raspberry Pi and Banana Pi seem to work, since there aren't any OSs that run on non-x86 hardware.
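(For anyone who actually wants to check, you can ask the OS itself which architecture it's running on; a minimal Python sketch, with the caveat that the values in the comments are typical outputs, not guaranteed on every system:)

```python
import platform

# Ask the OS which CPU architecture it is running on.
# A Raspberry Pi typically reports "aarch64" or "armv7l";
# an ordinary x86 PC reports "x86_64" or "AMD64".
arch = platform.machine()
print(arch)
```

Run the same script on a Pi and on a PC and you get two different answers, which is rather the point.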
A lot of smug arrogance in this thread...
> Nuvia is basically selling a reimplemented A12-plus replica in terms of design. QCOM is fortunate to have found some know-how from Apple. So I am sure they will have competitive chips soon.

How do you know what design principles Nuvia is based on? That's a ridiculous claim.
> How do you know what design principles Nuvia is based on? That's a ridiculous claim.
M1 is very good; that doesn't mean it's perfect. To take just one example, it doesn't use virtual registers. It's fairly understandable why that is the case when you plumb deep into the details, and it's likely something that was fixed with the A15 redesign. I imagine GW3 and Manu have a long mental list of issues like that ("things we wish we'd done differently when we did the A11 redesign") and have based the Nuvia design on fixing some large number of those pain points.
The problem QC have is that competing with an M1 is a much harder task than Apple had competing with the A57! Although the Nuvia team have a good mental overview of what's required, you still need a LOT of people (and a LOT of tools, and a LOT of time) to fill in the details. QC can provide some of this. Can they provide enough?
The second issue is that an SoC is more than just the CPUs. Again, Apple have built up a deep bench of competence in GPU, NPU, DMA, NoC, the Secure Enclave, etc. QC have competence in that space too, but they don't have Apple's depth, and I don't know how long it will take them to get there, or whether they are willing to pay what it takes rather than doing what they have always done before and deciding it's "good enough for our level of customers". That may be the right business decision, but it will only deepen the split whereby customers buy from (and engineers work for) Apple if they want the best, and QC if they want adequate.
> Isn't it illegal to take information from a past employer to a new employer?

They won't; they will bring technical engineering expertise.
> And competing against the M1 is going to mean competing against the stacks and stacks of laudatory utterances from so many different sources. The 'coverage' was getting embarrassing. 'Apple is eating Intel's lunch' was a common one (I may have said something to that effect, and regret it somewhat).

I don't think so; the people who are apt to jump on the Apple bandwagon against Intel are very likely to jump on the Qualcomm bandwagon against Intel. If anything, coming behind Apple, folks have a long list of statements that have already been used in favor of Apple that they can now reuse in favor of Qualcomm.