> why is it being called a global health crisis… it’s a pandemic and it’s still happening…
Maybe like "Xmas" for "Christmas".
> I just think it's funny that the background image is a rack full of Mac Pros... the last computer to be available with no Apple Silicon option.
> Of course, those Mac Pros might have Apple Silicon... testing maybe?
I can’t see it very well, but the equipment on the right appears to show fiber-optic connections, so my guess would be that it is just networking equipment.
> computer science and electronic engineering would be a good start…
That’s what I did in the ’60s.
> Meh, no info about shaders, and still no real-time ray tracing…
Start a petition and get the government to make it law.
It still looks like a set piece, though, considering what is missing from the image.
> Suppose Apple uses computers with Intel processors to design Apple silicon?
Probably had to, as a lot of the (very expensive) commercial CAE applications for IC design are x86-only.
I just think it's funny that the background image is a rack full of Mac Pros... the last computer to be available with no Apple Silicon option.
Of course, those Mac Pros might have Apple Silicon... testing maybe?
At which US universities can a person study and graduate from a program in chip design and architecture in order to be employed by Apple? Stated differently, what engineering degree would be a good prerequisite before going on to master's and doctoral studies (electrical engineering, physics, applied mathematics)? More importantly, what can a high school student do to prepare to enter what I presume to be a highly competitive field?
> Stop obsessing with credentials. Start obsessing with DOING.
> I have spoken to so many kids who think they want to get into "computers", but when I ask them questions about what they care about, I get blank stares -- they assume the college will do it for them.
> If you are the sort of person Apple wants to hire, your degree does not matter; what matters is what you do right now. Are you spending every free moment reading papers about micro-architecture? Are you reading patents? Are you reading PhD theses? Are you designing micro-benchmarks to figure out how CPUs work internally?
> If not, why not? That's what people who LOVE this stuff do...
While it is technically possible to get hired for chip development at Apple or other similar companies without a related four-year degree and/or master's, it will be a much longer road to get there. I know the industry well, and getting into a top-tier company like Apple requires a personal referral from an existing team member to skip past the initial credential screen, and even then such a hire would be treated with suspicion. So you would have to be extraordinary, and have "an in".
> I imagine his compensation is a blank check.
And every penny of it earned.
> Start a petition and get the government to make it law.
And tie them to weekly multi-million-dollar fines (that go… somewhere quite useful, I guess?) until the monopoly on "non-raytraced hardware" is brought down. Add-in option: let people have third-party RT cores added to it!
> Stop obsessing with credentials. Start obsessing with DOING.
> I have spoken to so many kids who think they want to get into "computers", but when I ask them questions about what they care about, I get blank stares -- they assume the college will do it for them. […]
Can get behind this… I actually have a B.S. in Electronics Engineering and did learn the basics of computer architecture (RISC vs. CISC and all that), but I forgot all of it because in my free time I was obsessed with anything 3D, CG, and video games…
> Meh, no info about shaders, and still no real-time ray tracing…
Here is an interesting quote: "the company’s software designers tuned the computers to favor the specifications it most desired, such as smooth videogame graphics."
> Nice interview! I have lots of respect for Srouji. He should definitely be considered for running Apple. The M1 chip is his baby and he has done a stellar job.
Yeah, got a lot of time for the dude in charge of Apple Silicon. Surely he is the most competent of all the Apple executives at the moment.
> clearly very skilled and talented but seems horribly uncharismatic, like a robot. I would actually pay money not to work for him.
Well, that's the difference between people who want to work with the best people in the world vs. people who want to work with a group of friends...
> I imagine his compensation is a blank check.
He's also the only one who was able to get his team into an environment where they could thrive, instead of sitting in cubicles in the donut.
> Meh, no info about shaders, and still no real-time ray tracing…
Apple still has a very long way to go with GPU performance and getting Metal adopted; the M1 and M2 are just there to replace Intel/AMD for the majority of people. I would not hold my breath for any advanced GPU stuff in the coming year. If that is what you are looking for, you are on the wrong platform.
> That background furniture looks far more real-world compared to the set piece they have in pre-recorded Apple events.
Which is itself still interesting, since they’re willing to show one or two machines running Windows, if you look closely…
> clearly very skilled and talented but seems horribly uncharismatic, like a robot. I would actually pay money not to work for him.
You want to work in high-level/deep engineering? These are the types of human beings you’re going to interact with regularly. It’s great that he’s socially functional enough to appear in scripted videos and do interviews.
> Nice interview! I have lots of respect for Srouji. He should definitely be considered for running Apple. The M1 chip is his baby and he has done a stellar job.
Being amazingly competent, skilled, and credentialed in one area does not necessarily make a person the same in other areas. Specialization is fine, and ladder-climbing for the sake of it is irrational and mindless.