
Mr Screech

macrumors 6502
Original poster
Mar 2, 2018
260
264
Can someone explain why the entry-level M1 models would use die space for a Neural Engine?

As I understand it right now, the Neural Engine is used for face recognition, photography enhancement, and machine learning. The first two seem more like mobile phone features, and the last is used by such a small percentage of people that it seems odd to include it.

What is the benefit for most users?
AI (and in particular deep learning) is so new that no one yet knows quite what it will be good for. But one hope (that's especially relevant to Macs, more than iDevices) is a variety of language tasks.
Translation is one example, but there are others: summarizing documents, better grammar correction, sentiment analysis, semantic search (i.e. "search my Mac's files by meaning, not by literal words").
Basically, why CAN'T your Mac be like a high-quality administrative assistant, with all that implies about language knowledge?

Now, this is all something of a hope and a gamble; no one is quite sure how it will work out. But one thing that is sure: if hardware of this sort is present on all Macs going forward, it's a lot more likely that people running experiments (inside Apple, inside other companies, at universities) will have reason to keep working along these lines, making models better and getting them to run on laptop-class machines.