ksj1 macrumors 6502 Original poster Jul 17, 2018 294 535 Jan 20, 2024 #1 I'm thinking a Mini-size machine with an Apple-supported LLM, four efficiency cores, and a huge number of GPU cores. This would be for inference with voice response, etc. Thoughts?