Let's make up our mind here...
HighPoint have an NVMe RAID PCIe card. Is it possible to get it running in a Mac Pro?
I've been burned by HighPoint in the past - horrible drivers and no support. Simple things work, but under load there are timeouts, volumes going offline, and random errors. No updates, no response to questions.
My sentiments exactly. I'm disappointed with the HighPoint card I have for my RAID setup. Can you recommend a RAID card with both internal and external SAS ports?
Regarding the HighPoint NVMe card, I was curious about it too, but couldn't find any reviews of how it works on a Mac. So I got the Amfeltec and have been happy with it.
Anything here that ends in "8i8e" (8 internal, 8 external). Get the battery/capacitor option.
I can't help with your question, sorry, but what benefits would you get by RAIDing two NVMe?

Why do people always assume that the person who has a RAID0 array doesn't already know this? We know what will happen to the data if the array fails - that's why we keep our precious data on a separate drive. In my case I'm a bit extreme: I have a backup of a backup, call it a dual backup. Honestly, I don't want to sound mean, but come on, man - what benefits? I have four 970 Pros in RAID0 hitting almost 12,000 MB/s read and close to 9,000 MB/s write, so you tell me what the benefits are: everything is just faster. SSDs and NVMe drives are not mechanical drives. Yes, there's still a chance one fails, but mechanical drives fail at a higher rate because of the spinning platters. And even if the RAID0 fails for whatever reason, it's not the end of the world - simply verify your drives and make a new RAID0. Have you ever heard of Carbon Copy Cloner? And just in case: I'm running High Sierra booting from my RAID0.
With RAID0 you get the benefit of pooling (striping across) the space of your drives, but if you lose one drive you lose the lot. You'd be better off having each NVMe in a separate PCIe card so each drive can achieve its maximum R/W. If you're limited on PCIe slots, then putting both NVMe in a single PCIe card might be the trade-off.
With RAID1 you won't realise any extra speed advantage, as the maximum R/W (~1400 MB/s) you'll get from a single NVMe already approaches the maximum system PCIe bus speed (2000 MB/s). Many NVMe drives are capable of much higher R/W (the Samsung 960 EVO is rated 3100/1900 MB/s), so we never actually realise the full speed capability of our NVMe in our cMP anyway.
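The bandwidth arithmetic in the last two posts can be sketched in a few lines of Python. This is a rough model, not a benchmark: the per-drive figure (~1400 MB/s) and the 2000 MB/s slot ceiling are the thread's ballpark cMP numbers, and `effective_throughput` is a hypothetical helper, not a real tool.

```python
# Rough sketch: the PCIe slot caps whatever the array could deliver raw.
# Figures are the thread's ballpark numbers for a classic Mac Pro, not
# measured values.
def effective_throughput(per_drive_mbps, drives, raid_level, bus_cap_mbps):
    """Estimate read throughput (MB/s) the array can actually deliver."""
    if raid_level == 0:      # RAID0 stripes reads across all drives
        raw = per_drive_mbps * drives
    elif raid_level == 1:    # RAID1 reads run at roughly single-drive speed
        raw = per_drive_mbps
    else:
        raise ValueError("only RAID0/RAID1 sketched here")
    return min(raw, bus_cap_mbps)  # the slot bandwidth is the ceiling

# Two drives in RAID1: no speed gain over one drive
print(effective_throughput(1400, 2, 1, 2000))  # 1400
# The same two drives in RAID0: raw 2800 MB/s, but the slot caps it
print(effective_throughput(1400, 2, 0, 2000))  # 2000
```

The `min()` is the whole point: past the slot's ceiling, adding faster drives or more stripes buys nothing, which is why a 3100 MB/s-rated 960 EVO still tops out well below spec in a cMP slot.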