
RAID0 PCIe M2 SSD & DCS



Anyone using such an array with success? Why use RAID0 with DCS, and most importantly, why NOT use RAID0 with DCS?

Asus ROG Maximus X Apex//Core I7 8700K @ 5.3Ghz //32GB DDR4 RAM//Asus 3090 RTX//4K monitor w/ TrackIR 5

 

 

 


RAID and gaming applications don't really suit each other.

 

Games are single applications that want large amounts of linear (sequential) data.

 

RAID likes lots of applications requesting lots of random data.

 

A single SSD will probably be faster for DCS, but a RAID array would be better suited to ten virtual servers all running a copy of DCS on one box.

 

I used to RAID0 spinning disks because they were faster for linear tasks in an array.

 

With SSDs you don't get as much of a boost, if any, and with two disks a RAID0 array carries twice the risk of failure.
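To put a rough number on that risk, here is a minimal sketch (the 2% annual failure rate per drive is an arbitrary placeholder, not a measured figure) of how the chance of losing a striped array grows with drive count:

```python
# Rough illustration: RAID0 has no redundancy, so the array is lost
# if ANY member drive fails. The 2% annual failure rate is an
# assumed placeholder, and drives are treated as independent.
def raid0_loss_probability(num_drives, annual_failure_rate=0.02):
    return 1.0 - (1.0 - annual_failure_rate) ** num_drives

for n in (1, 2, 3):
    print(f"{n} drive(s): {raid0_loss_probability(n):.2%} chance of losing the array per year")
```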

 

A single PCIe SSD is roughly 3x faster than a SATA SSD anyway.

 

It would be interesting to see whether RAID0 speeds up VR, since that isn't just the game demanding CPU time and data, but I doubt it.
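If you want to measure rather than guess, a crude timing sketch like the one below can compare a single SSD against a RAID0 volume. The path is a placeholder you would point at a large file in your own DCS install; it is not a proper benchmark tool, and OS file caching will skew repeat runs unless the file is larger than your RAM.

```python
# Crude sequential-read timing sketch (not a proper benchmark tool).
# TARGET is a placeholder path; point it at a large file on the
# drive or array you want to test.
import time

TARGET = r"D:\DCS World\some_large_file.bin"  # placeholder path
CHUNK = 8 * 1024 * 1024  # read in 8 MiB chunks

total = 0
start = time.perf_counter()
with open(TARGET, "rb", buffering=0) as f:
    while True:
        data = f.read(CHUNK)
        if not data:
            break
        total += len(data)
elapsed = time.perf_counter() - start
print(f"Read {total / 1e6:.0f} MB in {elapsed:.1f} s ({total / 1e6 / elapsed:.0f} MB/s)")
```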

My Rig: AM5 7950X, 32GB DDR5 6000, M2 SSD, EVGA 1080 Superclocked, Warthog Throttle and Stick, MFG Crosswinds, Oculus Rift.


You would actually have to use two PCIe slots at x4 speed and thereby limit your GPU to x8.

 

Your Z97 chipset/CPU has 16 PCIe lanes for the slots, plus (I assume) an additional 4 lanes linking the PCH to the CPU (for SATA, USB, sound, etc.).

 

What you want will cost you a downgrade somewhere else, because you do not have enough PCIe lanes to drive all those devices at full speed (x16 GPU + 2 x PCIe SSDs at x4 = 24 lanes).
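To make the lane arithmetic explicit, here is a trivial budget check using the figures above (16 CPU lanes on Z97; x16 for the GPU and x4 per NVMe drive are the usual assumptions):

```python
# Trivial PCIe lane budget check for CPU-attached devices.
# 16 lanes is the Z97 CPU figure discussed above; device widths are
# the usual assumptions (x16 GPU, x4 per NVMe SSD).
cpu_lanes = 16
devices = {"GPU": 16, "NVMe SSD #1": 4, "NVMe SSD #2": 4}

needed = sum(devices.values())
print(f"Lanes needed at full speed: {needed}, lanes available: {cpu_lanes}")
if needed > cpu_lanes:
    print("Short by", needed - cpu_lanes,
          "lanes -> something must drop down (e.g. the GPU falling back to x8).")
```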

 

You should check your motherboard manual for valid and supported PCIe slot configurations before you go out and buy.

 

What you want to do is something best done on an Intel X99/X299 or AMD X399 Threadripper chipset with a CPU that has at least 28 lanes.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


I just saw that you run SLI and are already at x8 + x8 on the GPUs. I am pretty sure that will NOT work at all, as you don't have any lanes to spare; it would take a new CPU, motherboard and RAM to make that happen.

 

As I said, double-check the motherboard manual!

 

I'm sitting in a train on a tethered connection, otherwise I'd check for you.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


RAID likes lots of applications requesting lots of random data.

 

I am sorry, but this is just not true. RAID can perfectly well be configured for linear reads, which is done all the time in video post-production.

My controls & seat

 

Main controls: BRD-N v4 Flightstick (Kreml C5 controller), TM Warthog Throttle (Kreml F3 controller), BRD-F2 Restyling Bf-109 Pedals w. damper, TrackIR5, Gametrix KW-908 (integrated into RAV4 seat)

Stick grips:

Thrustmaster Warthog

Thrustmaster Cougar (x2)

Thrustmaster F-16 FLCS

BRD KG13

 

Standby controls:

BRD-M2 Mi-8 Pedals (Ruddermaster controller)

BRD-N v3 Flightstick w. exch. grip upgrade (Kreml C5 controller)

Thrustmaster Cougar Throttle

Pilot seat

 

 


I am sorry, but this is just not true. RAID can perfectly well be configured for linear reads, which is done all the time in video post-production.

 

+1

 

RAID is good for many things, but certainly not for small random reads; that is where any RAID level other than RAID0 loses ground, as it needs to sync all drives before anything is read.
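For illustration, here is a minimal RAID0 stripe-mapping sketch (the 128 KiB stripe size and two-drive layout are assumptions; real controllers vary). A small random read falls inside a single stripe and therefore touches only one drive, while a long sequential read walks across all members, which is why striping pays off mainly for linear access:

```python
# Minimal RAID0 stripe-mapping sketch (assumed 128 KiB stripe size,
# two drives). Returns the set of member drives a read would touch.
STRIPE = 128 * 1024
DRIVES = 2

def drives_touched(offset, length):
    first = offset // STRIPE
    last = (offset + length - 1) // STRIPE
    return {stripe % DRIVES for stripe in range(first, last + 1)}

print(drives_touched(offset=1_000_000, length=4096))       # small random read -> one drive
print(drives_touched(offset=0, length=8 * 1024 * 1024))    # 8 MiB sequential read -> both drives
```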

 

Anyway, there is a PCIe lane issue, so it won't work regardless; I'm 99% sure.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


More lanes are coming with Threadripper and Skylake-X.

 

Fortunately we won't have to worry about a lack of PCIe lanes for much longer.

 

 

Regarding RAID0: the key will be when the VROC-capable boards come out for Intel motherboards. That is RAID0 routed through the CPU directly; it doesn't even go through the DMI 3.0 link to the chipset.

 

Should be blazing speed!




 

 

I would rather invest in Threadripper, plug a big fat Adaptec RAID controller into an x8 PCIe slot, add 2 to 16 SSDs, and forget about that VROC thing. The unlock key is so expensive that I would rather add a bit more money and get a real dedicated hardware RAID-on-Chip device.

 

Threadripper would have enough lanes to host two of those cards, i.e. 32 SSDs running at full speed ;)

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 



 

I think the VROC unlock key is only needed for RAID levels other than RAID0. I don't know what the advantage of RAID redundancy with SSDs would be anyway; I wouldn't dedicate redundant M.2 SSDs to data security, and would rather just back up to a RAID hard-disk array or maybe a regular 2.5" SSD array.

 

 

There haven't been any VROC benchmarks yet, but I would imagine that any drive, even non-RAID, that goes through the CPU lanes directly without passing through DMI is going to have blazing performance for games that stream data from storage while in the simulation. We all know how big and detailed the new maps are, and I doubt that RAM and VRAM will be preloaded with every possible texture and model for a map.

 

 

My goal is to minimize stutter and get the smoothest possible simulation. I am hoping that AMD has some similar technology in the works, because the PPF ratio for Threadripper is so much better than Intel's at the moment.




 

 

Yes, from what I have read, RAID0 is free on VROC. The prices for any other RAID level are such that one could really consider a proper RoC controller instead and get many more features you will appreciate once the shit hits the fan. I am speaking from long experience with RAID systems; I have seen many RAIDs fail over time for different reasons: broken backplanes, bad drives, broken controllers, faulty cables, etc. The good thing is, to this day I have never lost a single bit on those production servers.

 

If one only aims for speed, heck, go for it: connect two NVMe PCIe drives over VROC and call it a day.
I would not call that RAID, though, as it lacks the "R" (redundancy) the whole idea is named after.

 

Even with a RAID-1 things may go wrong if your board dies and the key becomes invalid along with the broken board (new board = new key needed, AFAIK). With a proper controller you just get another RoC card, plug it in and carry on; that will not work the same way with VROC, as far as I can tell from what I know.

 

The most critical part of RAID is keeping your data redundant and always having options to counter a failure in each and every component involved.

 

With the PCIe lanes on X299 I don't see a realistic way to fully equip a workstation in a way that makes sense. You want 1-3 GPUs, 1-2 10 Gbit NICs, multiple NVMe drives... that sounds more like Threadripper to me: more lanes, more bandwidth, more raw power when you need to shuffle lots of data, including from one PC to another, hence the 10 Gbit NICs. 1 Gbit is obsolete; roughly 112 MB/s does not match the speed of the other "new & fast" parts of our machines. The 1 Gbit NIC is my bottleneck when I copy data from PC to PC with plain SATA SSDs involved, never mind NVMe.
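To put that bottleneck in numbers, a back-of-the-envelope sketch for copying 100 GB (the throughput figures are rough rule-of-thumb values, not measurements):

```python
# Back-of-the-envelope copy times for 100 GB over different links.
# Throughput figures are rough practical rule-of-thumb values.
payload_gb = 100
rates_mb_per_s = {
    "1 GbE":    112,   # ~1 Gbit/s minus overhead
    "10 GbE":  1100,   # ~10 Gbit/s minus overhead
    "SATA SSD": 520,   # typical sequential read
}
for name, rate in rates_mb_per_s.items():
    seconds = payload_gb * 1000 / rate
    print(f"{name:>8}: {seconds / 60:.1f} minutes")
```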

 

We will see how VROC is received; I doubt it will play a significant role anywhere except for gamers running RAID0 who don't care about redundancy or an easy way back from disaster.

 

I also haven't read that it supports RAID-6 or 60, which is my favourite for highly protected data. I have seen two drives fail in one array before; luckily it was RAID-6 and it kept going until the new drives were in. It does happen, and it will happen again ;)

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 



 

Good points.

 

Do you see a high failure rate of SSDs? Or are you specifically talking about spinning platters?

 

We're not sure about the 7920X and above Intel chips either; they might have 64 lanes like Threadripper, though we'll find out in a few months.




 

 

I watch over/service about 20 PCs running on SSDs, including my own few. All but my MacBook Pro are based on the Samsung 850 Pro, all 256 GB except two 512 GB drives in a server. Not a single failure on those to date. I built all of those PCs as well, and I never sell anything but the 850 Pro; if you want an Evo, go elsewhere ;) !!

 

My MacBook Pro's original SSD (a Samsung 830 Evo, AFAIK) failed after three years and one month of heavy usage (lots of VMware and Boot Camp). I guess I had worn it out. I now have a 480 GB third-party drive in it (not Samsung, as it is a special form factor that only Apple and two other vendors make). I really hope that one lasts longer, but I have my doubts; I trust Samsung SSDs and barely any other.

 

If I were to buy NVMe I would only buy the Samsung 960 Pro, never an Evo, to be honest. But my usage is more workstation-like; for a traditional home PC the Evos are fine. In fact I got a few for free and use one of them in my PC as my third SSD; it's OK for my son's games :)

 

One should know that SSDs die an instant death; there is ZERO warning. Keep that in mind, back up your data while you can, and don't get lazy about it. HDDs usually gave a warning; SSDs don't.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


Regarding HDDs, hard to say, but my worst experience ever was a few years back with my last workstation: a 16-port Adaptec SAS RAID controller and 16 Samsung drives. One after the other failed!!! I love their SSDs but wouldn't buy an HDD from them anymore. Good thing I had RAID-6 everywhere; that saved my data. I used 250 GB and 2 TB drives, SATA (not SAS).

 

I have seen them all fail in 24/7 usage if they are SATA based, no matter what it says on the box ("NAS certified" and so on); they exchange the drive, but you are left with the hassle.

 

On the production servers I service I only run Dell-provisioned SAS drives and SSDs. Those cost around 4-8x the price, but I have only seen two drives fail on those servers so far, over seven years of 24/7/365 operation. That's good reliability, but it's just me and the servers I manage; others may have totally different experiences.

 

Samsung released new 2.5" SAS SSDs at around 400 € for 400 GB; I want to look into those for workstations and self-built servers. They have never been this cheap and reliable (mixed-use type drives). They just won't run on SATA connectors; you have to invest 500-1500 € in a RoC controller and cable set to connect them.

Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


Anyone using such an array with success? Why use RAID0 with DCS, and most importantly, why NOT use RAID0 with DCS?

 

I'm using 3 SSDs in RAID 0 and it works very nicely; I'm getting about 3x the performance of a single drive.

IAF.Tomer

My Rig:

Core i7 6700K + Corsair Hydro H100i GTX

Gigabyte Z170X Gaming 7,G.Skill 32GB DDR4 3000Mhz

Gigabyte GTX 980 OC

Samsung 840EVO 250GB + 3xCrucial 275GB in RAID 0 (1500 MB/s)

Asus MG279Q | TM Warthog + Saitek Combat Pedals + TrackIR 5



I'm using 3 SSDs in RAID 0 and it works very nicely; I'm getting about 3x the performance of a single drive.

 

Sure, but that is all routed through the PCH and falls under the DMI bandwidth limitation. The OP was asking about PCIe NVMe, which is a totally different game... and somehow not.

 

 

Add a few more drives and you will hit the DMI barrier. DMI also serves several other devices, and they all share those lanes: Gigabit LAN, USB 2.0/3.0/3.1, sound, and all the other PCIe slots on your board.
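As a rough sanity check: DMI 3.0 is effectively a PCIe 3.0 x4 link, about 3.9 GB/s usable, and everything behind the PCH shares it. In the sketch below, the 1500 MB/s figure is the RAID0 number quoted above; the other per-device rates are assumptions for illustration.

```python
# Approximate DMI 3.0 headroom check. DMI 3.0 is roughly a
# PCIe 3.0 x4 link (~3900 MB/s usable); all PCH devices share it.
DMI_MB_S = 3900
pch_traffic_mb_s = {
    "3x SATA SSD in RAID0": 1500,  # figure quoted in this thread
    "Gigabit NIC":           112,  # assumed concurrent transfer
    "USB 3.0 external disk": 400,  # assumed concurrent transfer
}
used = sum(pch_traffic_mb_s.values())
print(f"Concurrent PCH traffic: {used} MB/s of ~{DMI_MB_S} MB/s DMI budget")
print(f"Headroom: {DMI_MB_S - used} MB/s")
```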

 

You do know that with three drives you have 3x the chance of 100% data loss, since losing any one of the three kills the whole array?!



Gigabyte Aorus X570S Master - Ryzen 5900X - Gskill 64GB 3200/CL14@3600/CL14 - Asus 1080ti EK-waterblock - 4x Samsung 980Pro 1TB - 1x Samsung 870 Evo 1TB - 1x SanDisc 120GB SSD - Heatkiller IV - MoRa3-360LT@9x120mm Noctua F12 - Corsair AXi-1200 - TiR5-Pro - Warthog Hotas - Saitek Combat Pedals - Asus PG278Q 27" QHD Gsync 144Hz - Corsair K70 RGB Pro - Win11 Pro/Linux - Phanteks Evolv-X 


 

You do know that with three drives you have 3x the chance of 100% data loss, since losing any one of the three kills the whole array?!

 

 

I do know; that is why I have a NAS (RAID 10) and don't keep anything critical on the RAID0 array (mainly games).

 

P.S. I got those three drives for a ridiculous price, so... why not :)

IAF.Tomer

My Rig:

Core i7 6700K + Corsair Hydro H100i GTX

Gigabyte Z170X Gaming 7,G.Skill 32GB DDR4 3000Mhz

Gigabyte GTX 980 OC

Samsung 840EVO 250GB + 3xCrucial 275GB in RAID 0 (1500 MB/s)

Asus MG279Q | TM Warthog + Saitek Combat Pedals + TrackIR 5


