DIY Standalone RAID

Posted by: Waragainstsleep

DIY Standalone RAID - 08/20/07 04:15 PM

I have a case from a rack mount UPS which looks like it could happily hold a bunch of drives and is deep enough to hold a few other things. I want to build a computer in there to host some drives. Basically I'm thinking it will need PCI for a SATA card, plus gigabit ethernet. Besides that, I want it to be able to share via AFP and ideally use Bonjour. It will need to run either software RAID or have a compatible low-cost SATA RAID card available, and run VNC, preferably working with Apple Remote Desktop.

I figure this leaves me the option to run some flavour of Linux, or try to build a hackintosh. Will Linux run the services I want?
I'm looking for suggestions for both software and hardware solutions. I need the smallest board possible with plenty of SATA ports, or expansion slots for them. Overall system speed and CPU are not immensely important; something a year or two old will probably suffice. The point is to do this as cheaply as possible. I'm not looking for Xserve RAID performance. I just want to be able to hold large amounts of data with redundancy, and the hardware only needs to do a reasonable job of keeping up with gigabit ethernet. It's for streaming media across a local network and keeping it safe.
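
As a rough sanity check on the gigabit requirement, here is a quick Python estimate; the bitrates and the ~70% usable-throughput figure are ballpark assumptions, not measurements:

    # Will GigE keep up? Assumed bitrates for typical media, and ~70% of
    # line rate as a realistic ceiling after protocol overhead.

    GIGE_USABLE_MBIT = 1000 * 0.70

    streams = {"MP3 audio": 0.32, "DVD-quality video": 8.0, "HD video": 25.0}  # Mbit/s

    for name, mbit in streams.items():
        print(f"{name} at {mbit} Mbit/s: ~{int(GIGE_USABLE_MBIT / mbit)} concurrent streams")

Even high-bitrate video leaves a lot of headroom on a gigabit link by these numbers.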
Posted by: zenstate

Re: DIY Standalone RAID - 08/20/07 08:38 PM

Any Shuttle PC should do the job.  That way you could have onboard SATA and not need a PCI card, and in turn take up less space.  The mobos are pretty small in those, and since it's x86 hardware you could run any version of Linux you want.
Posted by: Waragainstsleep

Re: DIY Standalone RAID - 08/20/07 11:58 PM

But how many SATA ports are we talking? This box I have could hold 10 drives or more, no problem. That's the sort of size I'm going for.
Posted by: Waragainstsleep

Re: DIY Standalone RAID - 08/22/07 11:52 AM

OK, my new plan involves recycling the old logic board and CPU from a DA I used to use at work. It was donated when one of the USB ports came out of its socket. If you shove the port back in and leave a cable connected, it works perfectly; otherwise it powers off or crashes. That isn't an issue in this case, since I'll rarely need even one USB port on it.
Anyone know the maker of this card:

http://eshop.macsales.com/item/Norco%20Technologies/NORCO4618/

It ticks all the boxes, except that I'd really have loved to have RAID 5 in place.
Anyone know anything about this one:

http://www.newegg.com/Product/Product.aspx?Item=N82E16816115022

The price isn't too much worse, since I might well have used two of the cheaper ones. If I only have one card, I might be able to use a riser to turn it 90 degrees, flat against the board. That would give me plenty of space for the drives and some cooling.
Posted by: zenstate

Re: DIY Standalone RAID - 08/22/07 01:52 PM

The first one you linked from OWC must be decent, as they only carry quality products.  eSATA gives you more options too.
Posted by: Protocol6v

Re: DIY Standalone RAID - 08/22/07 05:13 PM

DON'T BUY THE NORCO!!! I bought it, and it's based on the Silicon Image chipset. The driver Silicon Image provides causes kernel panics on any version of OS X older than 10.2. I have tried just about every card except the HighPoint, and they are almost all based on the Silicon Image chipset. I settled on the Sonnet Tempo; it runs great and supports up to 20 drives with port multipliers. I have heard good things about the HighPoint, though.
Posted by: Waragainstsleep

Re: DIY Standalone RAID - 08/23/07 12:51 AM

OK, thanks for the tip. Does the Sonnet do RAID 5? The more I think about it, the more essential RAID 5 becomes. I need redundancy; otherwise I may as well keep shoving bigger single drives into my MDD or Xserve.
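
For what it's worth, the redundancy RAID 5 buys comes from simple XOR parity: lose any one drive and its blocks can be rebuilt from the survivors. A toy Python sketch (block values made up):

    # RAID 5 in miniature: parity is the XOR of the data blocks, so any
    # single missing block is recoverable from the rest plus parity.

    d1, d2, d3 = 0b10110010, 0b01101100, 0b11000101   # three "data drives"
    parity = d1 ^ d2 ^ d3                             # the extra stripe the array stores

    rebuilt = d1 ^ d3 ^ parity    # pretend drive 2 just died
    assert rebuilt == d2
    print(f"recovered block: {rebuilt:08b}")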
Posted by: Protocol6v

Re: DIY Standalone RAID - 08/23/07 06:13 AM

No, I don't think it does. The HighPoint you linked to on Newegg would be a good one to try.
Posted by: zenstate

Re: DIY Standalone RAID - 08/23/07 10:25 AM

Sorry if my advice sucked.  I have no experience with SATA RAIDs.
Posted by: Waragainstsleep

Re: DIY Standalone RAID - 08/23/07 12:20 PM

If I go this route, using the DA bits and the HighPoint card from Newegg, I can get 8 hard drives running happily in the box. (Whether I can afford the drives is another matter.)
There is room to add another HighPoint card later on with a second bank of 8 drives. The beauty is that I can add a dual-CPU module to the board (I can, can't I?), in which case I should literally double my storage with no performance hit.
I have thought about Fibre Channel, but I think gigabit ethernet will be plenty for my needs. I can add a second one of those too. It would be worth testing the two separate RAID sets with one NIC and an IP each, to see if that's faster than running both over a single NIC.
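
A minimal Python sketch of the sort of throughput test that comparison would need, assuming Python is available on both ends; the port number is an arbitrary placeholder:

    # Crude point-to-point throughput test: run "recv" on the server,
    # then run the sender against each of the server's IPs in turn.

    import socket, sys, time

    CHUNK = 64 * 1024
    TOTAL = 512 * 1024 * 1024   # push 512 MB per run

    def receive(port=5001):
        srv = socket.socket()
        srv.bind(("", port))
        srv.listen(1)
        conn, _ = srv.accept()
        while conn.recv(CHUNK):   # drain until the sender closes
            pass

    def send(host, port=5001):
        s = socket.socket()
        s.connect((host, port))
        buf, sent, t0 = b"\0" * CHUNK, 0, time.time()
        while sent < TOTAL:
            s.sendall(buf)
            sent += CHUNK
        s.close()
        print(f"{host}: {sent / (time.time() - t0) / 1e6:.1f} MB/s")

    if __name__ == "__main__":
        receive() if sys.argv[1] == "recv" else send(sys.argv[1])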

Since I already have almost everything I need bar the controllers and the drives, I figure I can get the first bank up and running at 4TB (unformatted) for about £650, which is $1300. That should come out over 3TB with parity and formatting. Not bad at all compared to an Xserve RAID.
I could settle for 2.4TB for £460/$920, which should end up around the 2TB mark.
A 1TB Xserve RAID costs £4200 over here.
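
A quick Python check of those figures, assuming eight 500GB drives in a single RAID 5 set:

    # One drive's worth of capacity goes to parity, and decimal "drive GB"
    # shrink further once the OS counts in binary units.

    drives, size_gb = 8, 500
    raw = drives * size_gb                 # 4000 GB unformatted
    usable = (drives - 1) * size_gb        # 3500 GB after parity
    as_reported = usable * 1e9 / 2**40     # decimal GB -> binary TB

    print(f"raw {raw/1000:.1f} TB, after parity {usable/1000:.1f} TB, "
          f"~{as_reported:.2f} TB as the OS sees it, before filesystem overhead")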

I must stop getting excited about projects I can't afford to finish.
Posted by: TCPMeta

Re: DIY Standalone RAID - 08/23/07 05:56 PM

Sounds like my first high-end file server back in '98: an old AMD 486 DX2 133MHz on a Biostar Socket 4 motherboard with built-in IDE and 2 PCI slots. I tossed it into an old SGI server case with 4 IDE hard drives and 7 SCSI hard drives. I got around 90GB of space in all and used Linux to run the system. I didn't have anything special running on it, just FTP and SSH.

Your system sounds sweet as heck, but why would you need all of that hard drive space? The largest drive I own is 250GB and I've only used 20GB of it. And what are you using to power all of those drives? I doubt a DA power supply can handle all of that without stressing out.
Posted by: Waragainstsleep

Re: DIY Standalone RAID - 08/24/07 12:13 AM

You're quite right. I suspect I will need to get a beefed-up PSU and mod it to run the DA board, but they are pretty cheap these days. Suggestions on wattage are welcome. Could I run 8 (or even 16) drives and a DA from 600W?
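
For a rough wattage estimate, a Python sketch; the per-drive spin-up and idle figures are assumed typical values for 7200rpm drives of this era, not measured numbers:

    # Spin-up draw dominates: drives briefly pull far more starting up
    # than they do at idle, so the worst case is all drives spinning up
    # at once. Board + fans assumed ~150W, with 30% headroom on top.

    def psu_estimate(n_drives, board_w=150, spinup_w=28, idle_w=9, headroom=1.3):
        worst = (board_w + n_drives * spinup_w) * headroom
        steady = board_w + n_drives * idle_w
        return worst, steady

    for n in (8, 16):
        worst, steady = psu_estimate(n)
        print(f"{n} drives: ~{steady:.0f}W steady state, budget ~{worst:.0f}W for spin-up")

By these numbers 600W covers eight drives comfortably but looks marginal for sixteen, which squares with the dual-supply suggestion below.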

As for what's going on it: my music and video libraries, mainly. But it will also be used to back up systems on my home network. I might run portable home folders off it too; not sure just yet. The Xserve will probably end up running a mail server among other things, so that will archive to the RAID too.

I have sooo much media it's not even funny. I estimate I'm around the terabyte mark at the moment, but if I had two or three more I could be even less picky about what I add to it. :)
The current terabyte is spread across 3 machines in two buildings, on 9 hard drives and another RAID (at work).
I just want it all centralised and fault tolerant.
Posted by: Protocol6v

Re: DIY Standalone RAID - 08/24/07 06:46 AM

At the 16-drive point, I would split it into two banks and use dual 600W supplies.
Posted by: zenstate

Re: DIY Standalone RAID - 08/24/07 06:58 AM

16 drives?  I would say that's overkill unless you're running a web server.
Posted by: MeltedRabbit

Re: DIY Standalone RAID - 08/24/07 09:04 AM

Personally, for this project I would look at how much money I had and how much storage I really needed.  I must admit I am not familiar with what a "DA" is, but unless it has at least two 64-bit 2GHz processors, I would not use it for this project.  It sounds like it only has PCI-X slots.  That is not necessarily a bad thing, but if you were to go with a new motherboard with multiple x4, x8, or x16 PCIe slots you would have more bandwidth and less bus contention.  On Intel chipset server motherboards, all of the PCI-X slots are funneled through a single controller that has only a x4 electrical PCIe width, and external PCI-X Magma expansion chassis also use only a x4 PCIe slot.

From my research on SATA RAID setups: say you want 4TB with RAID 0+1 for fault tolerance (RAID 5 at these prices would use the CPU for the XOR operation and would be very slow as a result).  For that you would need sixteen 500GB drives or eight 1TB drives.  At about $100 per 500GB drive, you are looking at roughly $1600 for the drives alone.  Then you would need a case for these disks.  On Newegg there is a pretty barebones case in the external enclosures section from AMS that has four drive bays and a four-port SATA port multiplier, which runs $200; it is the least expensive case on Newegg with a SATA multiplier.  Buying four of those would set you back another $800.  Add in the Norco SATA PCI-X card for $80 and you are looking at roughly $2500, or more if you want 8MB- or 16MB-cache drives.  If you think four of those cases would be too big, you could use a PC tower case and move the SATA multipliers out of the enclosures and into the computer case with the drives.  It would require some hacking, but this is macmod.com.  If you wanted fewer drives I would use 1TB drives, but then you are looking at roughly $3200 for a similar setup with eight 1TB drives at $350 each.

If you want fewer drives, or even if you just have some time, you should consider waiting until the 1TB Samsung F1 series drives are available.  These 1TB drives use three platters, which should be more reliable than the four- or five-platter 1TB drives from their competitors.  They were supposedly going to be shipped to OEM customers on 8/16/07, so they should be available within a month.  However, I doubt the Samsung drives will cost less than $250, which is what eight of them would need to cost to match the price of a similar sixteen-drive setup using 500GB drives.  I'd also wait to see the F1 series because the 500GB model might cost less than an equivalent drive from a competitor.  I am not trying to sound like a Samsung salesman; I guess I like their drives because in the past they have generally been less expensive and I have had no quality problems with them.  Also, the F1 series has a maximum platter density of 334GB, the highest in the industry, so the drives should be more reliable than counterparts that use more platters and heads.

Just to sound like a broken record, since it seems to have been suggested as an option several times: RAID 5 at the low end is always going to have the XOR operation done by the CPU and will be slower than molasses in January.  I don't think it should even be an option you consider.
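
Tallying those two configurations in Python, using the rough 2007 prices quoted above:

    # $100 per 500GB drive, $350 per 1TB drive, $200 per 4-bay AMS
    # enclosure with port multiplier, $80 for the Norco PCI-X card.

    def build_cost(n_drives, drive_price, enclosure_price=200, card=80):
        enclosures = -(-n_drives // 4)    # 4 bays per case, rounded up
        return n_drives * drive_price + enclosures * enclosure_price + card

    print(f"16 x 500GB: ${build_cost(16, 100)}")   # ~$2480
    print(f" 8 x 1TB:   ${build_cost(8, 350)}")    # ~$3280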
Posted by: TCPMeta

Re: DIY Standalone RAID - 08/24/07 02:03 PM

PCI-E didn't come to the Mac until Apple switched over to Intel CPUs. The DA (Digital Audio) is an older G4 system; it uses a 133MHz FSB and PC133 SDRAM. Check this out: http://www.macgurus.com/productpages/sata/lycesata4.php
Posted by: MeltedRabbit

Re: DIY Standalone RAID - 08/24/07 03:50 PM

I figured that DA referred to the "Digital Audio" G4, but sorry, I just didn't expect that someone would seriously think about using anything less than a 64-bit CPU like a G5 or an Intel Xeon system.  Besides, most older systems probably cannot properly deal with filesystems above 2TB.  Even though Sonnet makes a dual 1.8GHz G4 upgrade for the DA, a dual 1.8GHz G4 would be insufficient for any serious server.  The maximum I/O throughput of both the CPU bus and the 66MHz/32-bit PCI interconnect of such a system would be too low for a reasonable server, especially with RAID 5.  Yes, the PCI slots are 33MHz/64-bit, but the connection for the south bridge and many of the peripherals is 66MHz/32-bit PCI; a PCI-to-PCI bridge chip bridges the 66MHz/32-bit PCI into 33MHz/64-bit PCI on the DA.  The reason 66MHz/32-bit PCI slots never caught on with x86 computers, or even other Macs, is that the timing and signaling requirements are too demanding.  PCI is a load/store architecture like a RISC processor, and the decode access window for instructions on 66MHz PCI is 2ns, which is not very much time and is hard to design for; AGP and PCI-X add time for certain operations and relax the signal requirements.  Besides, the gigabit ethernet on the DA is slower than a modern PCIe GigE card.

As much as I like Macs, for the cost of a 1.8GHz DA upgrade from Sonnet ($500), one could, for exactly the same price, buy two generic 1GB DDR2 800 DIMMs, an Asus M2R32 Socket AM2 motherboard (AMD 580X chipset), and a top-of-the-line AMD Athlon 64 X2 6400+ dual-core Socket AM2 processor running at 3.2GHz.  Such a system would not even be remotely comparable to a DA: the I/O throughput would be much higher, and the system would in general be much faster than a dual 1.8GHz DA.  Granted, the AM2 system would still need a CPU cooler, case, PSU(s), and video card, but one should be able to pick those up for less than $240, the cost of four decent 512MB PC133 DIMMs for the DA.  Not that the DA could necessarily handle 2GB of RAM properly anyway.
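
To put numbers on the bus argument, peak theoretical bandwidth is just clock times width; a quick Python comparison (real-world throughput is lower, and note that the 64-bit slots alone can already saturate the 66MHz/32-bit bridge link):

    # Peak theoretical bus bandwidth: clock x width.

    def pci_mb_s(clock_hz, width_bits):
        return clock_hz * width_bits / 8 / 1e6

    print(f"33MHz/32-bit PCI: ~{pci_mb_s(33e6, 32):.0f} MB/s")
    print(f"66MHz/32-bit PCI: ~{pci_mb_s(66e6, 32):.0f} MB/s")
    print(f"33MHz/64-bit PCI: ~{pci_mb_s(33e6, 64):.0f} MB/s")
    # PCIe 1.0: 2.5 GT/s per lane with 8b/10b coding = 250 MB/s per lane per direction
    print(f"PCIe x4: ~{4 * 250} MB/s each way")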

The case you mention probably would work fine, but I thought the AMS case would be better because I personally think the removable drive trays on more expensive external cases are not of much use.  Besides, the SATA multiplier chip used in every SATA multiplier on the market is the same one from Silicon Image.  It also shouldn't matter which SATA multiplier case you buy, as there is no driver software for the multiplier itself aside from the driver used by the SATA controller.

Also, since I am a nitpicker: there are PowerMac G5 PCIe systems, but they are probably not very common, as IIRC they were not available for long and probably didn't sell too well, since many people may have been waiting to buy an Intel Mac Pro.

http://developer.apple.com/documentation/Hardware/Conceptual/PowerMac_G5_05Oct/index.html
Posted by: TCPMeta

Re: DIY Standalone RAID - 08/24/07 05:44 PM

A 64-bit CPU setup is overkill just for a home RAID server. At most a dual-CPU setup would be fine, but even that is a bit much. I've seen people use old IBM Pentium Pros as home RAID file servers and they ran perfectly fine. It mostly comes down to the cache on the drives and the RPMs. You can have the world's fastest computer, but with a 4800RPM, 4MB-cache drive it will run slow as heck on file transfers and on loading files into swap or VM. Even if the system used 10/100/1000 NICs, it wouldn't be much faster unless all of the systems connected to it had the same type of NICs. In the end it's the speed of the drives and the LAN traffic.
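
A quick Python comparison of assumed sustained drive rates against what the wire carries (ballpark figures for the era, not benchmarks):

    # The drive-is-the-bottleneck argument in numbers.

    gige_mb_s, fast_e_mb_s = 125, 12.5    # 1000 and 100 Mbit/s line rates in MB/s

    drives = {"4800rpm, 4MB cache": 30, "7200rpm, 16MB cache": 70}   # MB/s sustained

    for name, rate in drives.items():
        print(f"{name}: ~{rate} MB/s from the platters -> "
              f"~{min(rate, gige_mb_s)} MB/s over GigE, "
              f"~{min(rate, fast_e_mb_s)} MB/s over 100Mbit")

On 100Mbit the network caps everything; on GigE the drive itself becomes the limit.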
Posted by: Protocol6v

Re: DIY Standalone RAID - 08/25/07 09:13 AM

The dual-core G5s had PCI-E. And you definitely don't need a 64-bit system to run a large RAID or RAID 5 setup. I have a 3.5TB Xserve RAID running on a dual 500MHz G4 where I work. We also have a 7TB one running on a G5. The G4 doesn't show much of a performance decrease when you're accessing it over the network; it's all limited by the speed of your NIC anyway, as stated above. Sure, local things like a Finder copy to the RAID are slower, but that's only because the G4 doesn't have PCI-X.
Posted by: Waragainstsleep

Re: DIY Standalone RAID - 08/25/07 05:44 PM

The one requirement I didn't mention was that I wanted the system rack mounted; that's why I chose this old UPS case. The drives will not be in modules, so they won't be hot-swappable. I originally thought of building a PC and running some flavour of Linux on it, but I do want reliable AFP from it. The DA parts have the advantage of being free, and will also do iTunes and iPhoto sharing.
I also agree that 64-bit CPUs are overkill in a system where the PCI bus is going to be the bottleneck. The filesystem size limit is down to the OS (at least where Macs are concerned), apart from hardware limits on the drive controller chip (the DA's onboard controller will only do 128GB), and in this system that controller will be on the RAID card.
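
For reference, the 128GB figure is the classic 28-bit LBA addressing limit with 512-byte sectors; a one-liner check in Python:

    # The 128GB barrier: 2^28 addressable sectors of 512 bytes each.

    limit = 2**28 * 512
    print(f"2^28 sectors x 512 B = {limit / 2**30:.0f} GiB "
          f"({limit / 1e9:.1f} GB in drive-label units)")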
The system is going to hold 90% media, so streaming video is the only thing it has to do with any kind of urgency. I'm not using it for video editing or as a scratch disk for CS. I know I mentioned Fibre Channel, but I doubt I would get much benefit from it over gigabit ethernet.
I don't know much about the chips in the Xserve RAID, but I doubt they are 64-bit, and I doubt they are clocked very high, since they are dedicated chips, not general-purpose CPUs. They are probably just drive controllers with their own memory controllers and network interfaces added on. That's why I reasoned a G4 at 700MHz or so would do a reasonable job. If I find myself accessing the RAID from two other Macs at once, I could add the dual CPU then, which should help with concurrent access, but I'm unlikely to want to watch two videos at once.
As for choice of drive, I hear a lot of people swear by certain brands or models, but I see dead drives of all makes all the time. Such user testimonials may have been accurate when drives were at 20 or 30GB, but the bigger and faster they get, the less reliable they become. I always recommend buying the drive with the longest warranty, and having fault tolerance. But when you are talking terabytes on a tight budget, full mirrored backups or archives are inefficient and expensive.