Advice on Mac Server for Laboratory Setting

Discussion in 'Mac OS X Server, Xserve, and Networking' started by MAH11, May 8, 2014.

  1. MAH11 macrumors newbie

    Joined:
    Sep 24, 2010
    #1
    For the past decade, we have had a small network of ~20 Macs and ~50 users who use remote home directories served from an Xserve in an undergraduate laboratory environment. (The applications reside on the client Macs, of course.) This system has been rock solid, requiring little maintenance or oversight. Equally importantly, this setup prevents students from storing their work on the client Macs and from editing/deleting/moving important files. Because the faculty members who teach one of the courses that use the lab have been too lazy to update their handouts past Excel 2004 (!), the entire system is running an embarrassingly ancient version of Mac OS. For a variety of reasons, we need to upgrade to a more recent OS this summer.

    The issue is the server: an Xserve 2,1 (single 2.8 GHz quad-core Xeon processor) which cannot be upgraded past OS X Server 10.7.5 (Lion). There are two options:

    1. Keep the Xserve, but upgrade it to 10.7.5 while updating all of the client Macs to Mavericks.

    2. Purchase a new machine (which one?) to use as the server, run a newer version of the server software (which one?), and update all of the client Macs to Mavericks.

    Our IT department has security concerns about #1 and is pushing us to move to a Windows server (fat chance!), but they have little experience with (and low opinions of) Macs.

    Our server needs are quite modest, just serving the remote home directories. No e-mail, no remote updating of the client Macs. Additionally, the students are not storing massive data sets or editing video.

    After reading posts in various forums, I have two concerns. First, I have read disparaging words about the capabilities of recent versions of Mac server, but I'm not sure that these limitations will affect us. Second, I have read disparaging words about the speeds of Mac mini based servers, but it is not clear why this option would be significantly slower than the six-year-old Xserve.

    Which option would you choose and why? If you were to choose option #2, what hardware would you use (Mac mini? How much memory? SSD + Thunderbolt drive? Or Thunderbolt RAID?) and which version of the server (Mavericks server?).

    Thanks for your thoughts and advice,

    Melissa
     
  2. Cubytus macrumors 65816

    Joined:
    Mar 2, 2007
    #2
    Now I see your problem as a struggle to prove Macs are perfectly capable of doing this job! And probably better than any Windows server would do.

    Why not replace the current Xserve with an nMP (new Mac Pro)? Its all-SSD storage will surely come in handy when many students access their directories at the same time. You haven't listed the Xserve's RAM, though. A Mac Mini can't be upgraded much in this regard.

    Now, I heard the same thing about OS X Server, that the last decent one was Snow Leopard. I guess you'll have to make sure none of the applications has any issues with Mavericks. It's not like you are going to have much OS choice.
     
  3. chrfr macrumors 603

    Joined:
    Jul 11, 2009
    #3
    You need to stop using the Xserve. They're old and parts are difficult to find.
    I manage several Mac servers running Mavericks server in a similar but larger environment. They're reasonably stable but Apple has next to no interest in the server business. Do not dismiss a Windows server out of hand as it may actually be the best option, particularly if that's what your internal IT staff understand best. I likely will not continue with Apple servers at my next hardware refresh.
    Apple doesn't build a machine that is server level hardware anymore, as you know. OS X integrates extremely well with Active Directory.
    The disks in the Mini Server are too slow to perform well sharing out home folders, so if you opt to go with a Mini you should plan on buying external storage.
    Worst of all, you never know if or when Apple will just go ahead and discontinue support for Server.

    ----------

    The new Mac Pro isn't a good option for a server; you pay a steep premium for features that are of no benefit in this environment and the internal 1TB of storage is unlikely to be sufficient for 50 users and their home folders.
     
  4. MAH11 thread starter macrumors newbie

    Joined:
    Sep 24, 2010
    #4
    The current server has 6 GB of RAM. The Mac Mini can go up to 16 GB, so this shouldn't be a limitation.

    The biggest issue for me with going to a Windows server is that I do not want to lose the remote home directories. In my experience (20 yrs), students are not good about saving files in the correct location. This leads to unfortunate issues with academic integrity, as well as problems with students who always want to use the same computer (because that's where their files are).

    We have the same issue in the research center that I run. Staff save their files on their local machine even though they have ready access to a fast Windows server.

    In your experience, would the Mini performance be acceptable with a fast Thunderbolt drive? Is that the key parameter in spec'ing a new server?

    Thanks for all of your feedback!
     
  5. mvmanolov macrumors 6502a

    Joined:
    Aug 27, 2013
    #5
    i run a maxed out mini as a home server, mainly for file sharing... given that your max file transfer speed is limited by your LAN speed (and unless you are running GbE, that's 100MB/s), a TB drive won't do you much more good than a USB 3.0 drive.

    also a single drive may not be a) big enough and b) reliable enough. So you may want to consider a USB 3.0 RAID 5 or 10 array (depending on array size, storage needs etc.)

    One way to mitigate LAN speed limits (unless you have GbE, which btw the mini cannot take advantage of as you cannot upgrade the LAN card to GbE, though you could get a GbE-to-TB adaptor) is to set up Link Aggregation between the mini's built-in LAN and a TB-LAN adaptor. That way you get a max simultaneous transfer rate of 200MB/s, but that may still be slow depending on how many users are transferring files at the same time...

    Now if you buy an oMP (old Mac Pro) you can add more LAN cards and set up link aggregation over, say, 6 ports, thus having a theoretical max of 600MB/s. None of the client machines will be able to take advantage of that theoretical max on its own, but you will be able to have 6 clients transferring files at the 100MB/s LAN max...
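
    to put some rough numbers on that reasoning, here's a quick back-of-the-envelope sketch (python, purely illustrative; the ~100MB/s per-link figure and the client counts are ballpark assumptions, not specs):

        # illustrative only: real-world gigabit ethernet moves roughly 100 MB/s per link;
        # link aggregation adds capacity for concurrent clients, it does not make any
        # single client faster than its own link.
        PER_LINK_MBPS = 100  # assumed usable MB/s per gigabit link

        def per_client_share(links, clients):
            """Rough MB/s each client sees if all transfer at once (capped at one link)."""
            return min(links * PER_LINK_MBPS / clients, PER_LINK_MBPS)

        # 1 link = plain mini, 2 = mini + TB-to-ethernet adaptor, 6 = oMP with extra NICs
        for links in (1, 2, 6):
            for clients in (1, 10, 50):
                print(links, "link(s),", clients, "clients:",
                      round(per_client_share(links, clients), 1), "MB/s each")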

    the server app i have found to be relatively stable and for the most part robust enough for my needs... it is fairly intuitive to run and set up, though i have had some issues as i also use the server mini as an HTPC (i know, a big no-no for servers to also be user machines... but being a home user i couldn't afford a separate server mini... that being said, after some tinkering i have it all running quite sensibly now... uptime is currently at 17 days without issues, and only that low because i had to reboot after some under-the-hood tinkering; prior to that my uptime was at 23/24 days without issues)

    for your needs i would think that an oMP may also be better from the perspective that you can set up the RAID array internally, so you won't have to purchase an expensive USB 3 RAID box...

    in either case though, while i agree that apple is not necessarily interested in enterprise support, they are interested in SMB support.... so it is unlikely that they will drop the server app altogether (especially given that all the server tools are already built into the OS, thus the app is just the GUI front end of what you can already do under the UNIX base).

    And since you are not interested in running a server farm... a mac as a file sharing server can be just as useful as a dedicated linux server....

    good luck :)
     
  6. MAH11 thread starter macrumors newbie

    Joined:
    Sep 24, 2010
    #6
    Thank you for your thoughtful and detailed reply. Just to clarify, are you running the latest version of Mac server and OS? Also, your reasoning is based almost entirely on transfer speeds, which makes sense. What role, if any, does server memory play? Will maxing out the memory have any effect on server performance?

    Thanks again.
     
  7. mvmanolov macrumors 6502a

    Joined:
    Aug 27, 2013
    #7
    yes running latest os and server app.

    ram will increase overall performance, but so will an SSD. my mini has maxed out ram and a 128 EVO (startup drive) and it flies; that being said, you need to decide if you have the $$$ for that. the 16gig of ram for me was overkill as i never even come close to utilizing all of it, but i like the peace of mind. i would go with at least 8gig to be sure, however. also i'd get the quad i7 mini (if you decide to go for the mini) as that will make a difference if you are doing other things on it (running other services)

    :)
     
  8. Cubytus macrumors 65816

    Joined:
    Mar 2, 2007
    #8
    Sure there's no need for such a strong video card in a server. How much does it add to the machine cost?

    100MB/s is enough for most academic work. Besides, it's likely that the network is indeed 1Gbps-enabled. What is more of a concern to me are the simultaneous access requirements. With all 50 client computers in use, that gives only 20Mbps to each student, which isn't fast; still, it depends on what type of work they perform. It's unlikely to be a problem with only office files.
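
    A quick sanity check on those numbers (illustrative Python; it assumes a single saturated gigabit link shared evenly, ignores protocol overhead, and the 10 MB file is just a made-up example):

        # 1 Gbps shared evenly across 50 simultaneous clients
        link_mbps = 1000                         # gigabit ethernet, in megabits per second
        clients = 50
        per_student_mbps = link_mbps / clients   # 20 Mbps each
        per_student_MBps = per_student_mbps / 8  # 2.5 MB/s each

        example_file_mb = 10                     # hypothetical Excel workbook size
        seconds = example_file_mb / per_student_MBps
        print(per_student_mbps, "Mbps per student, i.e.", per_student_MBps,
              "MB/s; a", example_file_mb, "MB file takes about", round(seconds, 1), "s")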

    If 6GB currently manages it all without issues, I don't see why maxing it out to 16GB would be a problem. With RAM, the more you can afford the better.

    Maybe a Mac Mini with maxed out RAM and SSD inside would be enough? Of course, set up with a proper backup.
     
  9. chrfr macrumors 603

    Joined:
    Jul 11, 2009
    #9
    It's hard to say what it adds. You can't buy a Mac Pro without 2 graphics cards, and neither is of any value in a server. The OP's server needs aren't very CPU intensive either. You could just about buy 3 Mini servers for the price of the base Mac Pro.

    For networked home folders, an academic environment is a worst-case scenario. You typically have at least 20 users all hitting the server simultaneously at the start and end of classes as students log in and out, and home folders can take quite a while to sync at either end of the class. Gigabit ethernet, at least from the server to the switch, should be considered mandatory, regardless of server platform.

    The OP hasn't specified how much data they have, so it's hard to really offer useful advice here. I would hesitate to rely on internal storage alone, even if it's SSD, but network speed is the bottleneck even with 7200 rpm 3.5" disks which are faster than the stock 5400 rpm spinning disks in the Mini. Even 2.5" 7200 rpm disks will just about be fast enough to saturate gigabit ethernet. With a Mini server, external storage makes the most sense.
    8GB of RAM is likely more than enough. I run a 16GB Mac Pro server which at peak periods will have well over 100 connections, and it never uses all the RAM. In addition to being a file server, it's an Open Directory master, and hosts about 8 print queues.
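
    To make the disk-versus-network point concrete, here is a rough sketch (Python; the throughput figures are ballpark numbers I am assuming for typical drives, not benchmarks):

        # effective speed to clients is whichever is slower: the disks or the network
        GIGABIT_MBPS = 110  # assumed usable gigabit ethernet throughput, MB/s

        disks = {                                 # assumed rough sequential throughput, MB/s
            '5400 rpm 2.5" (stock Mini)': 90,
            '7200 rpm 2.5"': 110,
            '7200 rpm 3.5"': 150,
            'SATA SSD': 450,
        }

        for disk, speed in disks.items():
            effective = min(speed, GIGABIT_MBPS)
            bottleneck = "network" if speed >= GIGABIT_MBPS else "disk"
            print(f"{disk}: ~{effective} MB/s to clients (bottleneck: {bottleneck})")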

    ----------

    Every Intel Mac mini has had gigabit ethernet built in.
     
  10. Cubytus macrumors 65816

    Joined:
    Mar 2, 2007
    #10
    As I said, I have never come across an academic setting that wouldn't have GbE. What we have here are synchronized home folders, but they are deleted once every 24 hours. So indeed, logging on to the same computer would be a bit faster for students, but they don't store enormous amounts of files anyway. Would this scenario require more RAM than the current 6GB?

    With a conservative estimate of 1GB max per student, the storage requirement would be 50GB at worst. Nothing an SSD couldn't handle. Maybe an SSD for the students' home folder storage, and a spinner for the server OS?
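
    A quick sizing sketch along those lines (illustrative Python; the 1GB quota is the estimate above, and the headroom factor is just an assumption):

        students = 50
        per_student_gb = 1.0    # conservative per-student estimate from above
        headroom = 3            # assumed factor for growth, scratch space, the OS

        live_gb = students * per_student_gb    # 50 GB of home folders at worst
        total_gb = live_gb * headroom          # ~150 GB; any small SSD covers it
        print(live_gb, "GB live,", total_gb, "GB with headroom")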
     
  11. mvmanolov macrumors 6502a

    Joined:
    Aug 27, 2013
    #11
    gigabit ethernet is 100MB/s... GbE is 10 gigabit...

    ----------

    but that is rather expensive for limited space... a RAID 5 or 1/0 box will offer similar read/write speed at a fraction of the cost/GB
     
  12. chrfr macrumors 603

    Joined:
    Jul 11, 2009
    #12
    Mine didn't up until about a year ago.

    I'm not sure if 6 is enough. I wouldn't be comfortable with it myself, but I haven't looked to see what sort of usage my server gets at peak aside from noting that 16 is more than adequate.

    ----------

    GbE is Gigabit Ethernet. I'm not sure what you're saying here.
     
  13. Riot_Mac macrumors regular

    Joined:
    Nov 3, 2003
    Location:
    IL
    #13
    Mavericks Server is the best server.app since Snow Leopard. You will be just fine if you go with a mini with 16GB of RAM, an SSD, and an external TB RAID (RAID 10 if you can).
     
  14. mvmanolov macrumors 6502a

    Joined:
    Aug 27, 2013
    #14
    apologies, i mean 10 GbE
     
  15. dporvin macrumors newbie

    Joined:
    May 12, 2014
    #15
    I have our school lab's 2008 Xserve running 10.8.5 and plan on upgrading to Mavericks this summer. I had to swap out the stock video card for an NVIDIA GeForce GT 120 512 MB (luckily I had one sitting around) and edit a text file to add the Xserve to the list of supported machines in the installer. It works fine, but you have to re-edit the text file every time you update the OS.
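
    For the curious, that edit can be scripted with Python's standard plistlib module. This is only a sketch: the installer path, the key names, and the board/model IDs below are placeholders from memory, so verify each of them against your own installer (and keep a backup) before trusting it. As noted above, it has to be redone after every OS update.

        # sketch only: add an unsupported machine to the installer's PlatformSupport.plist
        # path, key names, and IDs are placeholders; verify against your installer first
        import plistlib

        PLIST = "/Volumes/OS X Install ESD/System/Library/CoreServices/PlatformSupport.plist"
        BOARD_ID = "Mac-F42289C8"   # placeholder board ID; check yours with ioreg
        MODEL_ID = "Xserve2,1"      # placeholder model identifier

        with open(PLIST, "rb") as f:
            data = plistlib.load(f)

        for key, value in (("SupportedBoardIds", BOARD_ID),
                           ("SupportedModelProperties", MODEL_ID)):
            if value not in data.setdefault(key, []):
                data[key].append(value)

        with open(PLIST, "wb") as f:
            plistlib.dump(data, f)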
     
  16. MacsRgr8 macrumors 604

    MacsRgr8

    Joined:
    Sep 8, 2002
    Location:
    The Netherlands
    #16
    I used to manage lots of Macs (G4 iMacs up until the first Intel-based iMacs) with central home dirs on all types of Xserves (G4 through the latest Intel 3,1), where the storage was first the Xserve RAID and later the Promise RAIDs.

    As a SysAdmin for OS X, having the Xserve meant being able to defend using OS X Server as our principal server OS.
    I left that company before the Xserve was discontinued.

    In 2011 it was game over for the Xserve. It was very tough for that company to defend using OS X Server, as the OS could no longer be installed on "server-grade" hardware, i.e. 19" rack-mountable, dual power supply, etc. And as OS X Server can only be virtualised on Apple hardware, using Apple's server OS became an issue.

    Since OS X Lion, OS X Server has been transforming into just an app. I wonder why Apple still calls it "OS X Server" and doesn't just call it something like "Services.app".

    So, now we're left with a cheap but great (!) little app that can only be installed on Macs which are not server-grade hardware.
    OS X Server.app has great features, but it is just hard to "sell" it as a real server OS, as it cannot (legally) be installed on server hardware or be virtualised on anything but a Mac.

    Since OS X Lion the Mac is easy to manage in large corporate networks where Windows servers are the norm. The "need" for Mac OS X Server diminished. Apple doesn't have to support the servers and can focus on what it does best: creating products for the end user, not the system admins.

    Just occasionally, OS X Server.app helps manage Macs and iPads a bit better: NetInstall / DeployStudio, Profile Manager's mobile configurations, and maybe the caching server for iOS.

    For the rest? It's a nice hobby. ;)
     
  17. MAH11 thread starter macrumors newbie

    Joined:
    Sep 24, 2010
    #17
    Very interesting. If I wanted to explore this option, which text file needs to be edited in the installer? Is there a description of this process somewhere?

    Many thanks,

    Melissa
     
  18. Moofo macrumors newbie

    Joined:
    Jan 12, 2011
    #18
    A foolproof way….

    I'm in the same situation as the OP… in an office with around 200 iMacs. Not a lot of them use server-based home directories, but...

    Personally, I would just put in an Oracle Sun X4 server running Oracle Linux (virtualized or not), with Helios FileServer software if you want to get fancy. Helios is not *absolutely* necessary; you could use NFS, which is built into Linux.

    Replace Open Directory with either Novell eDirectory or OpenLDAP.

    Remember, with the Oracle servers, the Oracle VM hypervisor is free. They have *4* 10 GbE ports on the back.

    I installed a pair of these wonderful servers during the Christmas break, and a consultant is just about done with the eDirectory setup; we are binding machines to it and it gives the same functionality. I virtualized a bunch of older servers onto the Oracle servers as well.

    Get rid of the Apple server environment (never thought I would say this) and get something more open running on another platform. Microsoft will gouge you on licenses, though, unless you are in the academic domain...
     
  19. Altemose macrumors G3

    Altemose

    Joined:
    Mar 26, 2013
    Location:
    Elkton, Maryland
    #19
    You can go with a Mini Server or even a classic Mini and install an SSD. The RAM is an easy upgrade, and it is affordable to get to 16 GB. I did my MBP (same RAM) in 2013 for $86, so I am sure it is cheaper now.

    Even if you get a Mac Pro refurb, or some other type of Mac, the software is not an issue. Mavericks is free, and the server portion is $20.00. Keep in mind that you will also need to budget for new Office software, as Office 2004 will not run on a newer OS (10.7+) due to the lack of Rosetta.

    If your switch supports Link Aggregation, you can use a Thunderbolt to Gigabit Ethernet adapter to hook the two together. It will increase the concurrent capacity to the server. Picture a 2-lane road where the cars move at 60 MPH, and a 4-lane road where the cars move at 60 MPH. They both go the same speed, but there is more space and capacity for cars (your network clients) to travel over.
     
  20. Consultant macrumors G5

    Consultant

    Joined:
    Jun 27, 2007
    #20
  21. Cubytus macrumors 65816

    Joined:
    Mar 2, 2007
    #21
    Can't argue against lack of budget.

    Never thought I could read "open" in the same sentence as "Oracle". They're probably the most opaque company, along with Google, when it comes to supporting customers. Plus, they gouge customers with overinflated license costs, and lock them in with proprietary technologies and inflexible products.

    Ever since my university moved to Oracle PeopleSoft to manage students, courses and personnel, they have had to dedicate 5 people full-time to help users navigate the maze. A totally new login method was required because it couldn't integrate with the existing one, students and employees had to learn a new code to access their data, and looking up courses has become unnecessarily complex and terribly slow, not to mention the ugly interface. Apparently PeopleSoft couldn't integrate with the existing course codes and uses a different one internally, which slows down the secretaries. And as a contract employee, don't even try to get a detailed breakdown of your hours: the guy responsible for assigning contracts prefers to print them one by one and make copies for workers. A failure in every regard.
     
  22. dporvin macrumors newbie

    Joined:
    May 12, 2014
    #22
    Hi Melissa. You'll want to edit the PlatformSupport.plist file. See this thread & scroll down to Macgolfers post...

    http://forums.macrumors.com/showthread.php?t=1404548&page=5

    Good Luck!
    -DP

     
