OS X Server for Managing a Home Network

Discussion in 'Mac OS X Server, Xserve, and Networking' started by haravikk, Jan 7, 2014.

  1. haravikk macrumors 65816

    May 1, 2005
    So I've always been interested in OS X Server for fooling around with, and I did work with it very briefly in the past, but now that it's so reasonably priced I'm wondering whether it's a fit for helping to simplify managing my home network.

    Basically I have a household with four Macs, one mine, the rest belonging to my much less tech savvy family, as well as some iOS devices too. So I was thinking that OS X Server might help me in managing their machines, but I have a few questions:

    User Accounts:
    How well do network user-accounts work in practice? Since most machines have several user accounts on them I figured it might make sense to switch to Open Directory and just give each family member their own account, purely for login purposes (i.e - their data would still be on the host machines).

    Does anyone do this, does it work reasonably well in practice or are there caveats I should be aware of? For example, is it possible to use Open Directory accounts but still have FileVault enabled on a machine, or is that impossible? Also, since my own machine isn't always on, I assume I'd need to set up OS X Server on some (or all) of the machines in order to provide redundant access to the accounts; how well will that work?

    Software Updates:
    How much control does OS X Server give regarding updates? One of the major issues I have is that no matter how many times I tell my family which things are safe to update when prompted, they always end up asking me anyway, or ignoring the updates entirely. I just popped onto one Mac Mini and found it hadn't updated Java in several months, for example, so you can imagine my annoyance ;)

    Anyway, with an OS X Server set as a software update target am I able to force installation of certain updates, or is it still up to my useless family members? What about installation of other applications, or even application updates? For example, can I push new versions of Flash onto their machines, or would I need to script that or something?

    One of the annoyances I have with my ISP-supplied router is that its support for local DNS is non-existent; it even seems to interfere with things like file sharing access, i.e. I can't access my Mac from another by entering "Haravikks-Mac-Pro.local" as the address. I could install my own router before my ISP's, but I've been really trying to cut down on unnecessary devices.

    Anyway, is it possible to set up machines in a network using OS X Server such that they will each function as their own DNS server, but inherit their records from a single machine (mine), so they're not dependent on my machine, but will get custom records from it? I do have a NAS, but unfortunately it's stuck on a part of my network that's still only 100Mbps, since it's actually in my garage.

    These are the main things I'm interested in, but I'd really appreciate anyone's thoughts or experiences regarding this kind of family setup. I know it's maybe overkill in some respects, but if technology can make my life easier, then I don't mind a few extra steps on setting it up :)
  2. mcnallym macrumors 6502a

    Oct 28, 2008
    User Accounts -

    The purpose of using OD accounts would be so that the user accounts reside in OD on the OSX Server, rather than being defined locally on each Mac. Make sure that the OD account has cached credentials on the Macs, and a local home directory on the startup disk of the Mac. If you cache credentials for the OD account locally and keep the home folder on the startup disk, then the OD account should be able to unlock a FileVault 2 encrypted disk at login. Make sure you enable the OD account for FileVault 2 unlocking.
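    As a sketch of that last step: on each FileVault 2 Mac you can check and add unlock users from the command line. This is only a sketch; the username "jane" is a placeholder for whatever short name the OD account actually uses.

    ```shell
    # List the users currently able to unlock this FileVault 2 volume
    sudo fdesetup list

    # Add a directory (or local) user to the unlock list; you'll be
    # prompted for an existing unlock user's password and then for the
    # password of the account being added
    sudo fdesetup add -usertoadd jane
    ```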

    Software Updates -

    OSX Server will cache updates that a user downloads; when another user requests the same update, it is served from the OSX Server cache. It is, however, still down to the users to choose to pull updates themselves.

    Apple Remote Desktop is what you would want from Apple to control software updates, install software packages onto machines, etc. You can force these through, so you would have control of updates. It doesn't require OSX Server and can run from a normal OSX client machine. It's £54.99 in the UK App Store for unlimited clients, so it hardly breaks the bank.

    DNS -

    Each OSX Server on the network would be a DNS server, and the other machines would use the OSX Server(s) as their DNS servers. No need to install on all of the machines. That way you can use names to access the various systems.
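    On the client side, pointing each Mac at the server for DNS is one command if you'd rather not click through System Preferences. A sketch only: the server address 192.168.1.10 is a placeholder, and it assumes the network service is named "Ethernet".

    ```shell
    # Point this Mac's Ethernet service at the OSX Server for DNS
    sudo networksetup -setdnsservers Ethernet 192.168.1.10

    # Confirm what the service is now using
    networksetup -getdnsservers Ethernet
    ```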

    If you're going to do it, then it's probably best to get set up on a Mini or other similar low-powered machine that you can leave on 24/7, so that it's always available. If OSX Server is on all the machines and they are powered off and on at various times, then it's really not worth bothering to do this.
  3. haravikk thread starter macrumors 65816

    May 1, 2005
    Thanks! Good to know there's clear support for that then.

    Another question I have actually is, who administrates each computer? Does each machine require a local administrator account, or can I use Open Directory for that as well, after some initial setup of course? Is it possible to set users to be able to administrate some machines but not others?

    Basically in my home network everyone has their own computer, which they may need administrator access to for simple things like installing apps. But I'd like to use Open Directory accounts to make it easier for everyone to switch to other machines if required. It's mostly me that needs to, but since each machine was bought at a different time their capabilities do differ, plus some peripherals are only connected to certain machines and so on, so hopping machines easily is potentially handy.

    You're right, thanks! Does Apple Remote Desktop actually require screen sharing to do this, or does it push updates through in the background?

    Well, I have a NAS that I believe I could set up as a DNS master, but since it's actually in a separate building (I have a brick-built shed outside) it's on a separate local network, and unfortunately that's only 100base-T ethernet, whereas my home network is gigabit. Plus, with backups going through to the NAS every so often, it's not super-responsive, so I figured that if one or more machines on my gigabit network were functioning as DNS slaves then it might avoid that problem? Like I say though, I don't know enough about DNS; I'd basically like to avoid routing all my DNS requests to my NAS if I can, but it's the only device I can really leave running all the time.

    I'd use my modem/router but it's supplied by my ISP and is completely terrible, but since it's a cable modem I don't think it's something I could just swap for a better alternative.
  4. mcnallym macrumors 6502a

    Oct 28, 2008
    I think you are better off just using Apple Remote Desktop to do your App Updates. It doesn't rely on Screen Sharing but actually uses the Remote Desktop Client on the laptop/desktop.

    The Package is then pushed to the Mac to be installed.

    If you aren't going to use a constant OSX Server, and have it installed on multiple computers being switched on and off I think you are going to cause yourself too many issues with keeping the servers synched to be worth using. You may as well just keep local accounts and define accounts for each person on each Mac.

    It's like asking the AD administrators to not keep the Domain Controllers online, but to be constantly switching them off and on, potentially not having a single Domain Controller on at some times.
  5. haravikk thread starter macrumors 65816

    May 1, 2005
    But isn't that the point of specifying alternate Open Directory servers? If each machine functions as an Open Directory server, then there should never be a situation in which a user can't login. At least two of the machines are online most of the day, so there also shouldn't be occasions where stuff gets out of sync for any length of time.

    I intend to get OS X Server anyway as it has simplified management of web-servers, wikis etc., which should make things easier for when I'm doing web-development work, as setting everything I need up and then not breaking it all shortly after is a surprisingly common challenge that faces me ;)
  6. chrfr macrumors 604

    Jul 11, 2009
    The problem is that you want the home folders to be available on a server that's always available.
    You can either have network homes which only exist on the server, or mobile homes which sync back to the server, but in either case, those home folders only exist on one server. The Open Directory replica server can provide backup for credential/account management but won't help with the home folder situation.
    As someone who manages this setup at work, it seems way too complex to be worthwhile at home, especially if everyone already has their own computer.
  7. haravikk thread starter macrumors 65816

    May 1, 2005
    Is it not possible to have no network home at all? All I really want to do is use the login credentials, they'd still end up with a local home folder on each computer they use (which is essentially how it is now anyway). Only other things I'd really want to sync would be things like keychains and app preferences; I can do keychains at least with iCloud though, and app settings aren't hugely important.
  8. chrfr macrumors 604

    Jul 11, 2009
    With 5 computers to manage, you will spend far, far less time just going to each and making an account than trying to come up with a way to do what you want with OS X Server. Just having a user get login credentials on the server but use a local home isn't a standard way of doing things so you'd probably need to do manual configuration on each system for it. (My knowledge on this is thin because I do use mobile home folders and haven't researched a way to not do so, so perhaps someone else can help here.)
    Also regarding software updates, the OS X Server Software Update Server does not provide a way to force updates, but rather just allows the server administrator to save bandwidth by only downloading specific updates. This will also allow the admin to limit which updates a user sees. It won't solve the issues you'd like to solve.
    For DNS, it sounds like your ISP provided router isn't working properly. If you can turn off the routing functions on it (I'm assuming it's a combo router/modem device) and then get a new 3rd party router, you'll be better off, with far less hassle than managing an OS X Server DNS setup.
  9. irnchriz macrumors 65816


    May 2, 2005
    I would recommend setting up NetInstall as well. It saves loads of time if you need to reinstall any of your Macs.
  10. AmestrisXServe macrumors 6502

    Feb 6, 2014
    What router are you using? If there is a DD-WRT firmware set for it, you should change your firmware to that. It gives you a great deal of extensive control, including NAT control.

    You can also get away with using VNC instead of ARD.

    Another perk of DD-WRT: You can use DDNS to resolve a hostname over the WAN to be able to administer, or use your home network when away. Things such as SSH, Web Server (apache2), NFS, and more become easily possible with a standard domain name and a $30/year Dynamic DNS plan from any of several dozen DDNS outfits.

    Then, combined with the NAT/QoS features, you can route incoming ports to standard ports on each system on your network, give each system a static IP lease (LAN only), and use SSH, telnet, VNC, etc, remotely for each system.

    Converting to DD-WRT costs nothing, given that a build exists for your router. I use Linksys WRT54GL routers, as I find them the most robust, but DD-WRT exists for a wide range of HW now.

    You can even host your own websites on your internal (home) network over port 443, and possibly port 81. DynDNS also has a service for port redirection, if port 80 is closed. (It's called Port-Hopping, and resolves a URL such as www.MyURL.co.uk to a non-standard port of your choice, such as port 8080.)

    With the NAT, you could assign a unique SSH port for each system on your LAN, such as 8122, 8222, 8322, etc., and resolve each to port 22 on specific IPs on your LAN. This gives you remote SSH management. (You can also do this with port 23 / telnet.)
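    If you go with a unique external SSH port per machine, a ~/.ssh/config on the machine you administer from saves you memorising the mapping. The hostnames and ports here are illustrative only:

    ```shell
    # Append host aliases to ~/.ssh/config so "ssh mac1" just works
    printf '%s\n' \
        'Host mac1' \
        '    HostName www.MyURL.co.uk' \
        '    Port 8122' \
        'Host mac2' \
        '    HostName www.MyURL.co.uk' \
        '    Port 8222' >> ~/.ssh/config
    ```

    After that, "ssh mac1" reaches the first Mac through the router's NAT without you having to remember which external port it was.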

    Finally, if you put VNC on each system, you can port-forward specific ports to each LAN IP. For example, if on your LAN you assign each VNC connection to port 6100, with the web (Java) interface as 6101, you can use the NAT to port-forward from a WAN connection using unique external ports for each IP, like 6110->6100 and 6111->6101 on one machine, and 6120->6100 and 6121->6101 on another.

    On your internal LAN, you can always address each IP with ports 6100 and 6101, but from a WAN (remote) connection you can access each system via the domain name (e.g. www.MyURL.co.uk) on ports 6110, 6120, etc., allowing access to each system with no hassles. This, IMHO, is better than ARD, as it gives more specific control, and simultaneous control over multiple systems.

    A home/personal-use version of VNC costs nothing. There is also OpenVNC (the older, open-source version from before RealVNC went closed-source), and other VNC forks based on that. VNC is also Windows and Linux compatible, and even runs on Android and Palm devices; plus it has a web (Java-based) interface that you can access from any system with a Java-friendly browser, including renegade and obscure OSes.

    This is why I favour it over ARD. I think that some operations may be more streamlined in ARD, but probably nothing worth spending money on for your use. I use VNC to administrate OSX, Linux, and Windows servers and desktop systems. (I wrote magazine articles and software manuals remotely using a Palm Treo some years back, connecting to my Mac workstation and running FrameMaker from another country to do it.)

    For Open Directory, you will want to make an OD/LDAP master on the server and slave everything else to it, but you will also want to make constant backups, as if that OD master is ever broken, then every system that relies on it will become a doorstop.
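    On the backup point, the OD master can be archived from the command line with slapconfig; a sketch only (the destination path is a placeholder, and you will be prompted to set a password on the archive):

    ```shell
    # Archive the Open Directory master (LDAP database, Kerberos
    # realm, local KDC) into a password-protected sparse image
    sudo slapconfig -backupdb /Volumes/Backups/od-archive
    ```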

    You may also want to consider making a NetBoot image for the slave systems, and using that OD master record to give them user accounts. In doing so, you can boot them without them even having a hard drive, or without an OS install, which saves you 6GB to 12GB of space on each; great for laptops.

    OSX is bootstrapped via NetBoot over your LAN to each system, keeping them all in sync; applications can be hosted from your server (using symlinks over AFP or SMB), and user accounts can be on the local machines or on the server, either way controlled through an OD LDAP or Kerberos master.

    You could further add MacPorts to the server and run POSIX software (e.g. ported Linux software), including a PXE server, to serve Linux, Windows LiveCD, or other live-running or install environments from the LAN.

    In conclusion, yes, it makes sense to run a home network, especially if you want to streamline everything. You merely need to be willing and able to handle running a server environment, and to keep routine backups. I suggest a 2008-model XServe over a Mac Mini, due to the expansion slots (useful if you want a good NAS, as an example), and because of the three internal drive bays, which give you software RAID-1 and RAID-0 and make your server more reliable.

    The Mini has only one internal HDD bay, and has no expansion capability. That means no internal RAID, and no SAS/eSATA. (USB SATA is really ill-advised.)

    A used XServe 2008 will cost about the same as a new Mac Mini, and is both more powerful and more expandable. It will also usually include 1TB 7200RPM enterprise drives, rather than the desktop-rated 5400RPM HDD that comes with the Mini.

    Minis do make good slave systems on a network though, as you can use NAS and such through the network with them, and easily netboot a dozen Mini systems from one server.

    The only things that an XServe (2008) lacks vs. the Mini are Thunderbolt and USB3, both of which you can add via the expansion slots. SAS is still the way to go for NAS, IMHO.

    Other differences: the XServe uses DVI instead of HDMI (you honestly don't need HDMI for a server) and has two FW800 ports vs. one on the Mini.
    The server has three USB 2.0 ports--you could add USB 3.0 to it--and no SD slot, but you can easily add an SD/MMC/multicard reader.

    The only real hit is not having Thunderbolt, but TB arrays aren't really something you will ever need on a home network, and you can have TB on client systems if you need it for local use. (You likely won't be using the server for anything that is TB specific.)

    Beyond that, you won't see much of a difference between Thunderbolt, SAS, and eSATA, as SATA drives can't reach the bandwidth that TB provides. You shouldn't expect more than about 1.5Gb/s from a 7200RPM SATA drive, and less than that from a 5400RPM mechanism.

    The only other downside is that the XServe requires FB-DIMMs, but in trade-off its maximum RAM is much higher than the Mini's: the Mini caps at 16GB, and the XServe at 64GB of RAM. (Or is it 128GB?)

    The same base cost really makes the 2008 XServe attractive, which is why I use those, and G5 XServe systems. I will eventually get the 2009 model, when they drop in price.

    You can also upgrade the XServe CPU easily, unlike the Mini. Really, the XServe gives you a lot of room for future expansion, whereas the Mini is what it is, and will never be more.
  11. chrfr macrumors 604

    Jul 11, 2009
    This is exactly why it doesn't make sense for the OP to run a server at home.

    The 2008 Xserve is also no longer supported by the current operating system; it's huge, extremely loud, and far from ideal as a machine that can also serve as a regular Mac from time to time. It also has very high power requirements, so it is far more expensive to run than a Mini. The Xserve will also include drives that, unless replaced, are closing in on being 6 years old.

    SAS for a 4 client home server? Huge waste of money. Thunderbolt can't be added to any Mac.
    A Mini with an external RAID is a very competent machine, especially for a light use home server.

    16GB is far more than adequate for home. RAM for the 2008 xServe costs a fortune, where you can find it, and is slow.
  12. AmestrisXServe, Feb 9, 2014
    Last edited: Feb 10, 2014

    AmestrisXServe macrumors 6502

    Feb 6, 2014
    Please, clarify... The OP stated that he wants a proper WebDev environment, for which I would think a streamlined LAN would benefit him, both in usefulness in and of itself, and as a proving station. If he is going to run a server for web development, he may be handling all sorts of media types, and databases, which is one good reason to consider rack equipment, as it is an investment into a business that can benefit from it.

    Nonsense! : http://arstechnica.com/civis/viewtopic.php?f=19&t=1223377

    Interesting that the only thing that limits Mavericks from running out of the box on the 2008 XServe is Apple. Can you say 'forced upgrade'? Sure, you can.

    The drives in an XServe may be six years old, but I would sooner wager on them running for another six years, than on the HDD inside a mini running for two years 24/7/365. (Perhaps with an SSD option.)

    Yes, it's large, but it's also flat. You can run it in a rack, on a desk or turn it into a bleedin' glass-top endtable.

    I suppose 'noise' bothers some people. I'm one of those blokes that beefs up the fans in every Apple product, as I prefer my HW to last, and the noise of three 80mm fans doesn't bother me a titch.

    I had forgotten about that stupid TB header. The PCIe cards in development would require that. Really though, TB isn't all that fantastic, but I suppose you could buy one of those PCIe bridges and expand a Mini all over the place, cluttering a desk with peripherals.

    SAS is just handy, but eSATA is also very viable.

    I frankly wouldn't trust the *internal* HDD on the Mini for server use. External RAID: You could add one via thunderbolt, but I always worry about booting on external volumes.

    With regards to RAM, I mentioned this as a drawback. You are looking at 2x the cost of RAM for a Mini, and 1/2 the bus speed, but with a huge memory cap. I suppose if you do zero or little rendering and video work, the Mini is fine. It all depends on what the server will be doing in the future.

    Last, concerning current requirements, unless you are one of those people who worry about a £10 difference on your bill, I see little to worry about. 6.25A for the entire system. You are looking at a 665W difference, but once you start adding NAS devices and expansion devices, you are going to come down to comparing pennies.

    That Mini, as a server will never run at 10W, and I would worry about thermal problems in a year or two, running it constantly.

    Either way, I'm not a dealer, so it makes little difference to me. I just know what I would prefer and trust for my own networks.
  13. snarfquest macrumors regular

    Jun 7, 2013
    Just to add an option that's pretty simple.

    To do a software update quickly and easily on a mac without screen sharing you can ssh into the mac and run:

    sudo softwareupdate -i -a

    This will do a command line install of the updates.

    For fun I just logged into my son's Mac and ran it:

    [ xxxxxx@snarfsrv /Users/xxxxx ] $ ssh xwing
    Last login: Wed Jan 22 14:50:13 2014 from snarfsrv.snarfquest.org
    xwing:~ xxxxx$ sudo softwareupdate -i -a
    Software Update Tool
    Copyright 2002-2012 Apple Inc.

    Finding available software

    Downloaded iTunes
    Installing iTunes
    Done with iTunes
    xwing:~ xxxxx$
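    That scales to the whole house with a small loop, assuming Remote Login is enabled on each Mac; the hostnames below are placeholders:

    ```shell
    # Run all pending Apple updates on each family Mac in turn;
    # -t gives sudo a terminal for its password prompt
    for host in xwing ywing falcon; do
        echo "== $host =="
        ssh -t "$host" sudo softwareupdate -i -a
    done
    ```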
  14. AmestrisXServe macrumors 6502

    Feb 6, 2014
    Very true, and sshd is part of basic OSX, so you can set up every system with sshd (a somewhat obfuscated option in the Sharing prefs, called 'Remote Login'). Enabling this turns on sshd (i.e. the SSH daemon).

    After turning this on, you can ssh into any system with it enabled. Either for amusement or for easy system identification, you can add an MOTD on each system, so that it is easy to identify them in the event that the IPs change (DHCP hell).
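    Setting that MOTD is a one-liner per machine; the banner text here is just an example:

    ```shell
    # Write a banner that sshd shows at every login to this machine
    sudo sh -c 'echo "Mac Mini - living room" > /etc/motd'
    ```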

    If you want DNS, you can add Apple's Server Admin Tools, to configure and enable DNS, and give each machine a local name, and a reverse name entry. This can make using SSH easier, as you can ssh machinename.local, which will stay constant more so than an IP on a typical home network.

    If you have Windows systems on your network, you can install CygWin to give them SSH and a Bash shell as well. I have a CygWin repository on my servers with the dlls that are absent from the standard package to enable certain, but crucial, commands such as ls, in case anyone ever needs these, and can't find them elsewhere.
  15. chrfr macrumors 604

    Jul 11, 2009
    It's not necessary to enable DNS to ssh to "machinename.local" in a home network. The computer name is set by the name in the Sharing Preference Pane and that works just fine.
    Server Admin Tools isn't available for anything after 10.7 anyway.
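    For reference, the name used in "machinename.local" is the Bonjour local hostname, which you can read and set from a shell as well (no DNS server involved; the name below is just an example):

    ```shell
    # Show the Bonjour name that other Macs resolve as <name>.local
    scutil --get LocalHostName

    # Change it; the Sharing preference pane will reflect the new name
    sudo scutil --set LocalHostName Haravikks-Mac-Pro
    ```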
  16. AmestrisXServe, Feb 14, 2014
    Last edited: Feb 14, 2014

    AmestrisXServe macrumors 6502

    Feb 6, 2014
    Rats, you're right. I am used to using a URL resolution, forward and reverse, to access SSH and other services on my networks, and I somehow recalled that DNS handled local name entry and reverse-entry on top of the standard URL resolutions.

    Doesn't Server.app also override standard services, just as Server Admin Tools did? I've never run 10.8, so I was never forced to run Server.app... I've only heard how horribly implemented it was in 10.7 and decided to stay clear of it. (It's amongst the reasons I never ran Lion, aside from the iOS components.)

    I'm one of those users who wants full control over every possible aspect, and even in earlier OSX releases, I often drop to a shell for even the most basic operations. Lion and beyond just seemed too much like an iPhone to me, and not a desktop/poweruser OS.

    I don't know if this has improved, or worsened with Mountain Lion and Mavericks--(Did they run out of cats, and want to label us as unbranded cattle now? Does that name mean to imply 'wild animal', or 'beast of burden'? Why not Sabretooth, or something in line with the 10.x naming conventions?)--, but most of my mates in the NLE field, steered away from Lion as well.

    If these new releases offered a clear-cut choice of UIs, between the old Finder operations and the new iOSified Finder, I would happily use them, but I don't care for the new UI, which to me makes no sense for a desktop OS, unless you have a gigantor-sized touch-LCD. To be honest, some concepts could be subtle improvements to Aqua, but you are left with either using 10.6 (or earlier) for a proper GUI, or 10.7 or later for an iOS GUI.

    I would rather have the option to enable the features on the standard Aqua GUI, or something similar; I would also like (in a server OS) a default option to work with 'invisible files' in the Finder. This has been my main complaint about OSX Server from day zero: even TinkerTool doesn't do it all, as you can't do a sudo cp -prv operation without a shell.

    If Apple wants the user to migrate from a CLI to an iOSified GUI, they need to provide these tools. Frankly, the moment I read that 'Save As' was a deleted menu entry in Lion, I decided that Apple had destroyed the OS.
  17. hwojtek, Feb 15, 2014
    Last edited: Feb 15, 2014

    hwojtek macrumors 65816


    Jan 26, 2008
    Poznan, Poland
    Excellent recipe for a disaster.
    You will need to memorize all the port mapping (because each incoming port on the router can be hardmapped to only a single port on a single machine). Which means: 5 services multiplied by 4 computers = 20 ports to remember or to write down. Not very safe nor convenient.
    This is what the VPN was invented for.

    Which means you are actually exposing your network and each and every computer inside open wide for anyone to brute force his way to log in. No network administrator in the world would allow for this. Never.

    I am sorry, but this is the most complicated and nonsense way to remote admin, again: set up a VPN!!!

    ...and what if the laptop is actually moved away from the wifi range? It becomes a brick?


    I don't know if you could compare an Xserve with a Mini, but the Mini is actually faster than an XS2008.

    How exactly can I add Thunderbolt to an Xserve? What is the actual need of having USB3 on a rack mount server?

    Because? It is just a video interface; actually it is the same electrically as the Mini DisplayPort, but with a different shape of plug...

    Why would one need a card reader on a remotely administered, locked-away server?

    The test results beg to differ.

    In what circumstances did you see 16 GB maxed out on OS X Server?

    "Lot of room" being three 3.5" drive bays with outdated interfaces, instead of two 2.5" SATA3 bays in the Mini. Not a lot of upgrade space in either; if you want to run a serious RAID5 configuration you need to refer to an external NAS anyway. Only a JBOD is easier in the Xserve, but then, if you use JBOD, you wouldn't think about an Xserve in the first place.
    The graphics card upgrade argument does not make a lot of sense, since the upgrade is complicated (virtually zero cards physically fit, are EFI64 compatible AND are actually an upgrade) and does not add any value for a server that is remotely controlled, is not used for regular work, and doesn't have a screen attached.
    The single real expansion argument for the Xserve is the second half-length PCI expansion slot. It can be used for a 4-port USB3 card, but if there is a reason to use this interface, the Mac Mini has already got it anyway.

    Now there are a couple of usage issues: the Xserve needs a rack, which means it will steal at least 5 sq ft of any room it is stored in; the Mini has the footprint of a book. The Xserve uses about 7 times the energy a Mac Mini uses (while performing the same duties). The Xserve is out of warranty; the Mini will be covered by an Apple warranty (and remember, you are about to store your whole digital life with this computer). It is not comfortable to be around an Xserve. My Xserve is locked away in the basement and I still do not like it (and will decommission it soon).

    Now to relate to some other ideas from this thread: Open Directory for a home network: no. Too complicated (I know there is the temptation of "I only put this data into the OD once", but it is not as easy as it seems, starting with a proper certificate for your OD). You will find that out the first time a guest comes to your house and asks to connect to your wifi, or you want to share a 3-gigabyte file with them. Your sophisticated setup proves so modern that you need to resort to pen drives in order to copy a file to a friend's computer.
    The network booting idea for a laptop is just silly.
    The local software update service is a nice idea; however, it IS complicated to set up and plagued by client issues. Multiply the complications by the number of OS X versions you run in your home. That being said, Remote Desktop is not devised as a mechanism for pushing OS updates to clients. It is possible, however it is very close to using an angle grinder to do your manicure.

    If you want to ask questions, please fire away. I run a 15+ Mac home network with local mail servers, an extremely aggressive captive web proxy, and a multi-terabyte RAID array providing audio and video media. I also run a VPN (DD-WRT OpenVPN) that allows my laptops to act remotely in exactly the same way as if they were local, and gives my parents remote access to my media if need be. I do not have a fixed IP, but a (free) OpenDNS/DynDNS combination streamlines my network management and allows for maximum remote availability. I have, then, only 3 open incoming ports on my router, none of which allows a command-line login.

    I've been through all these ideas regarding an over-the-top home network. In the end, they all proved not flexible enough for a socially active home user (Real Life, guests coming in and such, not "social media"). The system needs to be secure, not gigantomaniac.
  18. chrfr macrumors 604

    Jul 11, 2009
    Yes, Server.app is a front end to many system services. In other cases, it adds binaries. Server has evolved significantly from its awful 10.7 history and is much more functional. There are plenty of things that can only be done through Terminal; however, there are far fewer of those than there were. Workgroup Manager still exists and is largely unchanged.

    OS X gives a user as much control as they'd like.
    Given that you haven't used the last three versions of OS X, it seems like it'd be worthwhile to at least give the newest a try to see if your (mostly invalid) fears are warranted.

    "Save As" exists again.
  19. AmestrisXServe, Feb 15, 2014
    Last edited: Feb 15, 2014

    AmestrisXServe macrumors 6502

    Feb 6, 2014
    We obviously have very different approaches to what kind of network, and what uses of a network, we want in a home. I have a full server room in mine, and every area is wired with either copper or fibre. I don't use wireless for anything that matters, as there is no encryption protocol that is entirely secure for WiFi. About the only thing that might prove a vulnerability in using WiFi here is keystroke capture, as all secure data is routed through copper or fibre lines.

    You aren't going to have much luck brute-forcing an SSH connection with administration access, unless a dedicated group wanted to do so. It doesn't really matter, as I also lease client space out of my home network in addition to everything else, so SSH access is rather mandatory. I wouldn't even consider running a system that didn't have remote shell access.

    I also run multi-TB RAID arrays, mate; several, in fact, on SAS hosts, mostly as RAID 0+1 with twelve RE3 drives per enclosure. I would never, ever, want that piled up all over the place, so I prefer racks for OCD-level organisation and neatness.

    If you route a building with fibre and copper, you will want a central location for it all, and a small room or large closet works well for that. I have a 4' x 8' closet with its own refrigeration, and two full racks in my domicile, not counting client systems.

    For a series of laptops being used on tables for graphic design, writing, and media streaming, I find that a single NetBoot image with the OS and a few base applications, with all other applications and data on the network, saves a tremendous amount of drive space, if the systems even have drives in them at all. That said, I don't have casual social visitors who need network access just to watch a film.

    They can watch it with my projector system, or on a large LCD or a Plasma display, running on a dedicated client that is already here, using LAN-based storage, or they can request an account and access the storage directly. I don't really have any social friends who aren't intelligent enough to add a user account on their system for access to this network, and I usually give them dedicated storage space, a shell account, and a web environment to use as they will.

    Anyone else that comes here is likely to be a customer, or a part of one of several fraternal or not-for-profit groups with which I associate, hence the lab setups.

    My usage needs, as you can see, are very different. With regard to server speed, I still argue that an XServe can be faster than a Mini, particularly as you can upgrade the CPUs. (I recall the CPU in the Mini being a BGA chip.) The bus speeds on the Mini and the RAM clock speed are higher, but the calculation speed is not, given the right upgrades to an XServe, which is what I said earlier.

    When you can upgrade a machine with off-the-shelf components, it is a bonus.

    Given that the XServe has very good airflow, I am not even going to consider worrying about a warranty. I don't buy AppleCare plans, and I don't buy new systems: it's just a waste of money. I would be concerned that a Mini would not last more than three to four years, simply because it has such poor airflow. If you want a 'quiet' system, you should expect it to eventually give in to entropy, sooner rather than later.

    If you don't care, as is my case, given that I can barely hear the systems in the area in which they live, I can expect them to run for up to two decades, save for the drives. The drives, being mechanical, are always dubious, which is why I will likely upgrade entirely to NAND-based media by 2020 or 2022, assuming I'm still alive.

    All that said, I couldn't care less what equipment anyone else buys, or runs. We each have our preferences, and just as mine is for hard-stroke, mechanical keyboards, high airflow, lab setups, very dim/dark rooms, and art deco architecture; yours is for silence, touchscreens, abstraction, and probably brightly lit areas and feng shui minimalism.

    For the record, I have multiple XServe systems, plus Linux systems, multiple 12-drive disc arrays, and other equipment running 24/7 at home, as well as elsewhere, and I can barely notice the sound over my roof-mounted compressors and fans (for home refrigeration). I keep it a steady 12C/54F in here, all-year-long, which is bloody hard, and required me to build a custom system with mostly manual controls, and none of that 'energy saving' nonsense. Considering the amperage requirements of that system, I'm certainly not going to worry about how much a pair of server racks is using.

    Very likely, none of this in any way makes sense for most other people, but when you use equipment for professional needs, and not merely for 'family needs', the entire scope, and perspective changes around that factor. I don't have a bleedin' wife, or kids, to complain about either the sound of a keyboard, the noise of a fan, how dark it is inside, the smell of my smoke, the noise from my machine and carpentry shoppe area, or the cracking of my billiards games. Anyone that comes over is generally of a similar mindset, and/or is here for a professional reason.

    If you are wondering, no, I am not in a rural area, light years away from everyone, with a ten-foot wall with cannons on top. I'm merely a stubborn old man, very set in his ways, and debating the value of any of this kind of setup really isn't a useful discussion.

    It's always up to the client/customer/owner as to how they want to run their hardware, and I merely offered suggestions that, if they wish, they can take or leave. Each person has a different set of needs and ideas, and the entire world doesn't have to follow the approach of one man, so I suggest alternatives, and potentially differing methodology, that may be of interest. There really is no wrong approach to this sort of thing, as long as you know what to expect as the end result. At the end of the day, I still say I wouldn't trust a Mini as a server on my networks, for the reasons I have previously recounted and documented.

    Did they add scrollbars back in again as well?

    It's actually rare for Apple to revert like that, as it has been their policy since 1984 to force their new ideas on users. That is why the original Mac had no numeric keypad or arrow keys: It was to force users to use the mouse.

    To be honest, upgrading the OS on a server is a sure way to break things. I have a lot of custom stuff, some of which relies on 10.5, many applications that I run would cease to function, and it would be a real mess.

    Does Mavericks support Rosetta? If not, I can guarantee that I can't run several critical pieces of software.

    To be honest, I have a 'If it isn't broken, don't fix it.' approach to this sort of thing. I also say 'No', when I see a trend in de-contenting, or function disabling. I can't tell you how many people with Windows 7 have asked me why there is no Telnet command. Why turn something like that off in an OS?

    When Apple started marching down that road, I stopped bothering with their new OS releases.

    If I install a new machine--a Nehalem, for example--I could try 10.9 without worrying about breaking my existing installations, but as everything is working as I need it to work, and all of my software is designed to run on 10.4, 10.5 and 10.6--some requiring 10.5 or 10.4, and some requiring PPC--I see no valid reason to upgrade.

    It isn't as if I'm missing out on anything: I don't need iCloud, iPod, iPhone, iPad, iTablet, iCostmoney, gadget-y stuff. In addition, Mountain Lion and Mavericks have heavy system requirements in contrast to the light footprint of Tiger and Leopard, which means new hardware to do the same thing that my existing hardware is doing. I'm not doing NLE.

    (I'd rather put money into more SAS and SATA arrays, as I can always use more drive space, vs. more server nodes.)

    I will say that the approach of tying the iEtcetera programmes together, and giving them 'real world' appearances, reminded me a great deal of the Lisa. I fondly remember 'tearing off pages' to make new spreadsheet and word processing documents with the slanted Ws. The iWare reminds me a great deal of the Lisa Office System, which I personally thought was fantastic, and it's sad that it never went anywhere.

    That's the problem with zero third-party software developers.

    I wouldn't mind trying a Mavericks system, but I have no need to run one.

    My only other qualm would be that I would have to relearn how everything works, with SAT replaced by Server.app, and there is no commercial need for me to do that. (i.e., I only have so much time to spend on any given task, so if I need to learn something new, I want it to be practical for either my work, or my leisure.)

    Either way, I also would like to avoid going further off-topic. You are welcome to start a thread on why Mavericks server kicks the trousers out of Leopard or Snow Leopard, if you wish. This one is about whether using Server is practical for someone new to OSX, to which, I still say 'No, unless you can answer yes to at least two, or three, of those critical questions that I had asked earlier in the thread. Learn OSX first, and then migrate to Server if you find the need to do so.'
  20. ApfelKuchen macrumors 68030

    Aug 28, 2012
    Between the coasts
    We have two mutually-exclusive situations: One tech-savvy family member who wants to fool around with serious networking, and a non-tech-savvy family who will have to live in that networked environment.

    I have yet to see (though I may have missed) a benefit you want to gain for your family members. I see no mention of shared data, for example. Is there something they want to do that can't currently be done? If there is anything at all, could a NAS or Time Capsule or basic OS X file sharing accomplish it more simply? Meantime, how much will their interactions with their computers change in the environment you contemplate? Will there be new things you want to teach them so they can appreciate your creation? Are there things you'll need to teach them in order to do tasks that are routine today?

    Bottom line for me? You're probably going to spend more time training them and more time helping them out, rather than less. You'll go from helpful family member to essential network admin. It's not a fun job - you'll get called for anything that goes wrong, because from their perspective, the only thing that "changed" is that new network you set up. If they do any self-solving now, that'll end immediately - they'll be afraid to do something that might break your network.

    They'll become your guinea pigs, and when you keep guinea pigs you become responsible for their care and feeding.
  21. ElectronGuru macrumors 65816

    Sep 5, 2013
    Oregon, USA
    I don't see the big deal. The OP clearly said 'for fooling around with'; the server is cheap, and it includes the ability to turn services on and off at will and as needed. I did mine for the singular reason that I'm years out of an IT career and wanted to get under the hood of my network. For the feeling of doing it.

    If it's just for learning or to kill time, it's a worthy project. If it works out, he can leave it on. If not, he can turn it off.
  22. AmestrisXServe macrumors 6502

    Feb 6, 2014
    Learning additional skills is always a worthy ambition.

    What we don't have, with respect to this thread, is an actual list of ultimate goals, types of services that interest the OP, uses of those services, space availability, requirements (hardware, and software), type of network(s) (copper, fibre, or wireless), and if the server area will be at all offset and contained, or out in the open.

    Those are all important concerns and factors for any kind of valid recommendation. Otherwise, we're mostly shooting in the dark.
  23. haravikk thread starter macrumors 65816

    May 1, 2005
    Well, simplified software updates is probably the big one as far as they're concerned, though it's as much a benefit to me, as any time I hop on one of their computers I almost always find a ton of updates that haven't been installed. I realise now, though, that this is more Remote Desktop's job, or the command-line tool's (thanks snarfquest for pointing that out), though OS X Server's ability to hold these updates locally is a handy addition.
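    For reference, the command-line tool in question is `softwareupdate`, which ships with OS X. A minimal sketch of checking for and installing pending Apple updates, which could be run over SSH or pushed out via Remote Desktop (the guard makes it a no-op on systems without the tool, and installing requires admin rights):

```shell
#!/bin/sh
# Sketch: install all pending Apple software updates on a Mac.
# softwareupdate ships with OS X; guard so this is a no-op elsewhere.
if command -v softwareupdate >/dev/null 2>&1; then
    softwareupdate --list           # show what's pending
    softwareupdate --install --all  # install everything pending (needs admin rights)
else
    echo "softwareupdate not found; nothing to do"
fi
```

    Note this only covers Apple's own updates; third-party software like Flash or Java still needs its own updater or a separate push mechanism.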

    Otherwise, while family members mostly use the same computers, the ability to quickly hop onto each other's machines is something we currently have, since only one machine has a printer and scanner for example. But currently this is done with user accounts on each machine, having this handled centrally just seems like a cleaner way to do it.

    I'm also thinking about moving Time Machine backups to a single machine, as currently each user has their own external backup disk, each of varying size. But for my main computer I'm building a storage array that will have a much higher capacity than all our current external disks combined, and I have a NAS which will keep a copy of it. So the ability of OS X Server to provide a shared Time Machine volume may help de-clutter. There are also services like the CalDAV server instead of Google Calendar, and a few other bits and pieces.
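    Pointing a client Mac at a shared Time Machine volume can also be done from the command line with `tmutil`, which ships with OS X 10.7 and later. A hedged sketch (the server name, account, and share path are placeholders):

```shell
#!/bin/sh
# Sketch: aim a client Mac's Time Machine at a server-hosted share.
# tmutil ships with OS X 10.7+; guard so this is a no-op elsewhere.
# "backupuser", "server.local", and "Backups" are placeholder names.
if command -v tmutil >/dev/null 2>&1; then
    sudo tmutil setdestination "afp://backupuser@server.local/Backups"
    tmutil destinationinfo  # confirm the destination took effect
else
    echo "tmutil not found; this only applies on OS X"
fi
```
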

    But yeah, it's mostly little things I suppose, plus my desire to learn it.

    Well, I already get asked about a million silly things; one of the reasons updates don't get installed is that my family members insist on asking me before each update, rather than installing the ones I've told them are fine to just go ahead with. If I can just push those out, it will make my life a bit easier. I also really need to get everyone on the same version of OS X, though that's not really an OS X Server-specific issue, I suppose; I could presumably still push out new OS versions via Remote Desktop?

    Granted, some of the other stuff may introduce more potential problems, but I really just want to get a feel for how it works and what can be done with it. Thanks to all the great replies so far, I'm thinking I will still get OS X Server, but well in advance of any big change in my setup, so I can try the various services. I think I'll still give network accounts a try, but with a test setup of some kind, maybe using a throw-away virtual machine? Plenty to think about anyway!
  24. unplugme71 macrumors 68030

    May 20, 2011
    While I do agree it is a lot to manage, there are benefits to having a mac mini act as a home server with many machines and users in that home.

    1) OD accounts make things more secure. You change your password once instead of on every machine. You can set it to change every 90 or 180 days, whichever you feel more secure with. You don't have to remember whether you changed it on another machine, or what password you used on it.

    The only downside is managing the OD system, having a backup, etc. I run a second Mac Mini to replicate my OD.

    2) Having DNS managed on a server is nice as well. I personally run two Mac Mini Servers. DNS is replicated between the Master/Slave.

    3) Wi-Fi RADIUS - Adds a bit more security over standard WPA2 Personal. If a guest comes over, you can simply create an OD user account with very limited privs. Or, if they aren't frequent visitors or don't want to go through the headache, set up another wireless router for your guests, separate from your home network. Problem solved there. I used to have my friends dump data on my server in their remote user folders, but with Dropbox and other services allowing the same thing these days, I removed them from this.

    4) Software Updates - With multiple Mac devices, a lot of them streaming Netflix, bandwidth gets eaten up, and my ISP imposes a crappy monthly cap. Having to download software updates only once is saving me a huge headache in itself!

    5) Time Machine - While I don't personally recommend it, I still use it for the client machines. However, I also run Data Backup software once per week that creates a full image, then daily increments until the next full backup. That way a user has a quick way of recovering files, and I have a more solid restore option as well if TM fails to work.

    There are a lot more benefits as well. It all just depends on how much you want to manage.
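    The password rotation in point 1 could, on older OS X Server releases, be set per user with the `pwpolicy` tool. A hedged sketch (the directory admin "diradmin" and user "bob" are placeholder names, and the exact policy keys vary by OS release):

```shell
#!/bin/sh
# Sketch: require a password change every 90 days for one Open Directory user.
# pwpolicy ships with OS X; "diradmin" and "bob" are placeholder accounts,
# and this policy key applies to older Open Directory releases.
MINUTES=$((90 * 24 * 60))  # 90 days expressed in minutes = 129600
if command -v pwpolicy >/dev/null 2>&1; then
    pwpolicy -a diradmin -u bob -setpolicy "maxMinutesUntilChangePassword=$MINUTES"
else
    echo "pwpolicy not found; this only applies on OS X"
fi
```
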
  25. chrfr macrumors 604

    Jul 11, 2009
    Your management suggestions will take much more time than setting up 4-5 users on a similar number of computers.
    All that work, and the few family members involved don't gain anything in ease of operation.

    Edit: the one feature in Mavericks server that is great for home users is the Caching Server. Software Update server itself is a pain to manage and for most home environments the caching server will be a better option as it requires no reconfiguration on the users' computers at all.
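    For the curious, the Caching Server can be inspected from the command line as well. A hedged sketch using the `serveradmin` tool that ships with Server.app (it may not be on the default PATH, in which case it lives inside the Server.app bundle):

```shell
#!/bin/sh
# Sketch: check the Caching Server's state from the command line.
# serveradmin ships with Server.app; guard so this is a no-op elsewhere.
if command -v serveradmin >/dev/null 2>&1; then
    sudo serveradmin status caching      # is the service running?
    sudo serveradmin fullstatus caching  # more detail on the cache state
else
    echo "serveradmin not found; requires Server.app"
fi
```
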

Share This Page