
Invizzible
Does anyone know why Apple doesn't include a defrag utility in its OS? Maybe it's just because I started out on Windows, but it seems like that's a tool that an OS should have. It also seems like a simple enough thing to include in the interest of being competitive with Windows. Has anyone heard if such a thing will be in Tiger?
 
I'm still a little unclear on the issue, but I believe OS X automatically defragments files in real time.
 
Yep, Panther auto-defrags any file under 20 MB in real time, I believe. I don't think defragmentation is much of an issue on the Mac, and I'm sure there's a free utility somewhere if you really do need one.
 
OS X automatically defragments files under 20 MB, but it still doesn't do anything for bigger files.


EDIT: yay for leaving the page open for 10 minutes and not bothering to refresh it!
 
Are you sure? I have Disk Warrior, and when I run it, it shows the 'before' graph as being quite fragmented. When I'm done running it, the graph it shows looks perfect. If you're correct, then the only thing I can think of is that my HDD is getting fragmented solely by the audio and video files I have that are over 20 MB. Anyway, with a 20 MB limit on what it will defrag (and since my disk apparently gets very fragmented), I still think a utility should be included to do a thorough defrag.
 
I've never heard of an auto-defrag. :confused:

However, it has been explained to me that the way OS X and Unix (and probably Linux) fetch and store info doesn't require a hard drive defrag the way Windows does.
 
That may be at least partially true. I've noticed that in the old OS 9 version of Disk Warrior, it actually moved files around on the HDD when defragging. With the new OS X version, it doesn't seem to do that. It appears to just rewrite the directory.
 
Abstract said:
I've never heard of an auto-defrag. :confused:

However, it has been explained to me that the way OS X and Unix (and probably Linux) fetch and store info doesn't require a hard drive defrag the way Windows does.
Mac OS X 10.3 "Panther" has a feature called "hot files". Here's how it works: Files that are frequently accessed get put into a special area of the hard disk reserved for hot files. Files in this area are automatically defragmented. If other files become used often enough to enter the hot file area, some of the less frequently used hot files get evicted to make room for the new ones. There's a 20 MB limit on what can be considered a hot file. That's why large files don't benefit from hot files.
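Here's a rough Python sketch of that bookkeeping, just to make the eviction idea concrete. The names, sizes, and structure are illustrative guesses, not Apple's actual implementation:

```python
# Toy model of the "hot files" area described above -- illustrative only,
# not Apple's code. Frequently read files live in a bounded zone; when a
# hotter file needs room, the least-read resident gets evicted.

HOT_FILE_MAX_BYTES = 20 * 1024 * 1024  # the 20 MB cap mentioned above

class HotFileArea:
    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.resident = {}  # file path -> (read_count, size_bytes)

    def consider(self, path, read_count, size_bytes):
        if size_bytes > HOT_FILE_MAX_BYTES:
            return  # large files never qualify as hot files
        # Evict the least frequently read residents until the file fits.
        while self.used + size_bytes > self.capacity and self.resident:
            coldest = min(self.resident, key=lambda p: self.resident[p][0])
            if self.resident[coldest][0] >= read_count:
                return  # everything resident is hotter; don't admit
            self.used -= self.resident.pop(coldest)[1]
        if self.used + size_bytes <= self.capacity:
            # Moving a file into this area also defragments it.
            self.resident[path] = (read_count, size_bytes)
            self.used += size_bytes
```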
 
I'm not sure if the same would be true on the Mac, but I've left my PC hard drive without defragmenting it for months; when I do defragment it, I haven't noticed the slightest difference in operation.
 
MacSA said:
I'm not sure if the same would be true on the Mac, but I've left my PC hard drive without defragmenting it for months; when I do defragment it, I haven't noticed the slightest difference in operation.


It all depends on how fragmented your drive becomes. At less than 30%, most users won't notice a difference, or will only think they see one (yes, there is one, and yes, it's noticeably faster if you are using the drive a lot). But if you let a drive cross 50% fragmentation and defrag from there, there is a very noticeable improvement. Bump that up to 70% and the computer drops to a crawl.

Mind you, it gets to 30% pretty fast, but then it slows down after that. Also, the fastest way to fragment a drive is to deal with large files, moving them, deleting them, and copying them quite a bit.
 
Timelessblur said:
It all depends on how fragmented your drive becomes. At less than 30%, most users won't notice a difference, or will only think they see one (yes, there is one, and yes, it's noticeably faster if you are using the drive a lot). But if you let a drive cross 50% fragmentation and defrag from there, there is a very noticeable improvement. Bump that up to 70% and the computer drops to a crawl.

Mind you, it gets to 30% pretty fast, but then it slows down after that. Also, the fastest way to fragment a drive is to deal with large files, moving them, deleting them, and copying them quite a bit.
Hey Timelessblur-

My Windows PC gets that kind of heavy hard drive activity all the time, since I do file conversion work on it. The file fragmentation once hit 80% due to all the HD activity, and the graph in Disk Defragmenter had more red than anything else.
 
I thought newer HDs with multiple read/write heads didn't benefit as much from simple linear defragging anyway?

I've also heard that OS X does auto-defrag.

If you wanna talk about an OS missing some things, I'd go with Windows missing antivirus software... ;)
 
wrldwzrd89 said:
Mac OS X 10.3 "Panther" has a feature called "hot files". Here's how it works: Files that are frequently accessed get put into a special area of the hard disk reserved for hot files.

If I remember correctly, these files are put onto the inside of the disk platter because it's 'spinning faster' there (angular velocity, right?) and so has better read times. This means that commonly used files are accessed really quickly.

And although HFS+ (the Mac file system) is pretty clean on its own, I also vaguely remember that there is a built-in schedule for cleaning up the hard drive (Friday nights?). I think Apple figures that if your machine isn't on during one of these scheduled times, you're probably not using it enough to warrant the cleanup. You'll certainly notice a bit of heavy disk activity at odd times. As all this is dredged from the back of my head, someone else might be able to clear things up more for you.

<edit: errr, that would be the outside of the platter that has a higher angular velocity?)>
 
More and better info

Here's a description of what is actually going on with the file system. This summary comes from X vs XP:

Hot-File-Adaptive-Clustering places frequently used files on the faster portion of a hard disk.

Automatic file defragmentation defragments files under 20 MB as they are opened.

Delayed allocation allows a number of small allocations to be combined into a single large allocation in one area of the disk (there's a toy sketch of this after the list).

Aggressive read-ahead and write-behind caching reduces the impact associated with minor fragmentation.

HFS+ avoids reusing space from deleted files as much as possible, to avoid prematurely filling small areas of recently freed space, thereby reducing the risk of fragmentation.
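
To make the delayed-allocation item concrete, here's a toy Python sketch of the idea. HFS+ does this inside the kernel, so everything here (the names, the flat byte-offset disk model) is illustrative:

```python
# Toy sketch of delayed allocation: buffer several small writes, then
# hand the allocator one combined request so the data lands contiguously.
# Illustrative only -- the real logic lives inside the HFS+ kernel code.

class DelayedAllocator:
    def __init__(self):
        self.pending = []   # byte counts of writes not yet placed on disk
        self.next_free = 0  # next free byte offset in a toy flat disk

    def write(self, nbytes):
        self.pending.append(nbytes)  # deferred: no disk space claimed yet

    def flush(self):
        # One allocation for the combined size yields a single extent,
        # instead of scattering many small blocks around the disk.
        total = sum(self.pending)
        start = self.next_free
        self.next_free += total
        self.pending.clear()
        return start, total

alloc = DelayedAllocator()
for _ in range(3):
    alloc.write(4096)    # three deferred 4 KB writes...
print(alloc.flush())     # ...become one contiguous 12 KB extent
```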


Here's some further clarifications from MacInTouch:

From what I have been reading, the Darwin Gurus say that there are actually two separate file optimizations going on in Panther.

The first one is automatic file defragmentation. When a file is opened, if it is highly fragmented (i.e., 8+ fragments) and the file is under 20 MB in size, it will be automatically defragmented. This is accomplished by the file system just moving the file to a new location. This process only happens on Journaled HFS+ volumes.

The second optimization is called "Adaptive Hot File Clustering". In general, it works like this: over a period of 60 hours, the file system keeps track of files that are read frequently (for a file to be considered a hot file, it must be less than 10 MB and never written to). At the end of this period, the "hottest" files (i.e., the files that have been read the most times) are moved to the "hotband" of the disk (the part of the disk that is particularly fast given its physical characteristics).

The size of the "hotband" will depend on the size of the disk (i.e., 5 MB of hotband space for each GB of disk). "Cold" files that were in the hotband will be moved out of the hotband to make room for the hot files. As a side effect of being moved into the hotband, the hot files are defragmented.

Currently, Adaptive Hot File Clustering only works on the boot volume, and only for Journaled HFS+ volumes that are larger than 10 GB.
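
The numbers in that description translate directly into a couple of simple rules. Here's a small Python sketch using only the constants quoted above; the function and parameter names are mine, not Apple's:

```python
# Rules from the Adaptive Hot File Clustering description above.
# Constants come from the quoted text; the structure is illustrative.

TRACKING_WINDOW_HOURS = 60               # how long reads are tracked
HOT_FILE_MAX_BYTES = 10 * 1024 * 1024    # candidates must be under 10 MB
HOTBAND_BYTES_PER_GB = 5 * 1024 * 1024   # 5 MB of hotband per GB of disk
MIN_VOLUME_GB = 10                       # volumes must be larger than 10 GB

def hotband_size_bytes(volume_gb):
    """The hotband scales with disk size: 5 MB per GB."""
    return volume_gb * HOTBAND_BYTES_PER_GB

def is_hot_file_candidate(size_bytes, ever_written, is_boot_volume,
                          is_journaled, volume_gb):
    return (size_bytes < HOT_FILE_MAX_BYTES
            and not ever_written          # must only ever be read
            and is_boot_volume            # boot volume only
            and is_journaled              # Journaled HFS+ required
            and volume_gb > MIN_VOLUME_GB)

# Example: an 80 GB boot disk gets an 80 * 5 MB = 400 MB hotband.
print(hotband_size_bytes(80) // (1024 * 1024), "MB")
```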


And some more:

First, journaling needs to be enabled for this to work (unlike third-party defragmenters, which require that journaling be turned off during the process).

Next, it only automatically defrags files less than 20 megabytes which also have at least eight "extents" (a directory tracking mechanism) in the directory (indicating, generally speaking, that the file is pretty fragmented and will require an "extents overflow", which causes slower loading times).

Consider it a minor tune-up, not a day at the garage. If you're constantly creating large files (such as video and audio), a true optimizer will free up more contiguous space than the automatic defragmenting in Panther... which will leave other files (those that don't meet the criteria above) fragmented.
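
Putting the two descriptions of the on-open check together (under 20 MB, at least eight extents, journaling on), the rule amounts to a simple predicate. A hedged Python sketch; the actual check lives in Apple's HFS+ code and these names are made up:

```python
# The on-open auto-defrag rule, as described in the quotes above.
# Illustrative names; the real check is inside Apple's HFS+ driver.

AUTO_DEFRAG_MAX_BYTES = 20 * 1024 * 1024  # only files under 20 MB
AUTO_DEFRAG_MIN_EXTENTS = 8               # "highly fragmented" threshold

def should_auto_defragment(size_bytes, extent_count, is_journaled):
    # Runs when a file is opened; the "fix" is simply rewriting the
    # file to a new, contiguous spot on the disk.
    return (is_journaled                  # journaling must be on
            and size_bytes < AUTO_DEFRAG_MAX_BYTES
            and extent_count >= AUTO_DEFRAG_MIN_EXTENTS)
```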


My own take on this is that defragmentation of files larger than 20 MB is unlikely to result in a faster read time. However, there are a number of downloadable tools that can do a full defragmentation. The issue I mentioned above with the scheduled 'clean up' is OS X deleting temporary system files and the like, rather than doing a 'defrag'.
 
aswitcher said:
Can anyone comment on speed improvements using tools to defrag OSX?

Aswitcher, I was going to say 'probably negligible, as not very many files are 20+ MB in size', but I see you have a DV cam ;)

Check out this site for a tool that does all sorts of debugging on HFS+ volumes, including fragmentation stats (it's command line only, I think). The author has done some tests with it, and states at the bottom:

Defragmentation on HFS+ volumes should not be necessary at all, or worthwhile, in most cases, because the system seems to do a very good job of avoiding/countering fragmentation.

This is one defragger I came across - it's pretty pricey though.
 
mim said:
Aswitcher, I was going to say 'probably negligible, as not very many files are 20+ MB in size', but I see you have a DV cam ;)

Check out this site for a tool that does all sorts of debugging on HFS+ volumes, including fragmentation stats (it's command line only, I think). The author has done some tests with it, and states at the bottom:

Defragmentation on HFS+ volumes should not be necessary at all, or worthwhile, in most cases, because the system seems to do a very good job of avoiding/countering fragmentation.

This is one defragger I came across - it's pretty pricey though.

Thanks. Nice tool but scary price.

I'll assume that OSX knows what it is doing...
 
aswitcher said:
Thanks. Nice tool but scary price.

I'll assume that OSX knows what it is doing...

Hmmm...I found a slashdot post here that seems to suggest it might be worthwhile getting a defragger if you work with video.

There must be a free (or at least less expensive) tool around. I'll have a bit of a look.

<edit - typos>
 
<edit: errr, that would be the outside of the platter that has a higher angular velocity?)>
Yes, the outside should be the fastest--the most surface area passes under the head at the same rate of rotation. (= same angular velocity, higher linear velocity.)
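
A quick back-of-the-envelope check of that in Python; the RPM and radii are ballpark figures for a 3.5-inch drive, not anything from this thread:

```python
# Same angular velocity, different linear velocity: v = omega * r.
import math

rpm = 7200
omega = rpm * 2 * math.pi / 60            # angular velocity in rad/s
for label, radius_m in [("inner track", 0.020), ("outer track", 0.045)]:
    v = omega * radius_m                  # linear speed past the head
    print(f"{label}: {v:.1f} m/s")
# inner track: ~15.1 m/s, outer track: ~33.9 m/s -- the outer edge moves
# more than twice as fast, so more data passes the head per second.
```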


Here's one pretty bad case that might give you an example of what to expect: I have a 120 GB data-only drive that I use for a bunch of little files, plus several large DV projects as well as a collection of files in the 500 MB - 1 GB range. It has been run under 10.3 for almost a year with no care other than fsck on occasion, and for most of that time it's been well over 90% full, on several occasions with essentially no space left.

This is close to a worst-case scenario--little free space, large files, long time. As it turns out, less than 0.7% of the 46,000 files on it are actually fragmented. Of those, about four files are badly fragmented, and another 10 or so aren't looking so hot. All of these are over 600 MB in size, which isn't at all surprising.

The only time I've seen any actual indication of slowdown is when the disk is above 98% full and I copy a relatively large file to it, at which point there's quite a bit of disk activity and performance is noticeably slower.

The point here is that OS X does a reasonably good job of keeping fragmentation to a minimum on its own. It's not perfect, and if you have a drive without a lot of free space that you're moving big chunks of data on and off of (video projects, mostly), then it will get fragmented.

For the majority of people, it just doesn't seem to be worth worrying about, though some people who do a lot of video will probably benefit from either a 3rd party defragger or periodically erasing the drive.
 
With OS X, there is no need to worry yourself about defragging your HDD. It does a pretty decent job on its own; however, if you want to set your mind at ease, you can use Micromat's TechTool, which should solve most if not all of your problems and concerns.

It is well known that many utility applications included with an OS do an all-right job, not the best. So if OS X does a decent job, why worry, unless you are working with video or using your Mac as a TV or TiVo-style PVR?

Sit back, relax, and enjoy: you are not working on a Wintel box. :D
 
DiskWarrior (in OS X), I believe, only rebuilds the directory structure of a disk and does not defragment its files. Drive10 (also by Micromat, but still my preference over TechTool Pro, which I have had issues with) will do that and also defrag an HD.

The two apps have slightly differing ideas of what constitutes a perfect directory structure, but in 2 years plus of using them I have never had any problems. Most of the time I use DW for the directory and then D10 for defragging, always rebuilding the directory before defragging.

Does it make a difference? Well, when my HD is close to full and D10 shows it as heavily fragmented then yes, it does, but I tend to move files around a lot and have a large turnover of data on my measly 20GB HD.

For casual use I don't think I would bother defragging, as seems to be the general consensus.
 