Some clarification needed here: is my understanding of the two units MB/s and Mb/s (megabytes versus megabits per second) correct?

A quote from an Everythingusb post about these figures:

"Any normal migration from USB 2.0 to USB 2.0 averaged around 20 MB/s on any drive I've ever used. Upgrading from USB 2.0 to USB 3.0 got me an average of 40 MB/s. Transferring from USB 3.0 to USB 3.0, I got an average of 58 MB/s. Read the speeds below. A Low Speed (USB 1.1, USB 2.0) rate of 1.5 Mbit/s (187 kB/s) that is mostly used for Human Interface Devices (HID) such as keyboards, mice, and joysticks. A Full Speed (USB 1.1, USB 2.0) rate of 12 Mbit/s (1.5 MB/s). Most USB hubs support Full Speed. A Hi-Speed (USB 2.0) rate of 480 Mbit/s (60 MB/s). A Super-Speed (USB 3.0) rate of 4.8 Gbit/s (600 MB/s)."

So... with a new and empty LaCie 7200 rpm USB 2.0 external drive, if I see a Speed Disk measurement of, say, 40 MB/s, then I multiply by 8 to get Mb/s, giving 320 Mb/s, which is not far from the USB 2.0 maximum spec of 480 Mb/s? Is this correct?

Many thanks for clarification, or for letting me know if I have this totally wrong!

Regards, Martin.
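In case it helps to sanity-check the arithmetic above, here is a throwaway sketch of the byte-to-bit conversion (the function name is my own, not from any USB specification):

```python
# 1 byte = 8 bits, so a rate in MB/s is multiplied by 8 to get Mbit/s.

def mbytes_to_mbits(mb_per_s):
    """Convert a throughput measured in MB/s to Mbit/s."""
    return mb_per_s * 8

measured = 40  # MB/s, the example benchmark figure from the question
converted = mbytes_to_mbits(measured)
print(converted)  # 320 Mbit/s

# Fraction of USB 2.0's 480 Mbit/s nominal signalling rate
print(converted / 480)
```

Note that 480 Mbit/s is the raw signalling rate of Hi-Speed USB; protocol overhead means real-world transfers always come in well below the theoretical 60 MB/s.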