The Dynamic Video Location/Direction Tagging bible

Discussion in 'iOS Apps' started by Menneisyys2, Dec 31, 2012.

  1. Menneisyys2 macrumors 603

    Joined:
    Jun 7, 2011
    #1
    Ever wanted to know not only the precise GPS coordinates of something you've shot a video of during your travels, but also the direction (based on the internal compass) the camera was pointing while you filmed it? Ever wanted to know how this can be done on an iPhone (or any 3G iPad) – or, if you have an external camera without GPS / compass, with the help of your iDevice? Read on – this article is for you!

    In the first chapter of the article, I explain why you can't do this with Apple's stock Camera app – or, as far as compass recording is concerned, with any decent(!) standalone camera out there.

    In the second one, I show you two third-party apps for iOS that (are supposed to) do the trick. Should you only need an app for shooting with dynamic location / direction information, you can safely skip all the other chapters.

    In the third chapter (“Shooting with an external camera”), I explain how iPhones (and 3G iPads) can be used as a GPS & compass tracker together with an external camera. After listing and evaluating several GPS (but not compass!) trackers readily available in the AppStore, I present my own app – along with its source code(!) – which does this, including even compass tracking. If you don't want to shoot with your iPhone at all for the reasons outlined in this chapter (narrow field of view or mono audio, for example) but would still use it as an excellent tracker, this chapter is for you.

    Note that THIS article also explains the advantages of dynamic location / direction storing - you might want to read it before going on, should you not see the point in all this, based on my introduction above.

    1. The problem

    1.1 The problems with dedicated cameras


    I've frequently posted information on cameras capable of storing GPS coordinates along with still shots (and videos). A 14-month-old list of mine is in my well-known still photo geotagging tutorial.

    Unfortunately, the situation hasn't changed much in the last 14 months. Still very few cameras have built-in GPS, let alone a compass. Video support is even worse: the only cameras with continuous(!) GPS data recording are pretty restricted when it comes to overall image quality, with particular emphasis on low-light performance. Unfortunately, the compact cameras with GPS that have the best image quality, like the Canon S100, aren't able to record continuous GPS data at all (as they use not AVCHD but restricted (see Section 1.2.1 below) simple MOV files). The GPS-equipped AVCHD models capable of dynamic location logging, in the Panasonic TZ / ZS series (for example, the ZS20), can't boast very good image quality. Finally, Sony's HX-series P&S cameras (the only ones that have compasses in addition to GPS units, both dynamically logged along with videos), being compact superzooms, aren't really meant for enthusiasts, not even in their most recent incarnations: their lenses are pretty slow and their sensors tiny (as with all compact superzooms, as opposed to true high-quality small enthusiast cameras like the Sony RX100, the Fujifilm XF1, the Panasonic LX series etc. – see THIS for more examples of such cameras).

    Needless to say, the king of sports cameras, the GoPro series, doesn't even contain a GPS unit (let alone a compass), not even in its third incarnation. The only dedicated sports camera model with GPS, the Contour+2, doesn't have a compass either – and is, generally, much less recommendable than the GoPro series, particularly if you need the easy and safe chest / head mountability that is especially important to, say, skiers like me.

    1.2 The problems with Apple's approach: a single GPS location + no compass info

    iPhones starting with the 3GS, iPod touches starting with the 4th generation and all iPads have compasses; in addition, all video recording-capable iPhones and all 3G-enabled iPads have GPS. Unfortunately, these are not utilized when shooting videos. Unless you use third-party apps (like that of Ubipix, see below), this means you'll never know where the iPhone was when a particular frame of a video was shot, or in which direction it was pointed.

    iOS's inability to save GPS / compass information has several reasons. In the following subsections, I explain them all. Again, this info isn't truly needed by people who “only” want to know which AppStore app to select to dynamically geotag their videos. They can safely skip the remainder of this chapter and move straight to chapter 2.

    1.2.1 Video container restrictions

    I've very often explained why the MOV / MP4 / M4V containers – the only ones Apple supports – are inferior to more advanced ones like MKV or AVCHD. Back then, I didn't mention that AVCHD also supports storing variable, continuously changing location (and compass) info with every single(!) frame.

    MOVs recorded by iDevices, unfortunately, only support one global location tag. It's this value that QuickTime shows in the following dialog (via Cmd + I):

    [screenshot]

    (As with all the other images in the article, click for a much better-quality and larger one.)

    It's worth pointing out that the otherwise great and highly recommended MediaInfo doesn't display this info:

    [screenshot]

    Incidentally, there is something interesting here: the tagging time (11:11:53) is 12 seconds later than the recording/encoding date (11:11:41). This is because location data is not guaranteed to be present when recording starts – it is stored in the target MOV file only when it becomes available. This also means the coordinates you find in a video shot on an iDevice aren't necessarily those of the starting point, but may be from several (in this particular case, 12) seconds into the video.
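    Computing this offset for your own videos is a trivial subtraction of the two timestamps QuickTime displays. A quick sketch – in Python rather than Objective-C, purely for illustration; the two time strings are the ones from the dialog above:

```python
from datetime import datetime

def tag_offset_seconds(recording_start: str, tag_time: str) -> int:
    """How many seconds after the recording/encoding date the location tag was written."""
    fmt = "%H:%M:%S"
    start = datetime.strptime(recording_start, fmt)
    tagged = datetime.strptime(tag_time, fmt)
    return int((tagged - start).total_seconds())

# The times from the QuickTime dialog above:
print(tag_offset_seconds("11:11:41", "11:11:53"))  # 12
```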

    Physically, the GPS coordinates are at /moov[0]/meta[0]/ilst[0]/[2], as can be seen in the following screenshot taken with IsoViewer:

    [screenshot]

    As this data is not associated with any particular frame, there can be only one instance of it for the entire video stream.

    Programmatically setting the metadata property of AVCaptureMovieFileOutput (see the section “Adding Metadata to a File” in the official “AV Foundation Programming Guide”) doesn't help either, as it sets exactly this list element and not one belonging to the currently recorded frame. That is, creating your own video recorder and continuously setting the metadata with the then-current GPS coordinates won't work. The MOV container (and Apple's limited API for manipulating it) simply doesn't support this.

    1.2.2 What about the video framegrabs of the iPhone 5?

    Unfortunately, not even the still shots made by the iPhone 5 during video recording have dynamic location info, even though they could: each of them does have a fully independent location tag. (Technically, these are simple framegrabs, vastly inferior to the full-resolution still images delivered by more decent cameras like the GoPro 3 Black Edition, the Nikon 1 system cameras or newer Sony models.) That is, if you travel in a car while continuously shooting a video and, without stopping video recording, tap the still snapshot icon on your iPhone 5's screen, the 1920*1080 still shot will have the exact same location tag as the video itself – the location where the video was tagged, not the actual location where the snapshot was taken.

    2. Third-party solutions readily available on iOS

    2.1 Ubipix


    Without doubt, the best application for recording dynamically geotagged videos, even with compass info, is the free Ubipix (AppStore link).

    Basically, when you record a video from inside the app, it also records all the location and camera direction info along with the video. These recordings are stored in the /Documents/defaultFolder folder of the app. Recording one video results in three files being created: the video recording itself (say, 1356684671.mov), a text file (.txt) containing all the location and compass info and, finally, a PNG framegrab of the video at the same resolution as the video itself. The video file isn't encrypted, which means you can edit it before uploading (probably important should you want to remove the audio track before making the video public).

    You can even shoot the video in another, much more capable app (say, the built-in Camera app or third-party ones like FiLMiC Pro) allowing for a lot more features of the current iPhone model – torch, snapshots on the iPhone 5, separate exposure / focus carets, a lot of additional settings in the stock Camera app on jailbroken devices via CameraTweak etc. – and then manually transfer the video to this Ubipix folder for later upload. (To create a compatible .TXT file with all the location / compass info in Ubipix's own format, you'll need to modify my location logger discussed in Section 3.2.) Also, you can shoot a video with an external camera and, after converting its video to a MOV container (with AVCHD videos, just remuxing via iVI) if it wasn't already shot in that one, do the same before upload.
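    As the three per-recording files share the same basename, pairing them up programmatically (for example, before batch-transferring them into or out of the defaultFolder directory) is easy. An illustrative sketch in Python – the folder layout and extensions are the ones described above; the helper name is mine:

```python
import os
from collections import defaultdict

def group_ubipix_recordings(folder):
    """Group Ubipix files (.mov video, .txt metadata, .png framegrab)
    by their shared basename, e.g. '1356684671'."""
    groups = defaultdict(dict)
    for name in os.listdir(folder):
        base, ext = os.path.splitext(name)
        if ext.lower() in (".mov", ".txt", ".png"):
            groups[base][ext.lower()] = os.path.join(folder, name)
    return dict(groups)
```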

    After finishing shooting (which, again, may involve a conversion with an external tool or even transferring videos from another camera), all you need to do is log in to your free Ubipix account in the Settings dialog of the iOS client (annotated by a red rectangle below):

    [screenshot]

    (an iPhone 5 shot; as you can see, the app doesn't make use of the full screen real estate of the newer, 16:9 devices)

    and then select the video to upload and tap the upload icon in the upper right corner. If you want to keep the video private, you'll need to tap the “Public” icon (the third from the top) before uploading so that it becomes “Private”.

    These two icons are both annotated in the following screenshot:

    [screenshot]

    After tapping “Upload”, you'll need to select a target channel; currently, only one is available (“Ubipix”):

    [screenshot]

    After tapping it, just check the “Public” checkbox. Don't be afraid: if you upload a private video, this won't make its visibility public:

    [screenshot]

    Then, just tap the Submit button, also annotated in the above screenshot.

    The video uploads and, after a while, becomes available. You'll be e-mailed the unique URL of your video. You can access it both through that link and by logging in from either a desktop or a mobile (including iOS) browser and tapping the “myTracks” link in the top right corner. You can also quickly navigate to your public videos on the main map if their location is unique. For example, the following shot shows three videos shot here in Finland (all three by me), after clicking the circle with the number in it:

    [screenshot]

    2.1.1 Ubipix problems, bugs and some fixes

    2.1.1.1 Disabling audio


    Note that the Settings dialog has an audio switch (annotated by a green rectangle below):

    [screenshot]

    While it's even disabled by default, it doesn't seem to have any effect on the upload. That is, there is no way to disable audio upload. The only way to get rid of the audio track is to manually edit the original files (removing the audio track) and upload the edited versions.

    For this, I recommend MP4Tools (albeit other MOV editor tools should also work). Just make sure you rename the .mov extensions to .m4v before dragging the files to MP4Tools and, after deleting the audio tracks, rename the files back to .mov. Note that the edited files become unplayable on the device itself, but not after upload. All the example videos below (see Section 2.1.2) were processed this way before upload – this was the only way to remove private family chatter (albeit it was in Finnish, so few would have understood it).
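    If you have many videos to process, the back-and-forth renaming can be scripted. The following Python sketch is illustrative only (the function name is mine); it just swaps extensions in a folder, the way described above:

```python
import os

def swap_extension(folder, old, new):
    """Rename every file whose extension is `old` to use `new` instead.
    Returns the list of new paths. E.g. swap_extension(d, ".mov", ".m4v")
    before MP4Tools, then swap_extension(d, ".m4v", ".mov") after editing."""
    renamed = []
    for name in os.listdir(folder):
        base, ext = os.path.splitext(name)
        if ext.lower() == old:
            dst = os.path.join(folder, base + new)
            os.rename(os.path.join(folder, name), dst)
            renamed.append(dst)
    return renamed
```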

    2.1.1.2 Other problems

    - Compass errors can't be fixed by simply adding a constant bias (see below)

    - Buggy iOS interface: with video lists taking up more than one page (that is, with more than 5 videos), the filesizes shown on subsequent pages will always be the same as those on the first page. Some examples:

    [screenshot]

    This is the first page of the list. Note the filesizes: 354.51, 6.79, 4.79, 8.94 Mbytes. Now, let's take a look at the second page (Ubipix organizes the videos by their shooting date):
    [screenshot]

    The filenames and the shooting dates are correctly displayed, and so are the thumbnails – but the sizes aren't. They're exactly the same as on the first page: 354.51, 6.79, 4.79, 8.94 Mbytes.

    The situation is the same on the third and fourth pages:
    [screenshot]

    [screenshot]

    Note that there are other display-related bugs. For example, the date of the previous upload is also displayed incorrectly (12/09/2012 in the screenshots in the upload-specific Section 2.1).

    - no desktop or local map rendering capabilities – you'll always need to upload the video and watch it streamed from Ubipix's servers.

    - there isn't any location-specific information (town / street names, or even all locations shown on a map) in videos before uploading, unlike with GeoVid, which is vastly superior in this respect. (See 2.2 for more info on GeoVid.)

    - limited recording time: 15 minutes. Note that, should you switch to the highest-quality (on the iPhone 4S and later, 1080p) mode, it automatically falls back to 5 minutes. (You can always change this with the dedicated slider in Settings.)

    - there's no way of downloading the video from Ubipix's servers as, say, an MP4 file (or, for that matter, in any other format). This is unlike traditional video (or photo) sharing sites like Vimeo, YouTube etc. That is, should you want to archive your videos, you'll need to make use of, say, iExplorer to directly access the Documents/defaultFolder folder of the Ubipix app. (Needless to say, it's not possible to do this via iTunes, as iTunes File Sharing isn't enabled by the app.)

    2.1.2 Ubipix examples you can try right away

    I've made three videos public so that you can check out how location / direction tracking works. They work just fine even without installing Ubipix on your iDevice. You'll want to use a desktop browser or an iPad for watching the examples – definitely not an iPhone / iPod touch. (See the next section, 2.1.3, for the whys.) All these videos were shot on my iPhone 5 here in Iisalmi, Finland.

    In-town video

    Note that this video shows a compass at least 90 degrees off in the default “Digital compass” mode; everything else is OK. (Of course, switching to the GPS Bearing mode fixes this but, then, the camera's current direction isn't shown at all. See Section 5 for more links to discussions of the difference between bearing (perfectly computable from as few as two pairs of simple GPS coordinates) and true, compass-based camera direction.)
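    For the curious: the GPS bearing really is computable from just two consecutive coordinate pairs. A Python sketch of the standard initial (forward azimuth) bearing formula – note that this gives you the direction of travel, not the direction the camera points:

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees (0 = north, 90 = east)
    from point 1 to point 2 - the 'GPS Bearing' figure, derived purely
    from two track points, with no compass involved."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360
```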

    A big problem with Ubipix is the lack of any way to define a constant bias, which would definitely have made the camera direction display correct in this and the following video.

    Outside the town

    In this video, the 110 → 150 m uphill in particular can nicely be seen on the bottom graph, in both the altitude and the speed (the car slows down while climbing). The following shot shows the position just before the uphill:

    [screenshot]

    Finally, let me present you a video where the compass did work properly (instead of in a car, it's shot outdoors):

    HERE

    This also shows that, even here in the North-east, the compass works OK.

    (cont'd below, in the second post)
     
    (continued from above)

    2.1.3 Playing videos after uploading

    As has already been mentioned in Section 2.1.1.2, it's not possible to (meaningfully – that is, along with location / direction info) play back videos without uploading them first.

    For playing the geotagged videos back, you won't need a separate application – just a Web browser. Even iOS Safari will do: it nicely renders and updates the location / direction info, along with the speed / altitude values, during playback, while playing the video in a window just like a desktop browser – at least on an iPad. An example screenshot:

    [screenshot]

    Unfortunately, on the small-screen iOS devices (iPhones and iPod touch models), videos can only be played back in full-screen mode and not in a window. This means you can't see the metadata dynamically changing during playback. You'll always need to pause playback and go back to the main Web view at the video positions where you want to see the then-current location / direction info. It's a bit slow but, if you really need to know where an object is, it works. (Nevertheless, you will still want to stick with an iPad or a desktop browser, which are all capable of non-fullscreen playback.)

    An example screenshot of how the Web interface looks on the iPhone 5:

    [screenshot]

    Note that tapping the Play arrow in the video will switch playback to fullscreen.

    2.2 GeoVid

    GeoVid (main homepage; AppStore link) is another application for dynamic geotagging. While it's in some respects (mainly local, before-upload metadata review support) vastly superior to Ubipix, I don't recommend it.

    For example, if you do upload your videos, they won't (necessarily) become accessible. I've uploaded some videos of Finland; none of them became publicly accessible, as opposed to uploads via Ubipix's app. I wonder whether one can make anything public at all. I've even searched for “Finland” and the like, without any success.

    Also, it's VERY slow to upload – at least from here (central Finland). Ubipix is waaaay faster on exactly the same network.

    There are no accounts with private storage, unlike with Ubipix; that is, you can't just upload your videos to have proper, dynamic location / direction support for your own consumption only.

    However, it has a definite advantage over Ubipix: it can display the route right inside the iOS app – without having to upload the video first. An example:

    [screenshot]

    Also, it lists the locations of all recordings right in the video list – something also painfully missing from the Ubipix app. Some examples:

    [screenshot]

    [screenshot]

    Note that it might take some time for these location names to be displayed in the list after finishing shooting.

    3. Shooting with an external camera

    While iPhone models starting with the iPhone 4 have finally considerably narrowed the gap between iPhones and top Nokia models (the 2009 iPhone 3GS was the first iPhone to shoot any video, while the early-2007 Nokia N95 already shot VGA video – almost two and a half years before Apple), they still suffer from some major problems rendering them pretty much incapable of shooting videos in a lot of circumstances:

    - lack of stereo audio recording (many still cameras and most video cameras do record stereo audio – and, yes, so do Nokia's top Symbian phones, even the early-2010 Nokia N8. Apple, you're at least three and a half years late, assuming the iPhone 5S/6 is released next summer.) And I haven't even mentioned high-amplitude audio recording (present in the Lumia 920 / Nokia 808), also missing from iPhones.

    - lack of a wide-angle lens while shooting hi-res video. We still have to live with an approximately 43mm-equivalent, that is, pretty narrow field of view while shooting videos. While most (consumer) video cameras also start at around 40mm (this is probably their biggest downside compared to still cameras that also shoot videos), consumer-priced (non-professional) still cameras generally start at a much better and wider 24...28mm, even when shooting videos. (Some – for example, the Nikon P500 – even start at a 22.5mm ultra-wide angle.) And, yes, Nokia's recent top phone models are much better in this respect as well: the Nokia 808 uses an equivalent of 26mm when shooting videos, while the Nokia Lumia 920 uses 28mm. (Apple, this is another area where you should do your homework.)

    - lack of rugged enclosures easily attachable to the human body, cars, surfboards, helmets etc. Ever seen something like the GoPro Head Strap Mount or the Chest Mount Harness for the iPhone? I bet you haven't. They'd be extremely good for e.g. sports shooting, but nothing is available. (Of course, without a wide-angle lens, it's pretty much impossible to use any iPhone model starting with the iPhone 4 for action shooting. There are, however, wide-angle / fisheye lens add-ons. Their quality – particularly that of the cheap Chinese knock-offs – may, of course, leave a lot to be desired.)

    All in all, there still are areas where it's just not feasible to use the iPhone for video shooting.

    (NOTE: I'm comparing to Nokia's high-end phones only as, video recording-wise, they have always been considerably better than anything else on the market at the time. There would be no point in comparing the iPhone 5 to, say, the Samsung Galaxy S III, which is inferior even to the iPhone 5 in this respect.)

    However, you can still use your iPhones to record not only the GPS coordinates, but also compass (direction) information. The latter is painfully lacking from standalone GPS trackers; for example, from my personal favorite, the BTGP-38KM.

    Of course, the differing orientations of a separate camera and the iPhone can render the compass information unreliable. (Unless you mount your iPhone on your camera or just keep them stacked up in one hand, of course.) There are cases, however, where the phone records true camera direction. An example is the above-linked GoPro Chest Mount Harness: if you put your iPhone in your shirt's (front) pocket with the screen facing your chest, it'll record reliable direction info, always recording the true direction of the camera. (If you don't have a pocket guaranteeing the iPhone's back will always face straight ahead, you can put your phone in any other pocket, assuming it stays tightly in there and won't wobble. However, you'll then need to apply a constant bias to the recorded data. Programmatically, this isn't hard at all – or you can even export the data to, say, an Excel spreadsheet and do the addition there.) Of course, this couldn't be done with the Head Strap Mount (unless, in some way, you also fasten your iPhone to your head).
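    Applying such a constant bias really is trivial; the only thing to watch out for is the wrap-around at 360 degrees. An illustrative Python sketch (the function name is mine):

```python
def apply_bias(headings, bias_degrees):
    """Apply a constant mounting bias to logged compass headings (degrees),
    wrapping the result into the 0...360 range."""
    return [(h + bias_degrees) % 360 for h in headings]

# E.g. a phone carried rotated 90 degrees clockwise relative to the camera:
# apply_bias(logged_headings, -90.0)
```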

    3.1 Readily available AppStore apps

    I've tested all AppStore GPS tracker apps containing the word “compass” in their description.

    GPX Master

    EveryTrail: Displays heading only in the summary screen – something easily computable using two track points (see THIS) and having nothing to do with the compass

    GPS Tracker: depends on external service; not for local GPX logging

    MapMyRun: same

    Speedometer: speed only

    GPS Cycle

    GPS Device Data: the free version is ad-supported (Premium is $1); no logging, only live display of the full GPS data (incl. compass)

    GPS Stone

    Maps+; $2.99 IAP: no sign of compass

    GPSRecorder

    None of these supports recording compass data. However, if all you need is location (not direction) recording, go for any of the titles above where I haven't explicitly stated it's incapable of logging to a file.

    3.2 No AppStore apps for compass logging: what now?

    If you (or any of your friends) have a (paid) developer account, you're saved: I've written an app that does exactly what is needed. It records the full current compass / location data whenever they change beyond thresholds of five degrees (compass) and one meter (location), to avoid exporting “noisy” data and using too much storage. This way, a 10-20-minute session only uses some hundreds of kilobytes.
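    The threshold-based filtering idea is simple; the sketch below shows it in Python, illustratively only – in the actual app, CLLocationManager's distanceFilter / headingFilter properties do this for me. Each sample here is a (heading in degrees, metres along the track) pair:

```python
def filter_samples(samples, heading_threshold=5.0, distance_threshold=1.0):
    """Keep a sample only when the heading changed by at least
    heading_threshold degrees, or the position moved by at least
    distance_threshold metres, since the last kept sample."""
    kept = []
    for heading, position in samples:
        if not kept:
            kept.append((heading, position))
            continue
        last_heading, last_position = kept[-1]
        dh = abs(heading - last_heading) % 360
        dh = min(dh, 360 - dh)  # wrap-around: 359 -> 1 is only a 2-degree change
        if dh >= heading_threshold or abs(position - last_position) >= distance_threshold:
            kept.append((heading, position))
    return kept
```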

    The entire source code of the project, available for re-compilation and deploying, is HERE. The app delegate's non-trivial methods (and one global variable) are as follows:

    NSOutputStream *outputStream;

    - (void)printStringToFile:(NSString *)stringToLog {
        NSData *dataToStoreInAFile = [stringToLog dataUsingEncoding:NSASCIIStringEncoding];
        NSRange dataRange;
        dataRange.location = 0;
        dataRange.length = [dataToStoreInAFile length];
        uint8_t buffer[dataRange.length];
        [dataToStoreInAFile getBytes:buffer range:dataRange];
        [outputStream write:buffer maxLength:dataRange.length];
    }

    - (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading {
        // NSLog(@"currentHeading: %@", newHeading);
        if (newHeading.headingAccuracy < 0) return; // invalid reading
        NSString *newRow = [NSString stringWithFormat:@"%@: %@\r", [NSDate date], newHeading];
        [self printStringToFile:newRow];
    }

    - (void)locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation {
        // NSLog(@"locationLoc: %@", newLocation);
        if (newLocation.horizontalAccuracy < 0) return; // invalid reading
        NSString *newRow = [NSString stringWithFormat:@"%@: %@\r", [NSDate date], newLocation];
        [self printStringToFile:newRow];
    }

    - (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
        [self.window makeKeyAndVisible];

        CLLocationManager *locManager = [[CLLocationManager alloc] init];
        locManager.delegate = self;
        locManager.distanceFilter = 1; // metres
        locManager.headingFilter = 5;  // degrees
        locManager.desiredAccuracy = kCLLocationAccuracyBest;
        [locManager startUpdatingLocation];
        if ([CLLocationManager headingAvailable])
            [locManager startUpdatingHeading];

        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDir = [paths objectAtIndex:0];
        long long currTimeMs = [[NSDate date] timeIntervalSince1970] * 1000;
        NSString *fileToRender = [documentsDir stringByAppendingString:[NSString stringWithFormat:@"/%lld-data.txt", currTimeMs]];
        // NSLog(@"fileToRender: %@", fileToRender);

        BOOL createSuccessful = [[NSFileManager defaultManager] createFileAtPath:fileToRender contents:nil attributes:nil];
        if (createSuccessful) {
            NSLog(@"File creation successful!");
            outputStream = [[NSOutputStream alloc] initToFileAtPath:fileToRender append:NO];
            [outputStream open];
        }
        return YES;
    }

    - (void)applicationWillTerminate:(UIApplication *)application {
        // Close the output stream when the app is killed from the task manager
        [outputStream close];
    }


    The interface declaration:

    @interface CompassDemooooAppDelegate : NSObject <UIApplicationDelegate, CLLocationManagerDelegate> {
        UIWindow *window;
    }
    @property (nonatomic, retain) IBOutlet UIWindow *window;
    @end


    Note that this is really a barebones app. I haven't even created a view controller – all the business logic, as has already been mentioned, is in the app delegate. Basically, it works as follows:

    1, in didFinishLaunchingWithOptions, I initialize the location / compass reading of CLLocationManager. Also, I create a file whose name is based on the number of milliseconds elapsed since midnight Coordinated Universal Time (UTC), 1 January 1970 (that is, Unix time), so that it surely doesn't overwrite previous tracks. These tracks are then accessible via simple iTunes File Sharing (as the .plist file of the project contains the UIFileSharingEnabled boolean with the value “YES”).

    By the way, the .plist also contains an UIBackgroundModes array with a single element of “location” to make sure recording also continues while the phone is suspended or another app (even one using the location services) runs, including the built-in camera app.

    2, the location callback, didUpdateToLocation, simply logs the full location data to the file.

    3, the compass callback, didUpdateHeading, does the same to the compass data.

    4, finally, applicationWillTerminate closes the output stream – it will be called upon terminating the app from the task manager. Note that you can access the content of the log file before closing it, and no harm is done if it isn't closed.

    Note that there's a helper method, printStringToFile, which does the actual file writing. It doesn't use NSString's writeToFile, so that the to-be-logged data doesn't need to be accumulated in memory first (which would risk losing it entirely upon some kind of failure, including the app crashing when it runs out of memory), as is done, say, HERE. Instead, it uses low(er)-level file writing with NSString → NSData conversion and [NSOutputStream write:maxLength:], hence the somewhat longer code. As you may know, there isn't anything like Java's PrintStream / PrintWriter.println() or C's fprintf() in the iOS libraries.
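    If the Objective-C is hard to follow, the design choice boils down to this: write each record to disk as it arrives, instead of accumulating everything in memory and writing once at the end. An illustrative Python analogue (not part of the app):

```python
class StreamLogger:
    """Minimal analogue of the NSOutputStream approach: each record goes
    to disk immediately, so data already written survives a later crash."""

    def __init__(self, path):
        self._f = open(path, "w", encoding="ascii")

    def log(self, line):
        self._f.write(line + "\r")  # the app uses \r as record separator
        self._f.flush()             # push each record to the OS right away

    def close(self):
        self._f.close()
```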

    Also note that, as the app doesn't have any UI components at all, all you'll see is a white screen while running it. Logging starts right when you launch the app and continues until you kill it.

    3.2.1 Enhancing my app

    Should you want to make it more universal by exporting the GPS / compass data into a standardized GPX file (Wiki),

    1, you need to be aware that the GPX standard doesn't have any standardized way of storing compass data (this is also mentioned HERE). I recommend following the MikroKopter.de extension of the GPX standard explained HERE – most importantly, the <Compass> tag. (Note that the <Course> tag isn't based on the values the hardware compass reports: it's the standard bearing value the GPS unit reports and will always depend on the actual speed vector of the camera, not its (arbitrary) direction.) Of course, you can come up with your own tag extensions if you want to.

    Since the location and compass callbacks fire separately, you can store the most recent value of each in a global or method-local static variable, so that every GPX record can contain both the location (based on GPS) and the camera direction (compass).

    2, instead of creating and writing the GPX file directly, you can use free-to-use GPX creation libraries for iOS. I know of only one: GPX-Logger (homepage). Note that it uses the dynamic framework HERE. (More info & discussion HERE.)

    Let me know if you really need the recorder to log to a particular format but can't extend my app yourself. If (and only if) I have some time, I'll create the enhanced version.
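    To illustrate what such a GPX export could emit, here's a Python sketch building a single track point with the compass value in an <extensions> child. Treat the element names as illustrative only: check them against the extension scheme linked above before relying on them.

```python
import xml.etree.ElementTree as ET

def gpx_trackpoint(lat, lon, elevation, compass_degrees):
    """Build one GPX <trkpt> carrying the compass reading in an
    <extensions> child. The <Compass> element name follows the
    non-standard extension mentioned above; it is not part of GPX itself."""
    trkpt = ET.Element("trkpt", lat=f"{lat:.6f}", lon=f"{lon:.6f}")
    ET.SubElement(trkpt, "ele").text = f"{elevation:.1f}"
    ext = ET.SubElement(trkpt, "extensions")
    ET.SubElement(ext, "Compass").text = f"{compass_degrees:.1f}"
    return ET.tostring(trkpt, encoding="unicode")
```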

    4. What about processing GPS (and possibly also compass) data in AVCHD files?

    This section is only meant for people with a camera that records true, dynamic GPS (and, with some Sony cameras, also compass) data into AVCHD file structures. It will NOT be usable for people who only want to use their iPhone, or their iPhone plus another camera without GPS / compass capabilities. However, as this info is closely related to the subject of this article and the project it links to isn't well publicized, I decided to dedicate an entire section to it.

    There is an app, AVCHD2SRT, for creating a frame-specific (timed) SRT subtitle file from existing AVCHD videos HERE. Note that the discussion HERE not only covers the app itself, but also all the related practical questions.

    5. Other discussions

    THIS thread talks a lot about the differences between the GPS bearing (easily computable via, say, MapSource) and the true compass direction.

    THIS discussion links to several (non-iOS/compass-specific) video mapping services.
     
    UPDATE (01/01/2013): 1, I've played quite a bit with the (proprietary) location / direction metadata (the .TXT file) format of Ubipix, both to enable cutting videos before uploading (something currently impossible) while keeping the metadata correct, and to be able to export right to this format from my own location / direction recorder app (see the sources above).

    Fortunately, video editing, while in no way supported by the Ubipix client, can be done flawlessly using an external MOV editor. I used one of the best quick video editors, QuickTime Pro 7, for cutting the first 116 seconds out of a fireworks video I shot in Iisalmi, Finland, yesterday (it's online; link). The (cut) video I saved from QT7 Pro was accepted by the Ubipix app without problems.

    Cutting the .txt file, however, was more problematic. It does need to be cut: when I uploaded the original TXT file with the heavily cut new video, Ubipix's server just wouldn't accept it – it silently refused the upload.

    Basically, if you cut X seconds from the beginning, you'll need to remove the first X "features" records from the file. A real-world example follows: here's the metadata file of my Iisalmi video, with records 4…352 (the location / direction values of the fourth through 352nd second of the video) removed:

    {"detail":{"title":"testwithcutlast120records1356983903","hashtag":"Test hashtag","group":"Test group","description":"Test Description","private":1,"instance":"Test instance"},"features":[{"geometry":{"properties":{"fov":51.06,"pitch":339.1122,"speed":5.781224,"decl":4,"time":1356983903743,"accuracy":5,"bearing":251.7188,"yaw":264.1093,"roll":285.6798},"type":"POINT","coordinates":[27.18814,63.55495,85.979]},"type":"Feature","vsec":"1"},{"geometry":{"properties":{"fov":51.06,"pitch":337.5637,"speed":5.539752,"decl":4,"time":1356983904726,"accuracy":5,"bearing":253.125,"yaw":264.1093,"roll":286.1889},"type":"POINT","coordinates":[27.18805,63.55494,85.90985]},"type":"Feature","vsec":"2"},{"geometry":{"properties":{"fov":51.06,"pitch":338.0775,"speed":5.313472,"decl":4,"time":1356983905726,"accuracy":5,"bearing":251.3672,"yaw":263.1093,"roll":287.5141},"type":"POINT","coordinates":[27.18797,63.55493,85.43353]},"type":"Feature","vsec":"3"}, ... ,{"geometry":{"properties":{"fov":51.06,"pitch":358.6461,"speed":0.149863,"decl":4,"time":1356984254986,"accuracy":10,"bearing":40.78125,"yaw":245.3681,"roll":234.0427},"type":"POINT","coordinates":[27.1858,63.55468,79.14551]},"type":"Feature","vsec":"354"}],"account":{"ver":"Ver:1.3","device":"XX:XX:XX:XX:XX","user":"default_user"}}

    Here, the records starting with "geometry" and bounded by {}'s are the ones you'll need to identify when cutting. For example, in the above example, the first record is as follows:

    {"geometry":{"properties":{"fov":51.06,"pitch":339.1122,"speed":5.781224,"decl":4,"time":1356983903743,"accuracy":5,"bearing":251.7188,"yaw":264.1093,"roll":285.6798},"type":"POINT","coordinates":[27.18814,63.55495,85.979]},"type":"Feature","vsec":"1"}
    And the second one (immediately following, separated by a single “,”) is as follows:

    {"geometry":{"properties":{"fov":51.06,"pitch":337.5637,"speed":5.539752,"decl":4,"time":1356983904726,"accuracy":5,"bearing":253.125,"yaw":264.1093,"roll":286.1889},"type":"POINT","coordinates":[27.18805,63.55494,85.90985]},"type":"Feature","vsec":"2"}

    Notice the last key-value pairs in these records: "vsec":"1" for the first, "vsec":"2" for the second, etc. Even the key name (vsec) tells you what it is: “video second”.

    This means all you need to do is remove the first X of these records from the file, making sure the header (telling Ubipix the name of the video) and the footer (with the device ID (XX'ed out above) and some other fields) remain intact. It's no problem if you don't renumber the remaining records after deleting a section from the beginning of the video: “vsec” can safely start with, say, 116 (as below); it will still be treated as the first second of the uploaded video.
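    Since the .TXT file is plain JSON, the trimming described above can also be scripted instead of hand-edited. Here's a minimal sketch (the function and file names are my own placeholders, not anything defined by Ubipix):

```python
import json

def cut_ubipix_metadata(in_path, out_path, seconds_cut):
    """Drop the first `seconds_cut` per-second "features" records from a
    Ubipix metadata (.txt) file, keeping the header ("detail") and the
    footer ("account") intact. No "vsec" renumbering is performed - as
    noted above, Ubipix doesn't require it."""
    with open(in_path) as f:
        data = json.load(f)
    # One "features" record corresponds to one second of video.
    data["features"] = data["features"][seconds_cut:]
    with open(out_path, "w") as f:
        json.dump(data, f, separators=(",", ":"))
```

    With the cut described above, calling cut_ubipix_metadata("track.txt", "track-cut.txt", 116) would remove the first 116 per-second records in one go.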

    For example, with the above Iisalmi example, the resulting metadata file has become as follows (again, I've left out most of the inner records):

    {"detail":{"title":"cutFIRST115recordsWITHOUTrenumbering1356983903","hashtag":"Test hashtag","group":"Test group","description":"Test Description","private":1,"instance":"Test instance"},"features":[{"geometry":{"properties":{"fov":51.06,"pitch":11.8216,"speed":1.32344,"decl":4,"time":1356984018724,"accuracy":5,"bearing":172.9688,"yaw":193.4087,"roll":284.5805},"type":"POINT","coordinates":[27.1875,63.55501,70.06348]},"type":"Feature","vsec":"116"},{"geometry":{"properties":{"fov":51.06,"pitch":11.78718,"speed":1.33215,"decl":4,"time":1356984019706,"accuracy":5,"bearing":181.0547,"yaw":187.4087,"roll":283.8172},"type":"POINT","coordinates":[27.18749,63.555,69.67194]},"type":"Feature","vsec":"117"},

    ...

    {"geometry":{"properties":{"fov":51.06,"pitch":358.6495,"speed":0,"decl":4,"time":1356984352000,"accuracy":5,"bearing":40.78125,"yaw":252.3681,"roll":224.9837},"type":"POINT","coordinates":[27.18582,63.55467,75.48431]},"type":"Feature","vsec":"470"},{"geometry":{"properties":{"fov":51.06,"pitch":358.6495,"speed":0,"decl":4,"time":1356984352000,"accuracy":5,"bearing":40.78125,"yaw":252.3681,"roll":224.9837},"type":"POINT","coordinates":[27.18582,63.55467,75.48431]},"type":"Feature","vsec":"471"}],"account":{"ver":"Ver:1.3","device":"XX:XX:XX:XX:XX","user":"default_user"}}


    With these changes, you can already transfer the MOV and the TXT data back to your iDevice and initiate the upload.

    If you take a look at the video (as has already been mentioned, it's HERE), you'll notice the location data is a bit (some metres) off but the direction works just fine, as can also be seen in the following screengrab:
    [​IMG]

    Note that the lens of my iPhone 5 might have been wet (snow was falling constantly); hence the strange silhouettes over the bright lights.

    2, Note that one of the biggest problems with the Ubipix app is that it doesn't give you access to the torch control: the torch will be lit whenever the app thinks it should be. This can't really be fixed (other than by waiting for my updated logger and doing all the shooting from the stock Camera app, with my logger running in the background). For the time being, just do the following: when you start shooting, point the lens at a bright source of light so that the torch won't light up. After that, you're free to continue shooting anything else. Finally, just edit out the first 1-2 seconds (the strong light source used to block the torch) using the method I've explained above.

    BTW, another Iisalmi fireworks video I shot yest… I mean today uses exactly this trick to keep the torch unlit (avoiding a much worse image caused by torch-lit snow falling close to the lens) – see THIS video (Ubipix live video link).

    A screengrab showing the light at the beginning (and also showing the infamous PurpleView effect of the iPhone 5's optics – now, with direct light in the field of view of the camera):

    [​IMG]
     
  4. Menneisyys2 thread starter macrumors 603

    Joined:
    Jun 7, 2011
    #4
    A self-standing article - a followup to the above one(s)

    Ubipix-compliant location / direction logger released (with full source code + a quick ARC migration case tutorial)

    In the update section of my previous article (see above) on advanced location and camera direction tagging of videos, I've already mentioned that Ubipix – currently the only iOS application to record truly dynamic, almost frame-level (metadata is sampled every second) location and direction info – in no way has a decent built-in video recorder.

    For example, as explained in the article, it's not possible to directly instruct the app not to switch on the torch in low-light situations, unlike with the stock Camera app's video recorder. Not mentioned in the article, but certainly visible in both of my Iisalmi fireworks videos published the day before yesterday (HERE and HERE – Ubipix links), is that continuous autofocus pretty much messes up videos like those of fireworks. While the stock Camera recorder also uses continuous autofocus, several third-party recorders allow for fully manual focusing, which makes it possible to avoid those really ugly focus hunts.

    Fortunately, with the new version of my now-Ubipix-compliant logger, it has become possible to use any other app to record your videos while running the logger on the same iPhone (or 3G iPad). Of course, you can also run the logger on a separate iOS device. And, as explained in the original article, you can use any non-iOS camera (e.g., a GoPro action / sports camera) – just make sure you synchronize the internal clock of the camera with that of the iPhone logging the location / direction info.

    My logger

    First, my recorder. As usual, it's available as a freely compilable and deployable Xcode project. I've created a completely new Xcode 4.x project to cleanly incorporate, among other things, Automatic Reference Counting (ARC; tutorial). The full Xcode project, ready for deployment, is HERE. Feel free to deploy it on your iDevice. I hope you find it useful (and/or the source code instructive).

    Switching to ARC meant having to declare the CLLocationManager instance (previously a local variable) as a strong property; otherwise, it would simply cease to exist after a second or two. ARC, on the other hand, saved me from having to manually retain / release the now-global location (CLLocation) / direction (CLHeading) instances assigned in the two callback methods, - (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations and - (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading, respectively. (Note that the former has also been updated to conform to Apple's recommendation for iOS 6+: it no longer uses the now-deprecated locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation callback. Dedicated discussion also HERE.)

    Note that in the first, non-Ubipix-specific version of my app, I didn't need any global variables (or properties) because I did all the direct file writes from CLLocationManagerDelegate's callback methods. However, Ubipix's data format can't be output from those two methods, for the following reasons:

    - a single record must be created every second, independent of whether there's any change or not.
    - both location and direction data must be present in each record, meaning there must be a way to communicate the last-used values between the two callback methods so that both can be output at the same time
    - there is no way of accessing location / direction info synchronously, that is, from a method called via a timer, meaning we absolutely must use CLLocationManagerDelegate's callbacks.

    This is why I needed to create two new global variables, CLLocation *savedLocation and CLHeading *savedHeading. Both are set from the CLLocationManagerDelegate callbacks. Without ARC, they would have to be retained in the callback methods so that they're still alive when the timed callback (here, mainTimerCallback) executes and accesses them – and the latter would then need to release them to avoid memory leaks. The fact that I also need to copy the location reference to a backup variable (CLLocation *prevTrueLocation), in order to output the previous value when location info becomes unavailable for some time, would have made the release logic even more complicated. In short, the ARC approach seriously simplified the code (apart from having to promote the previously strictly local CLLocationManager instance to a property).

    This has already been mentioned, but it's worth dedicating a full paragraph to. In the previous version of the logger, I simply didn't export anything while I didn't have any meaningful location (and heading) information. Ubipix's logging format, however, requires records to be output from the very first second. As you may not have meaningful location info in the first few seconds, in the mainTimerCallback method's large conditional if expression I just return without doing anything (other than increasing the counter of the current second, currSecondIndex, which is needed so that, after first acquiring usable location data, the same location info can be output for all the preceding seconds). (Note that I've added a lot of remarks in this method explaining why the different conditional branches are needed and when execution enters them.)
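    To illustrate the logic just described – this is only a Python sketch of the idea; the actual app is Objective-C, and all the names below are mine, not from its source:

```python
class PerSecondLogger:
    """Sketch of the per-second record logic: before the first usable fix
    arrives, only the second counter advances; once a fix arrives, the
    skipped seconds are back-filled with that first fix."""
    def __init__(self):
        self.curr_second_index = 0
        self.records = []
        self.saved_location = None   # set from the location callback

    def timer_callback(self):
        self.curr_second_index += 1
        if self.saved_location is None:
            return  # no usable fix yet: just count the second
        if not self.records:
            # First usable fix: back-fill every earlier second with it.
            for sec in range(1, self.curr_second_index):
                self.records.append({"vsec": str(sec), "loc": self.saved_location})
        self.records.append({"vsec": str(self.curr_second_index), "loc": self.saved_location})
```

    This keeps the output Ubipix-compliant: there's exactly one record per second from second 1 onwards, even if the GPS only delivered its first fix a few seconds into the recording.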

    Should you output an empty CLLocation for currently-unknown locations, Ubipix's server will consider the video to have been shot at the literal 0,0 Earth coordinates, as in the following screenshot (as usual, click for the original, better-quality one!):

    [​IMG]

    The recorded tracks are named the same way as in the previous version.

    Note that I've used the standard [NSTimer scheduledTimerWithTimeInterval:…] for setting up the repeated callbacks. It makes it possible to call back non-CLLocationManagerDelegate methods. Should you delete the NSTimer-based setup and switch to the [CADisplayLink displayLinkWithTarget:…] call above it (commented out), it'll also fire (even when running in the background), but only while something is being rendered on the screen. If you want to use the app as a generic logger (mostly with the screen off), you won't need to change anything. If, on the other hand, you want to use the logger for background tracking strictly for videos recorded on the same iPhone, you might want to enable the CADisplayLink-specific code (and disable the NSTimer instruction) – after all, it'll help reduce the log file size, as nothing will be logged while the device is suspended.

    Using the logger

    Let's see how the logger should be used with a video recorded on the same iPhone. (With an external camera, you'll need to synchronize the clocks and/or convert to a Ubipix upload-compliant format first.)

    First, you'll need the exact timestamp of the start of the recording. Unfortunately, neither VLC nor QuickTime Player displays this. A screenshot of the latter:

    [​IMG]

    However, the free and absolutely excellent MediaInfo does:

    [​IMG]

    (note the annotated part).

    It's this info (NOT the “Tagged date”) that you'll need to convert into Unix time. For this, use, for example, the online converter at http://www.onlineconversion.com/unix_time.htm . Just fill in the “Convert a Date/Time to a Unix timestamp” fields (annotated in red below) and click the “Submit” button. The result you'll need will be displayed in the text field immediately below the buttons. I've annotated it with a green rectangle in the following screenshot:

    [​IMG]

    (Note that UTC times need to be entered, not local ones – here in Finland, EET. This is why I've subtracted two from the hours.)
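    If you'd rather skip the online converter, the same conversion takes only a few lines of script; MediaInfo's “Encoded date” is already given in UTC, so no timezone arithmetic is needed (the function name is my own placeholder):

```python
from datetime import datetime, timezone

def mediainfo_to_unix(encoded_date):
    """Convert a MediaInfo 'Encoded date' value such as
    'UTC 2012-12-31 22:00:00' into a Unix timestamp (seconds)."""
    ts = encoded_date.replace("UTC", "").strip()
    # Parse as an aware UTC datetime, then take its epoch seconds.
    dt = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    return int(dt.timestamp())
```

    For example, mediainfo_to_unix("UTC 2013-01-01 00:00:00") yields 1356998400 – the value you'd then search for in the logger output.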
    Just copy it to the clipboard, open the output of my logger and search for the value. There will most probably be only one hit:

    [​IMG]

    It's up to (and including) the previous record (in the screenshot above, the one of vsec 70) that you'll need to delete records from the track log – and the records past the end of the video as well. (For example, if it's a two-minute video, you'll need to delete the records starting with vsec 71+2*60+1 = 192.)
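    This windowing can be scripted too: a sketch that keeps only the records covering the video, dropping everything before the first matching second and everything past the end of the video (again, the names are my own placeholders):

```python
import json

def trim_track(in_path, out_path, first_vsec, video_length_secs):
    """Keep only the "features" records from `first_vsec` up to
    first_vsec + video_length_secs. With the two-minute example above,
    trim_track(..., 71, 120) drops vsec <= 70 and vsec >= 192."""
    with open(in_path) as f:
        data = json.load(f)
    last_vsec = first_vsec + video_length_secs
    data["features"] = [r for r in data["features"]
                        if first_vsec <= int(r["vsec"]) <= last_vsec]
    with open(out_path, "w") as f:
        json.dump(data, f, separators=(",", ":"))
```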

    After this, you can already upload the two files (the original video recording and the edited track file) to Ubipix's server via myTracks > Upload Recording (direct link).

    An example

    Among other videos, the one HERE (Sonkajärvi, Finland) has been processed exactly this way, with my logger and the stock Camera app.
     
  5. Menneisyys2 thread starter macrumors 603

    Joined:
    Jun 7, 2011
    #5
    UPDATE (04/01/2013): after talking to the developers of the app, I've found out that you can save the original videos (converted to WebM, playable by, among other players, VLC) with some manual work.

    Basically, all you need to do is switch to the source code view. In Firefox, right-click anywhere on the page (but not on an active component) and select “View Page Source”.

    You'll be shown the source window, where you'll need to search for “webm” to quickly find the video. The second hit will be a directly right-clickable link:

    [​IMG]

    (The shot also shows the search for "webm".)

    Just right-click the link and select “Copy Link Location”. Then paste the just-acquired URL into a new browser tab, where all you need to do is directly save the current page (Cmd + S on a Mac).

    In Chrome, finding the URL is even easier (searching for “webm” in the source window works just fine there too). Right-click the video and select “Inspect element” (highlighted in the following screenshot):

    [​IMG]

    Navigate one or two rows upwards to the long <video> tag (highlighted below):

    [​IMG]

    Right-click the link there. In the context menu presented, select “Open Link in New Tab” (highlighted below):

    [​IMG]

    In the new tab, you can just press Cmd + S to directly save the video (as you'd do in Firefox):
    [​IMG]
     
  6. VideoMapa macrumors newbie

    Joined:
    Jan 17, 2013
    #6
    Questions about recording time (Ubipix / Geovid)

    Hello,
    Ubipix - Only 15 min ?
    GeoVid - ?
    What can I do to shoot more time?
    Regards.
     
  7. Menneisyys2 thread starter macrumors 603

    Joined:
    Jun 7, 2011
    #7
    Using Ubipix, you can't. HOWEVER, if you use an external recorder and my logger (ask a friend with Xcode to compile and deploy it on your handset), you can record videos up to 2 GBytes in size – 2 GB is the max Ubipix's servers accept.

    Using lower-res modes or bitrates (use a client that lets you record in these formats, OR post-convert your videos before uploading), you can record and upload even hours-long videos.
     

Share This Page