
babelshot · macrumors newbie · Original poster · Sep 17, 2009 · Seattle, WA
I am looking for beta testers for the Babelshot app, due for release in the App Store soon. Babelshot translates any text snapped with the iPhone camera into different languages. You can use it for translating printed articles, snippets from books or newspapers, etc.

Requirements:
- An iPhone 3GS, or a previous-generation iPhone with a Clarifi case, is recommended (but not required).
- Non-English speakers and bilinguals are welcome.
- Previous beta testing experience is a plus.

What is expected from you:
- Install and use the app for a reasonable period of time.
- Report any bugs or issues you notice.
- Provide honest feedback on the general look and feel of the app.

Only 20 slots available. If interested, send your device ID to babelshot@codiumlabs.com or DM @babelshot on Twitter.
 
Babelshot has shipped and is available in the App Store for only $1.99:
http://linktoapp.com/babelshot

Here's a cool video shot by the Japanese review site iPod Touch Lab:
http://www.youtube.com/watch?v=Gb6xncXg1PY

The first independent review is online:
http://tinyurl.com/yf6jkm5

Screenshots of the app in action:
[attached screenshot: sequence_1.jpg]


Enjoy!
 
You know how RedLaser is able to scan barcodes without you actually having to snap pictures? Like, it automatically scans in the area and beeps when it recognizes a barcode?

I think the killer app related to what you're doing (that's a very big challenge but would make you tons of money if you pulled it off) would be to make an augmented-reality translator program.

The idea behind this would be that the user doesn't have to snap any pictures; they just pick the source and target languages. Then any text in the source language that's on the iPhone camera screen is automatically translated into the target language, and the translation is overlaid on top of the source text in the image.

This type of program would be invaluable for anyone travelling to a foreign country where they don't understand the native language. Just aim the camera at restaurant menus, street signs, newspapers, etc., and it would automatically overlay the translation on the image.

If you wrote something like that which worked with 50% or more accuracy, I'd be willing to pay like $19.99 for it. :p
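Just to make the idea concrete, here's a very rough sketch of what the per-frame loop could look like. It leans on Apple's Vision framework for the OCR step (nothing like that existed on the iPhone back then), and translate(_:from:to:) is purely a placeholder for whatever translation backend you'd plug in:

[CODE]
import Vision
import CoreGraphics

// Rough sketch of a per-frame "point and translate" pipeline.
// Assumptions: a Vision-style OCR request is available, and translate(_:from:to:)
// is a stand-in for whatever translation backend the app actually uses.
func translateFrame(_ frame: CGImage,
                    from source: String,
                    to target: String,
                    completion: @escaping ([(text: String, box: CGRect)]) -> Void) {
    // 1. Recognize text regions in the camera frame.
    let request = VNRecognizeTextRequest { request, _ in
        let observations = (request.results as? [VNRecognizedTextObservation]) ?? []
        var overlays: [(text: String, box: CGRect)] = []
        for observation in observations {
            guard let candidate = observation.topCandidates(1).first else { continue }
            // 2. Translate each recognized snippet (placeholder call).
            let translated = translate(candidate.string, from: source, to: target)
            // 3. The observation's bounding box (normalized coordinates) says
            //    where to draw the overlay on top of the source text.
            overlays.append((text: translated, box: observation.boundingBox))
        }
        completion(overlays)
    }
    request.recognitionLanguages = [source]

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    do {
        try handler.perform([request])
    } catch {
        completion([]) // OCR failed for this frame; just skip it
    }
}

// Placeholder: a real app would call an on-device or server-side translator here.
func translate(_ text: String, from source: String, to target: String) -> String {
    return text // stub
}
[/CODE]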
 
You know how RedLaser is able to scan barcodes without you actually having to snap pictures? Like, it automatically scans in the area and beeps when it recognizes a barcode?

I think the killer app related to what you're doing (that's a very big challenge but would make you tons of money if you pulled it off) would be to make an augmented-reality translator program.

Is it really that much of a time expense to hit a button to snap a picture? What this app does (according to information posted) is already a HUGE win.
 
Is it really that much of a time expense to hit a button to snap a picture? What this app does (according to information posted) is already a HUGE win.

The time does add up. Like if you're scanning through a foreign menu and want to find the steak dish, you don't want to have to snap pictures of every page and do them one at a time. Or if you're reading a newspaper in a foreign language you don't want to take snapshots of every 3-inch block in the newspaper, you want to be able to move the phone down the page and read as you go along.
 
I want to be able to translate images chosen from my Photo albums instead of taking pictures.
 
The time does add up. Like if you're scanning through a foreign menu and want to find the steak dish, you don't want to have to snap pictures of every page and do them one at a time. Or if you're reading a newspaper in a foreign language you don't want to take snapshots of every 3-inch block in the newspaper, you want to be able to move the phone down the page and read as you go along.

...Lazy :rolleyes:
 
I was a little skeptical about this app after reading in the title that it's called Babeshot.

It makes a lot more sense that it's actually called Babelshot.
 
OP, edit your title.

I want to wait for some extensive reviews on this one, but it sounds great. One thing I loved about this cheap Samsung flip phone I had was that I could speak my text messages. I want to do that.
 
Looks interesting!

Great idea. I usually read and take a lot of notes. Is there a way, for example if I was reading a macroeconomics book, to take parts of the page and save them as notes somewhere? Also, how well does this work on an iPhone 3G? I would definitely love to see a review before I buy!
 

Why is everyone hating on gillybean? He is being innovative and groundbreaking. The future is real-time information overlaid onto the world around us. Wake up or shut up. Just because an older process works doesn't mean it's as useful or easy to use. If I could glance at a sign and read it in English rather than taking a photo of it and then reading the converted result, I would prefer the former. That's the whole point of the 3GS and augmented reality: the power to do amazing, new things in your pocket at all times. It's not being lazy, it's thinking different. Has the Mac community lost its way?
 
Why is everyone hating on gillybean? He is being innovative and groundbreaking. The future is real-time information overlaid onto the world around us. Wake up or shut up. Just because an older process works doesn't mean it's as useful or easy to use. If I could glance at a sign and read it in English rather than taking a photo of it and then reading the converted result, I would prefer the former. That's the whole point of the 3GS and augmented reality: the power to do amazing, new things in your pocket at all times. It's not being lazy, it's thinking different. Has the Mac community lost its way?

For gillybean it's about hating on a perfectly good product because it doesn't work as nicely and easily as they want it to. It's just too hard to wait a whole 3 extra seconds to snap a photo and have it process.

This isn't like typical augmented reality apps, where they take your location and direction and overlay predetermined data from a database. This app needs to decode what the phone sees, which takes time. If your phone is moving around (hold it 'steady' and try to read the text on the screen that your camera is showing), the app likely wouldn't be able to properly analyze the data.
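And even if you did try a live mode, you'd probably have to gate the expensive analysis on the phone being held still. Here's a rough sketch of that idea using today's CoreMotion API (which those iPhones didn't even expose); the 0.05 rad/s threshold is just made up for illustration:

[CODE]
import CoreMotion
import Foundation

// Sketch of one way to cope with the "phone is moving" problem: only hand frames
// to the (slow) OCR step while the device is roughly still.
let motionManager = CMMotionManager()

func startGatedAnalysis(processCurrentFrame: @escaping () -> Void) {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let rotation = motion?.rotationRate else { return }
        let spin = sqrt(rotation.x * rotation.x +
                        rotation.y * rotation.y +
                        rotation.z * rotation.z)
        // Skip frames while the camera is sweeping around; analyze only when steady.
        if spin < 0.05 {
            processCurrentFrame()
        }
    }
}
[/CODE]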
 
I think the killer app related to what you're doing (that's a very big challenge but would make you tons of money if you pulled it off) would be to make an augmented-reality translator program.

The idea behind this would be that the user doesn't have to snap any pictures; they just pick the source and target languages. Then any text in the source language that's on the iPhone camera screen is automatically translated into the target language, and the translation is overlaid on top of the source text in the image.

This type of program would be invaluable for anyone travelling to a foreign country where they don't understand the native language. Just aim the camera at restaurant menus, street signs, newspapers, etc., and it would automatically overlay the translation on the image.

I'm working on this as version 2 or 3. There's some nice progress already, but it's far from releasable quality yet. The major obstacle is the relatively weak iPhone CPU.

Follow @babelshot on Twitter if you're interested in this subject. I may be able to put a video of augmented-reality-like feature prototypes on YouTube in the next several months.
 
For gillybean it's about hating on a perfectly good product because it doesn't work as nicely and easily as they want it to. It's just too hard to wait a whole 3 extra seconds to snap a photo and have it process.

This isn't like typical augmented reality apps, where they take your location and direction and overlay predetermined data from a database. This app needs to decode what the phone sees, which takes time. If your phone is moving around (hold it 'steady' and try to read the text on the screen that your camera is showing), the app likely wouldn't be able to properly analyze the data.

I wasn't trying to hate on a good product; I was just pointing out the delta between the app as a "that could be neat, not sure if I really want it" app and the app as a "this is an absolute must-have for a significant part of the userbase, and people would be willing to pay lots for it" app. For RedLaser, I think sales went way up when it could be done in real time without snapping pictures.

I acknowledged that this is a very difficult problem, and it's great that it sounds like babelshot is working toward it as a future version. I hadn't thought of the limitations before, but now that babelshot points it out, it does seem like it could require a CPU more powerful than what the iPhone currently uses.

Maybe there are some design tricks that could be used here, though: e.g. if there's a way to reduce processing time by 80% and only degrade quality by 15%, then that would be worth doing (for example, only match potential source words against the 2,000 words that are most likely to arise in actual usage rather than the full dictionary). Or maybe there are randomized techniques that could speed things up. The RedLaser guys have a background in Artificial Intelligence and Computer Vision, and I'm sure that helped.
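As a toy example of that small-dictionary trick, the matching step could check OCR candidates against only a preloaded list of the ~2,000 most common words for the domain and defer everything else to a slower full pass. loadTopWords(limit:) here is purely a hypothetical placeholder, not anything from the actual app:

[CODE]
import Foundation

// Placeholder loader: a real app might ship a bundled text file of the most
// common words for the target domain (menus, signs, headlines, ...).
func loadTopWords(limit: Int) -> Set<String> {
    return []
}

// Toy illustration of the small-dictionary idea: score OCR candidates against
// only the ~2,000 highest-frequency words instead of a full dictionary.
let topWords: Set<String> = loadTopWords(limit: 2_000)

func confirmedWords(in ocrCandidates: [String]) -> [String] {
    // Keep only candidates the small dictionary can confirm; anything else could
    // be dropped or deferred to a slower full-dictionary pass.
    return ocrCandidates.filter { topWords.contains($0.lowercased()) }
}
[/CODE]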
 