
gowdalokeshs

macrumors member
Original poster
Aug 31, 2010
32
0
bangalore
Hi,

I am developing an application where I need to pair with Apple-authenticated Bluetooth accessories. After that, I need to measure the distance between the paired devices to check whether it is beyond Bluetooth range. Is there any way to accomplish this?

Please share your thoughts.....
 
If a device can pair with the other device, it's in range. If not, it's not.

"Range" is an amorphous term in the wireless world, as it depends on more factors than people typically account for. For example, the range of a class 1 BT device is "officially" 100m or so, but indoors with foil-backed drywall you might pair devices at well over double that distance down a long hallway, or fail to pair with a device in the next room.
 
Thanks for your reply...:)

But here I have to deal with the distance between the paired devices... I have to be able to find out the distance between them.
For example, if the distance between them is above 2m, I need to display a certain message...
 
I'm no expert, I haven't got any technical background whatsoever, but this is how I see it working. When Bluetooth devices are in use, the antennas in these devices are transmitters and receivers at the same time, right? So, as with every other transceiver, there should be some way to indicate the strength of the signal, like bars or something, and some way to measure it.

What I think you have to do is measure the strength of the signal at different distances. At 1 meter it reads x dBm or mW (I googled these units; I don't know if they are actually used to measure signal strength), at 2m it's a bit less, and with every meter further from the device the signal gets weaker until it loses the connection, let's say at 100 meters. Assuming you can measure the strength of the signal with an accuracy of 1 meter, you should end up with 100 different readings, each representing a different distance. Then you'd have to create some kind of script that reads that data and translates it into numbers, so for example x dBm = 1 meter, y dBm = 2m, z dBm = 25m and so on. That's how a phone's signal indicator works, isn't it?
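The lookup-table idea described above amounts to what RF engineering calls a log-distance path loss model. A minimal sketch in Python, where the RSSI at 1 m and the path loss exponent are hypothetical calibration constants that would have to be measured for the actual hardware and environment:

```python
def estimate_distance(rssi_dbm, rssi_at_1m=-60.0, path_loss_exponent=2.0):
    """Estimate distance (meters) from an RSSI reading using the
    log-distance path loss model:
        rssi = rssi_at_1m - 10 * n * log10(d)
    solved for d:
        d = 10 ** ((rssi_at_1m - rssi) / (10 * n))
    Both rssi_at_1m and path_loss_exponent are assumed calibration
    values, not real measurements for any particular device."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exponent))

# At the calibration point the model returns exactly 1 meter:
print(round(estimate_distance(-60.0), 2))   # 1.0
# A 12 dB weaker signal maps to a larger estimated distance:
print(round(estimate_distance(-72.0), 2))   # 3.98 in free space (n = 2)
```

This replaces the 100-entry lookup table with a two-parameter formula; the table approach is equivalent to sampling this curve at 1 m intervals.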
 
For example, if the distance between them is above 2m, I need to display a certain message...

Then you're going to have to find some other way. Perhaps GPS, although with a desired accuracy of 2m you're approaching the limits of the GPS system's resolution. You cannot depend on signal strength to range with that sort of accuracy, given the nature of radio frequency and the surrounding environment(s).

What I think you have to do is measure the strength of the signal at different distances. At 1 meter it reads x dBm or mW (I googled these units; I don't know if they are actually used to measure signal strength), at 2m it's a bit less, and with every meter further from the device the signal gets weaker until it loses the connection, let's say at 100 meters. Assuming you can measure the strength of the signal with an accuracy of 1 meter, you should end up with 100 different readings, each representing a different distance. Then you'd have to create some kind of script that reads that data and translates it into numbers, so for example x dBm = 1 meter, y dBm = 2m, z dBm = 25m and so on. That's how a phone's signal indicator works, isn't it?

This would work great if the strength of an RF signal depended only on the linear distance from the antenna of device 1 to the antenna of device 2. Unfortunately this is not what happens in the real world due to reflection, attenuation, etc.
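How badly reflection and attenuation break the model can be shown with the same log-distance formula: the path loss exponent n is roughly 2 in free space and higher indoors, and the same reading maps to very different distances depending on which (hypothetical) value you assume:

```python
def estimate_distance(rssi_dbm, rssi_at_1m, n):
    # Log-distance path loss model, solved for distance (meters).
    # rssi_at_1m and n are assumed calibration values.
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * n))

# The same -75 dBm reading, interpreted under different hypothetical
# path loss exponents, gives wildly different distances:
for n, label in [(2.0, "free space"),
                 (3.0, "indoor, light walls"),
                 (4.0, "indoor, heavy obstruction")]:
    print(f"{label}: {estimate_distance(-75.0, -60.0, n):.1f} m")
# prints roughly 5.6 m, 3.2 m and 2.4 m respectively
```

With a decision threshold of 2 m, the same RSSI sample can land on either side of the threshold depending on the environment, which is the rebuttal above in numbers.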
 
Damn I can't sleep...

What if he took these readings in three different areas, let's say one in an open area without any obstacles, a second in an urban area, and the last one indoors? Later, the user would have to choose in the application what kind of area he's in before calculating the distance between the BT devices. It's not perfect, but it would narrow down the margin of error, wouldn't it?

Another thing I thought about: when two Bluetooth devices are paired, do they constantly exchange some data? If they do, and I suppose they do, the application could include some kind of 'double check' of the signal strength. What I mean is that the two devices would be communicating and comparing their readings, just as if they were pinging each other, comparing the results and displaying the average.

I hope it makes some sense; if it doesn't, I should probably go to sleep now... 3AM here
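The 'double check' idea, averaging the readings the two ends report and smoothing them over time, could be sketched like this in Python (the class, the window size, and the sample values are all hypothetical):

```python
from collections import deque

class RssiSmoother:
    """Moving-average filter over recent RSSI samples reported by
    both ends of the link. The window size is an arbitrary choice;
    real links fluctuate on much shorter timescales than this implies."""
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)

    def add(self, local_rssi_dbm, remote_rssi_dbm):
        # Average the reading each side reports for the same instant,
        # then keep it in a fixed-size history window.
        self.samples.append((local_rssi_dbm + remote_rssi_dbm) / 2)

    def value(self):
        # Mean of the windowed per-instant averages.
        return sum(self.samples) / len(self.samples)

s = RssiSmoother(window=3)
s.add(-70, -74)   # per-instant average: -72
s.add(-68, -70)   # per-instant average: -69
s.add(-80, -78)   # per-instant average: -79 (a momentary fade)
print(s.value())  # about -73.3 dBm; the fade is damped, not eliminated
```

Smoothing reduces jitter from momentary fades, but as the replies below point out, it cannot remove the systematic error the environment introduces.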
 
What if he took these readings in three different areas, let's say one in an open area without any obstacles, a second in an urban area, and the last one indoors? Later, the user would have to choose in the application what kind of area he's in before calculating the distance between the BT devices. It's not perfect, but it would narrow down the margin of error, wouldn't it?

Not really. Those are all very vague conditions. Are you inside a building with plain drywall, or in a large masonry building? Is it humid outside or dry? What is the path from antenna to antenna like in the "urban area"? etc.

Another thing I thought about: when two Bluetooth devices are paired, do they constantly exchange some data? If they do, and I suppose they do, the application could include some kind of 'double check' of the signal strength.

I don't believe BT distinguishes between signal strength levels aside from "strong enough" and "not strong enough." I could be wrong.
 
Hi all,

Is there any API to read the received signal strength indication (RSSI) of Bluetooth-enabled devices?
 