I understand your point. If BF3 has an option to show FPS, I can report my values during the week when I'm playing again (if I don't forget).
However, I must again point out that FPS is far from the holy-grail benchmark everyone is so obsessed with. It's a subjective measure insofar as the relation of FPS to gameplay quality largely depends on the quality of the game's code. Strictly speaking, human vision cannot discriminate much more than 20 discrete frames per second. The reason you usually want to keep FPS high is that the key property is the quality (smoothness) of the animation. A movie usually runs at 25 FPS, but the individual frames blend together and thus contain a degree of motion blur, which our brain interprets as smooth animation.

Now, imagine a game at about 25 FPS (with occasional drops) where the animation calculation and the rendering are not in sync. This results in visibly "jumpy" animation and other mismatches, which our brain painfully registers. Then take another game at the same 25 FPS, but with synchronized animation/rendering (the important thing here is to animate based on the real-time clock!) and motion blur; that game will look much more believable. The point is that there are games which still look jumpy at 60 FPS (because of horribly coded animations) and others which look very smooth at ~30 FPS. This is why I stopped looking at the FPS counter a long time ago and instead just check whether the game has this smooth "movie" feeling for me. If it does, I am happy; if not, I reduce quality settings until it does.

Something similar, by the way, is also true for multiplayer games: what you actually want is not high FPS but low visual latency. Now, the traditional way to reduce this latency is indeed to increase the FPS and let the brain average away the noise introduced by the (sucky) game engine; still, a well-synchronized game should keep visual latency at a minimum once it stays above some minimal FPS boundary.
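To make the "animate based on the real-time clock" point concrete, here is a minimal sketch of the two approaches. This is my own illustration, not any real engine's API; the `ClockAnimator`/`FrameAnimator` names and the numbers are hypothetical:

```python
class ClockAnimator:
    """Advance the animation by real elapsed time, not by frame count."""
    def __init__(self, speed):
        self.speed = speed       # units per second
        self.position = 0.0
        self.last = None

    def tick(self, now):
        # Delta-time update: a long frame simply moves the object further,
        # so the on-screen speed stays constant even when frame times vary.
        if self.last is not None:
            self.position += self.speed * (now - self.last)
        self.last = now

class FrameAnimator:
    """Advance the animation by a fixed step per frame (the 'jumpy' way)."""
    def __init__(self, step):
        self.step = step         # units per frame
        self.position = 0.0

    def tick(self, now):
        self.position += self.step   # ignores how long the frame really took

# Simulate 4 frames where one frame hitches to 67 ms instead of 33 ms.
times = [0.000, 0.033, 0.100, 0.133]
clocked = ClockAnimator(speed=100.0)
framed = FrameAnimator(step=3.3)
for t in times:
    clocked.tick(t)
    framed.tick(t)

# Clock-based: 100 * 0.133 s = 13.3 units, exactly what 133 ms should cover.
# Frame-based: during the 67 ms hitch the object still moves only one step,
# so it crawls for that frame and then appears to jump: visible stutter.
```

Both animators end up near the same place here, but the frame-based one moves at half speed during the hitch frame, which is exactly the "jumpy" mismatch described above; the clock-based one covers the correct distance per real second regardless of frame timing.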