Okay, let's try for a more definitive answer:
Firstly, the watt rating you refer to is a rated value. In other words, it is a projection of performance under ideal, constant conditions: the manufacturer is telling you what the bulb will do at one fixed operating point. That point should be 12V, but is more likely the charging voltage of 13.6V or thereabouts. More on that later.
To perform your test, the first thing to understand is that the bulb is not a constant-wattage device, but essentially a resistance. So as the voltage decreases, the current decreases with it: applying a declining voltage to a roughly fixed resistance gives a declining current (Ohm's law, I = V/R).
So, step one is to determine the resistance of the bulb. For an incandescent, it's simply a matter of substituting into the power formula and solving for R (P is power in watts): P = VI and R = V/I, therefore R = V^2/P. For a 12V system and a 55W load (at 12V), the result is 144/55, or about 2.62 ohms. Now put your meter in resistance (ohms) mode and measure the bulb to see how it compares. Bear in mind that filament resistance changes with temperature: the cold reading on your meter will come out lower than the hot operating value, because resistance rises as the bulb heats up, which in turn lowers the current draw.
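If it helps to see that substitution spelled out, here is a minimal Python sketch of the calculation (the 12V/55W figures are just the example above, and remember this gives the hot operating resistance, not what a cold ohmmeter reading will show):

```python
# Minimal sketch: rated hot resistance of a bulb from its nameplate figures.
# The 12 V / 55 W values below are just the example used in this answer.

RATED_VOLTAGE = 12.0  # volts at the rating point
RATED_POWER = 55.0    # watts at that voltage

# From P = V*I and R = V/I: R = V^2 / P
hot_resistance = RATED_VOLTAGE ** 2 / RATED_POWER
print(f"Hot resistance at {RATED_VOLTAGE:.0f} V: {hot_resistance:.2f} ohms")
# Prints roughly 2.62 ohms for this example.
```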
Now, as you do your test, keep performing the calculation. If your bulb works out to 2.6 ohms, then:
At 12V, you should draw 12/2.6 ≈ 4.62A
At 10V, you should draw 10/2.6 ≈ 3.85A
If your bulb were a 3 ohm load, it would result in 4A and 3.33A in the two examples above. Hopefully you have the picture now and can validate your current draw (there's a short sketch of that check below)! Nothing is static, and you have to know the conditions at the precise moment you take your reading.
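If you end up logging a series of readings, here is a small Python sketch of that comparison (the 2.6 ohm figure and the sample readings are made-up illustrations, not real measurements):

```python
# Small sketch: compare measured current against the value Ohm's law predicts.
# The resistance and the (volts, amps) readings below are made-up examples.

def expected_current(volts: float, resistance_ohms: float) -> float:
    """Ohm's law: I = V / R."""
    return volts / resistance_ohms

bulb_resistance = 2.6  # ohms, the estimated hot resistance from above

# Hypothetical readings taken during the test: (measured volts, measured amps)
readings = [(12.0, 4.5), (11.0, 4.2), (10.0, 3.8)]

for volts, measured_amps in readings:
    predicted = expected_current(volts, bulb_resistance)
    print(f"{volts:4.1f} V: expected {predicted:.2f} A, measured {measured_amps:.2f} A")
```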
Finally, no two bulbs are identical, nor will they behave exactly alike. It is the resistance, at whatever temperature the filament happens to be at that instant, that determines the output. At any given point in your test, you can record your voltage and current and multiply them (P = VI) to get the power actually being consumed by your bulb, which will NOT be 55W. At 10V and 3A, that would be 30W of actual load, for example. Hope this helps!