Hello,
I am assuming you are using incandescent lamps, but this effect can also be seen with modern LED lamps (those with a simple linear driver).
The current draw of any domestic lighting installation should not be able to reduce the line voltage noticeably.
However, I found this topic interesting, so I did the math:
With the diagram on James Hooker's website I get the following function for lamp life:
[image IMG_8429.jpg: lamp life as a function of lamp voltage, based on the curve on Hooker's site]
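Since I can't reproduce the curve itself here, this is the relation such curves usually encode (the exact exponent is my assumption; sources typically quote something in the range of roughly 12 to 14 for GLS lamps):

```latex
\frac{L}{L_{\text{rated}}} = \left(\frac{V_{\text{rated}}}{V}\right)^{n}, \qquad n \approx 13
```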
If I understand it correctly, the R1+R2 for a lighting circuit is 2 Ω max.
Let's say you use two 100 W 240 V lamps. Each has a hot resistance of 240²/100 = 576 Ω, so the pair in parallel gives you 288 Ω. Your grid voltage is 236 V. The R1+R2 for one fitting is 0.5 Ω, the R1+R2 for the second fitting is 1.75 Ω.
That results in a lamp voltage of 235.998 V for the first lamp --> ΔU = -0.0168
For the second lamp the lamp voltage would be 235.994 V --> ΔU = -0.01669
With that you get a lamp life of 126.015% of rated, i.e. 1260 hours and 9 minutes (assuming a 1000-hour rated life) for the first lamp, and 126.03% = 1260 hours and 18 minutes for the second.
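As a sanity check, here is a small Python sketch of the same kind of calculation. The setup is my assumption (each fitting modeled as a single 100 W lamp fed through its own R1+R2 as a simple voltage divider), and the life exponent n ≈ 13 is the usual GLS value rather than the one from Hooker's curve, so the figures come out slightly different from mine above:

```python
# Hypothetical worked example: voltage drop in a lighting circuit
# and its effect on incandescent lamp life.
# Assumptions: 240 V rated lamps, 1000 h rated life, life exponent n ~ 13.

V_GRID = 236.0       # supply voltage at the consumer unit (V)
V_RATED = 240.0      # lamp rated voltage (V)
LIFE_RATED_H = 1000.0
N = 13.0             # life-vs-voltage exponent (typical GLS value, assumed)

def lamp_resistance(power_w: float, v_rated: float) -> float:
    """Hot resistance of an incandescent lamp at rated voltage."""
    return v_rated ** 2 / power_w      # 240 V, 100 W -> 576 ohm

def lamp_voltage(r_wiring: float, r_lamp: float, v_grid: float = V_GRID) -> float:
    """Voltage across the lamp after the drop in the circuit wiring."""
    return v_grid * r_lamp / (r_lamp + r_wiring)

def relative_life(v_lamp: float) -> float:
    """Life relative to rated, using life ~ (V_rated / V) ** n."""
    return (V_RATED / v_lamp) ** N

r_lamp = lamp_resistance(100.0, V_RATED)   # 576 ohm
for r12 in (0.5, 1.75):                    # R1+R2 per fitting
    v = lamp_voltage(r12, r_lamp)
    life_h = relative_life(v) * LIFE_RATED_H
    print(f"R1+R2 = {r12:5.2f} ohm: lamp voltage {v:7.3f} V, "
          f"life approx {life_h:7.1f} h")
```

Either way, the wiring resistance moves the lamp voltage by only a fraction of a volt, and the life estimate by a few percent at most.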
So you can see that the effect is barely noticeable.
However, what was said above is also correct: a dimmer switch will also result in a lamp voltage drop, and soft starting prolongs lamp life. Finally, the time of day the lamp is used matters. If the dining room lamp is primarily used during cooking and dining hours, it is quite possible that the line voltage is somewhat lower (due to more load on the grid). When the living room light is used until late at night, those lamps will see a higher line voltage because of the lower electricity demand on the grid (that's the reason for economy tariffs, and for the lighting of highways at night).
Finally, if your lighting circuits show a significant voltage drop (several volts) when the lights are switched on, it is time for work on the electrical installation. P = I²R, so even small points of elevated resistance in the installation will heat parts that are not supposed to be heated. That raises R even further, and at some point the installation will fail, possibly catastrophically, leading to a house fire.
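To put a number on that last point, here is a minimal sketch of P = I²R for a bad connection; the current and joint resistance are made-up illustrative values, not measurements:

```python
# Power dissipated in a high-resistance joint: P = I^2 * R.
# The numbers below are hypothetical, for illustration only.

def joint_heat_w(current_a: float, r_joint_ohm: float) -> float:
    """Heat (in watts) dissipated in a joint of the given resistance."""
    return current_a ** 2 * r_joint_ohm

# A lighting circuit drawing 4 A through a loose 0.5 ohm terminal:
print(joint_heat_w(4.0, 0.5))   # 8.0 -> 8 W concentrated in one small terminal
```

Eight watts sounds small, but concentrated in a single screw terminal inside a plastic enclosure it is more than enough to char insulation over time.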