As Ray says, you may get away with the 12V socket near the TV, but if you try to use an appliance that needs more power, such as a large laptop (60 to 90W), then the limitations of the caravan wiring may become more apparent.
Briefly, to keep voltage drops and cable losses to a minimum, keep 12V wiring as short as possible, keep the inverter as close to the battery as possible, and if extensions are necessary, it is better to use them on the inverter's 230V output rather than the inverter's 12V input.
If you want a more detailed explanation, read on:-
The science is that all conductors offer some resistance to the movement of a current through them. Double the length of a conductor and the resistance doubles, so to minimise cable losses the cables should be as short as possible. Also, as the current flow is restricted, it develops a potential difference (a voltage drop) between the ends of the conductor, so you never get the same voltage out that you put in, except when no power is being used.
The maths is:-
Cable losses (cable heating) in W = the cable resistance x the square of the current.
So short wires are one strategy, but the other and more effective strategy is to reduce the current passing through the wire.
Current = watts divided by volts. Thus my 46W TV only draws 0.2A at 230V, but needs about 4A at 12V, plus roughly 1A to run the inverter, so that is 5A in total.
If I were to use the same wire and distance, and for the sake of argument it has a resistance of 0.1 Ohm, then the losses at 12V will be 5 x 5 x 0.1 = 2.5W, and at 230V they will be 0.2 x 0.2 x 0.1 = 0.004W, or 625 times less!
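If you want to try your own figures, here is a rough Python sketch of the same sum. The 0.1 Ohm cable resistance and the rounded currents (5A and 0.2A) are just the illustrative values used above, not measurements:

```python
# Rough sketch of the cable-loss comparison above, using the rounded
# currents from the text: 5A on the 12V side (TV plus inverter overhead)
# and 0.2A on the 230V side, through the same 0.1 Ohm of cable.

CABLE_OHMS = 0.1  # assumed cable resistance, as in the example

def cable_loss(current_amps, cable_ohms=CABLE_OHMS):
    """Cable heating in watts: P = I^2 x R."""
    return current_amps ** 2 * cable_ohms

loss_12v = cable_loss(5.0)    # ~5A drawn on the inverter's 12V input
loss_230v = cable_loss(0.2)   # ~0.2A drawn on the inverter's 230V output

print(f"12V side loss:  {loss_12v:.2f} W")    # 2.50 W
print(f"230V side loss: {loss_230v:.4f} W")   # 0.0040 W
print(f"12V losses are {loss_12v / loss_230v:.0f} times higher")  # 625
```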
The other important factor here is the inverter's minimum voltage requirement. The voltage drop along the cable is given by:-
Volts = Current x Resistance
So for the 12V system it will be 5A x 0.1 Ohm = 0.5V. By comparison, for the 230V system it will be 0.2A x 0.1 Ohm = 0.02V.
Although the 12V system's voltage drop is only 0.5V, if your inverter needs 11.5V then the battery needs to be at a minimum of 12V to keep the inverter and TV running.
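As a rough check of how far the battery can fall before the inverter gives up, here is a short sketch. The 11.5V cutoff, 5A draw and 0.1 Ohm cable resistance are just the illustrative figures above; check your own inverter's spec:

```python
# Rough check of the minimum battery voltage needed, using the
# illustrative figures above: 5A draw, 0.1 Ohm of cable, and an
# assumed inverter low-voltage cutoff of 11.5V.

CABLE_OHMS = 0.1
LOAD_AMPS = 5.0
INVERTER_CUTOFF_V = 11.5  # assumed cutoff; check your inverter's spec

def volts_at_inverter(battery_volts):
    """Voltage reaching the inverter after the cable drop (V = I x R)."""
    return battery_volts - LOAD_AMPS * CABLE_OHMS

for battery_v in (12.8, 12.4, 12.0, 11.8):
    v = volts_at_inverter(battery_v)
    state = "runs" if v >= INVERTER_CUTOFF_V else "shuts down"
    print(f"Battery {battery_v:.1f}V -> {v:.1f}V at inverter: {state}")
```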
One of the symptoms of a 12V wiring voltage drop problem is the inverter cycling between shutting down and then restarting. Simply put, when the inverter shuts down it draws a much-reduced current, so the voltage drop along the cable also falls, allowing the voltage at the inverter to rise. It may rise enough to tell the inverter to switch back on, but as soon as it draws the current to feed the load, the voltage drop along the cable sets in again and the inverter shuts down once more.
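To see why the cycling happens, here is a toy sketch of that loop. The idle current, restart threshold and other numbers are made up purely for illustration:

```python
# Toy illustration of the shutdown/restart cycle caused by cable drop.
# All figures are made up: 5A running current, 0.2A idle current,
# 0.1 Ohm of cable, 11.5V cutoff and 11.8V restart level.

CABLE_OHMS = 0.1
RUN_AMPS, IDLE_AMPS = 5.0, 0.2
CUTOFF_V, RESTART_V = 11.5, 11.8   # assumed inverter thresholds
BATTERY_V = 11.9                    # a partly discharged battery

running = True
for step in range(6):
    draw = RUN_AMPS if running else IDLE_AMPS
    v_at_inverter = BATTERY_V - draw * CABLE_OHMS
    if running and v_at_inverter < CUTOFF_V:
        running = False   # drop under load pulls it below the cutoff
    elif not running and v_at_inverter >= RESTART_V:
        running = True    # drop almost vanishes at idle, so it restarts
    print(f"step {step}: {v_at_inverter:.2f}V at inverter, "
          f"{'running' if running else 'shut down'}")
```

Run it and you can watch the voltage at the inverter bounce between about 11.4V under load and 11.9V at idle, turning the inverter off and on again each time.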
Connecting the inverter as close to the battery as possible means the cable losses are minimised, and the inverter will keep working for longer because the battery voltage can fall further before the inverter reaches its low-voltage cutoff.
The same reasoning