It's well known that speakers can be wired up in series or parallel to achieve a desired impedance to match your amplifier. Surely there must be some measurable difference between the outputs of each combination?
In series, current is the same at every point in the circuit, so with drivers in series the current through each speaker is 100% of the output current. However, the voltage across each driver is proportional to the ratio of that driver's impedance to the sum of all the drivers': Vdx = Vt * Zdx / Zt.
In parallel, the total current drawn is split between the branches, the current in each branch being Idx = It * Zt / Zdx (where Zt is the combined parallel impedance, so that It * Zt = Vt). However, the voltage across each driver is 100% of the output voltage.
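To make the two dividers concrete, here's a minimal sketch treating the drivers as purely resistive loads. The impedance values (4 Ω and 8 Ω) and the drive voltage are hypothetical, chosen only to illustrate the formulas above:

```python
# Two drivers modeled as purely resistive impedances (hypothetical values).
z1, z2 = 4.0, 8.0   # driver impedances in ohms
vt = 20.0           # total amplifier output voltage (arbitrary example)

# Series: one shared current; voltage divides in proportion to impedance.
zt_series = z1 + z2
i_series = vt / zt_series       # current through both drivers
v1 = vt * z1 / zt_series        # voltage across driver 1
v2 = vt * z2 / zt_series        # voltage across driver 2

# Parallel: one shared voltage; current divides in inverse proportion.
zt_parallel = 1 / (1 / z1 + 1 / z2)
it = vt / zt_parallel           # total current drawn
i1 = vt / z1                    # equivalently it * zt_parallel / z1
i2 = vt / z2                    # equivalently it * zt_parallel / z2

print(f"series:   I = {i_series:.3f} A, V1 = {v1:.2f} V, V2 = {v2:.2f} V")
print(f"parallel: It = {it:.2f} A, I1 = {i1:.2f} A, I2 = {i2:.2f} A")
```

The branch voltages sum to the source voltage in series, and the branch currents sum to the total current in parallel, as a quick sanity check on the divider formulas.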
What implications do these results have? For series: voltage is split between drivers, current through each driver is the total output current. For parallel: current is split between drivers, voltage across each driver is the total output voltage.
Obviously P = VI, so if the amplifier delivers the same total power into either configuration, the totals match — but what effect does the different split of current and voltage have on the output?
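One concrete consequence worth noting: with mismatched drivers, the *distribution* of that power differs between the two wirings. A sketch, again with hypothetical 4 Ω and 8 Ω resistive loads and assuming the same total power is delivered in both cases:

```python
# How a fixed total output power divides between two mismatched drivers
# (hypothetical 4-ohm and 8-ohm resistive loads), assuming the amplifier
# delivers the same total power in either configuration.
z1, z2 = 4.0, 8.0
p_total = 100.0   # watts, arbitrary

# Series: one shared current I, so P = I^2 * Z; power splits like impedance.
p1_series = p_total * z1 / (z1 + z2)
p2_series = p_total * z2 / (z1 + z2)

# Parallel: one shared voltage V, so P = V^2 / Z; power splits like 1/Z.
p1_parallel = p_total * (1 / z1) / (1 / z1 + 1 / z2)
p2_parallel = p_total * (1 / z2) / (1 / z1 + 1 / z2)

print(f"series:   P1 = {p1_series:.1f} W, P2 = {p2_series:.1f} W")
print(f"parallel: P1 = {p1_parallel:.1f} W, P2 = {p2_parallel:.1f} W")
```

In series the higher-impedance driver dissipates the larger share; in parallel it gets the smaller share. With identical drivers the split is equal either way, which is why the difference only shows up when impedances are mismatched.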