The transformer, as I understand things, increases the current while decreasing the voltage (that's how you can buy an arc welder that puts out 250 amps and runs from domestic mains). In an ideal transformer these two factors are exactly inverse: power is conserved, so if you halve the voltage you double the current. In the real world there are always losses involved, which manifest themselves mainly as heat.
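The power-conservation idea above can be sketched in a few lines of Python. This is just an illustration, not anyone's actual welder spec: the voltages, currents, and efficiency figure are made-up example values, and the efficiency factor is a crude stand-in for real losses.

```python
def secondary_current(v_primary, i_primary, v_secondary, efficiency=1.0):
    """Current on the secondary side of a transformer.

    Ideal transformer: power in = power out, so V1 * I1 = V2 * I2.
    Real transformers lose some power (mostly as heat); here that's
    modelled by a simple efficiency factor, which is an assumption
    for illustration, not measured data.
    """
    p_in = v_primary * i_primary
    return p_in * efficiency / v_secondary

# Hypothetical example: 230 V domestic mains drawing 30 A,
# stepped down to 25 V for welding.
i_ideal = secondary_current(230, 30, 25)        # ideal case: 276 A
i_real = secondary_current(230, 30, 25, 0.9)    # with 10% losses
```

With 10% losses the output drops to roughly 248 A, which is in the right ballpark for the 250-amp welder mentioned above.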
Full disclosure - I used to be a partner in a company that manufactured transformers, but I'm still not an electrician.