Remember that devices will draw whatever they need. So if the end-device needs 100 mA and the supply can provide up to 400 mA, the device will still only draw 100 mA from that supply.
The difference is that the current into a boost or buck converter is not the same as the current out. Going from 12 volts to 9 volts is a buck, which means less current goes in than comes out. How do you tell? Not with Ohm's law, but with the power equation: power in must equal power out (plus a little extra for losses in the regulator).
That means if you want 400 mA @ 9 V out, that is 3.6 watts (P = I * E). To size your new buck supply with a 12 volt input, algebra tells you that 3.6 W = 12 V * I, so I = 300 mA (ideally; a real converter will draw a bit more to cover its losses).
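If it helps, here is that same back-of-the-envelope calculation as a quick Python sketch. The 85% efficiency figure is just an assumption to show where the "little extra for loss" comes in; check your regulator's datasheet for the real number.

```python
# Minimal sketch: estimating buck-converter input current from the
# output requirements, assuming power in = power out divided by efficiency.

V_IN = 12.0         # supply voltage (V)
V_OUT = 9.0         # regulated output voltage (V)
I_OUT = 0.400       # load current (A)
EFFICIENCY = 0.85   # assumed converter efficiency (1.0 would be ideal)

p_out = V_OUT * I_OUT        # output power: P = I * E -> 3.6 W
p_in = p_out / EFFICIENCY    # input power must also cover converter losses
i_in = p_in / V_IN           # current drawn from the 12 V rail

print(f"Output power: {p_out:.2f} W")
print(f"Input power:  {p_in:.2f} W")
print(f"Input current from 12 V: {i_in * 1000:.0f} mA")
```

With EFFICIENCY set to 1.0 this reproduces the ideal 300 mA figure above; at 85% it comes out around 353 mA.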
Generally, that is the case.
The important consideration is that this is an automotive application. The nominal DC voltage is 12 volts, but a fully charged battery can sit above 14 volts, and things like the ignition coils produce spikes in the 20 volt range. Additionally, electronics in production vehicles must pass a load-dump test, which simulates the battery being disconnected while the alternator is still charging; that produces surges in the 48 volt range. This is why automotive-grade passive components tend to be rated around 50 volts, even in 5 volt circuits.
A well-designed power supply will protect its electronics, so make sure your supply can tolerate at least 24 volts on the input (a jump start from a 24 volt truck system, for example).