My Note 3 charges at a maximum current of 1,200 mA, regardless of how much current the supply can deliver over the USB 3.0 cable. (I have a 25 Amp, 5 Volt supply available for testing.) Plugging the same cable into my laptop's USB 2.0 port gives me a 450 mA charge rate. Qi (wireless) charging gives me 600 mA.
But the phone will never draw more than it's designed to, no matter how much current the supply can deliver. (Your house supplies about 100 Amps to the circuit feeding a 1 Amp (100 Watt) light bulb; the bulb still only draws about 1 Amp, or about half an Amp if your household voltage is 220 Volts.) What normally sets the charge rate is how much current is available - up to the maximum the phone will draw.
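To put rough numbers on that, here's a quick sketch (the function names and figures are just mine for illustration, not from any spec sheet): a simple load draws roughly I = P / V regardless of how beefy the supply is, and the charge current ends up being whichever is smaller - the phone's own limit or what the port can deliver.

```python
def bulb_current_a(power_w, mains_v):
    # A load draws I = P / V; the supply's capacity doesn't enter into it.
    return power_w / mains_v

def charge_current_ma(phone_limit_ma, supply_limit_ma):
    # The phone draws up to its own design limit; the supply only matters
    # when it can't deliver that much.
    return min(phone_limit_ma, supply_limit_ma)

print(bulb_current_a(100, 120))        # ~0.83 A on 120 V mains
print(bulb_current_a(100, 220))        # ~0.45 A on 220 V mains
print(charge_current_ma(1200, 25000))  # 25 A bench supply: phone still draws 1200 mA
print(charge_current_ma(1200, 450))    # USB 2.0 laptop port: capped at ~450 mA
```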
As far as how long it takes to charge the battery, there are a number of variables, including its past charging history, how deeply it was discharged, and how much current you can supply to the cable. Charging from 40% takes a lot less time than charging from 0% (or, effectively, 0% - if the battery ever drops to no charge at all it becomes toxic waste, since charging a totally discharged Lithium battery results in a small bomb; there's a protection circuit that disconnects the cell inside from the terminals on the outside if the voltage ever drops too low). And a battery that has consistently been drained to close to 0 takes longer to charge than one that's normally kept at least half charged and is, just this once, being charged from a deep discharge. (You also risk creating dendrites - internal shorts - a lot more if you regularly deep-discharge the battery. People are working on the problem, but no one's come up with a solution yet, because it's difficult to change how chemistry works.)
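If you want a ballpark charge time, a constant-current estimate is just the missing capacity divided by the charge current, with a fudge factor for losses. This is only a sketch: the ~3,200 mAh capacity, the 85% efficiency, and the constant-current assumption (real charging tapers off as the battery nears full) are my own numbers, not anything I've measured.

```python
def hours_to_full(capacity_mah, start_pct, charge_ma, efficiency=0.85):
    # Missing charge divided by the effective charging current.
    missing_mah = capacity_mah * (1 - start_pct / 100.0)
    return missing_mah / (charge_ma * efficiency)

for rate in (1200, 600, 450):  # wall supply, Qi, and USB 2.0 rates from above
    print(f"{rate} mA: {hours_to_full(3200, 40, rate):.1f} h from 40%, "
          f"{hours_to_full(3200, 0, rate):.1f} h from empty")
```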