I don't think the protocol in use (GSM, CDMA2000, etc.) makes a great difference. Bear in mind that early cell phones used to chatter constantly, and always went to 100% power if they couldn't reach a tower--and then continued to run at 100% power seeking one. Someone (Nokia?) made big news when they attacked that in software, programming their phones to seek a tower, and if they couldn't find one in xx minutes, go silent for xx more minutes, then retry, then go silent, then retry at increasingly lower power levels.
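Something like that backoff logic might look like the sketch below. To be clear, every number and name here is a made-up placeholder for illustration, not Nokia's (or anyone's) actual algorithm:

```python
# Hypothetical sketch of a tower-seeking backoff policy -- NOT any vendor's
# real implementation. The idea: wait longer between each search attempt,
# and transmit at progressively lower power, instead of hammering away
# at 100% power forever.

def backoff_schedule(max_attempts=5, start_power=100, power_step=20,
                     base_sleep_min=5):
    """Return a list of (sleep_minutes, tx_power_percent) retry attempts.

    All parameters are illustrative placeholders.
    """
    schedule = []
    for attempt in range(max_attempts):
        sleep = base_sleep_min * (2 ** attempt)          # wait longer each time
        power = max(start_power - power_step * attempt, 10)  # ramp power down
        schedule.append((sleep, power))
    return schedule

# First attempt: short wait, full power. Later attempts: long waits, low power.
print(backoff_schedule())
```

The point is that this is pure software policy: the same radio hardware behaves very differently depending on a schedule like this one.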
That kind of programming can be built into a phone by the phone maker, by default. It can be programmed into the OS, although I don't know if it is. It can also be changed, like everything else, at carrier request.
So you've got three entities that all have a major say in how much power your phone is going to use, without even considering the protocol it connects with.
Then there are phones (like those on Google Fi and T-Mobile's newer plans) programmed to always seek WiFi before turning on the cellular radio and connecting by cell. And even more are programmed to use 3G instead of LTE (4G doesn't really exist in the US; it just means "3G with a high-speed backbone," which is still 3G at the phone) depending on how busy the service is, which signal is coming in stronger, and what priority you have.
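A "WiFi first" selection policy of that sort could be sketched roughly like this. Again, this is a hypothetical illustration of the decision logic, not any carrier's or OS's actual code:

```python
# Hypothetical sketch of a "WiFi first" radio-selection policy.
# Among cellular candidates, prefer the least-loaded cell, breaking
# ties by stronger signal -- one plausible way to weigh the factors
# (load, signal strength) mentioned above.

def pick_radio(wifi_available, cell_candidates):
    """Pick a radio to use.

    cell_candidates: list of (name, signal_dbm, load_percent) tuples.
    Returns the chosen radio's name, or "wifi".
    """
    if wifi_available:
        return "wifi"          # never light up the cellular radio if WiFi works
    best = min(cell_candidates, key=lambda c: (c[2], -c[1]))
    return best[0]

# Example: WiFi down, a busy LTE cell vs. a quiet 3G cell.
print(pick_radio(False, [("lte", -90, 80), ("3g", -85, 30)]))
```

Subscriber priority could be folded in as another term in the sort key; the point is just that the choice is an ordinary software decision, invisible to the user.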
There's a lot that can be done invisibly in software, and you and I as end users will pretty much never find out about it until long after, when and if someone happens to blab about it, or some historical data is released. The carriers all consider this to be a matter of trade secrets, which isn't entirely wrong.
("Where's my spare tire?" "Sorry, that's a trade secret; you'll have to bring the car in for service." Sigh.)