The test comparing it to the damn Nexus 10 was unfair.
Everybody knows that Chrome is buggy and slower. I just put up with it because my bookmarks and tabs sync ever so nicely =)
Dead Trigger was buggy on the Tegra 4. They didn't actually show proper benchmarking either. I don't think it's all that, and I think the Samsung Exynos 5 Quad (a quad-core A15) that should be coming in the S4 will be faster, and better in gaming too if it includes a newer GPU than the Mali-T604.
Impressive nonetheless, and I can't wait to see phones and tablets start using quad A15s! I really hope we get 64GB standard storage now and microSD card slots across all devices. What's the point of such a lovely screen if you can't put lots of TV shows, pictures and movies on it?
Sent from my GT-N7100 using Tapatalk 2
But Chrome is the STOCK browser, which was the point. It doesn't matter if it's buggy; if it's buggy and that makes pages load slower, then Google needs to fix it. I'm surprised nobody noticed that they didn't say what the resolution of the Tegra 4 tablet was (hint: I'm pretty sure it was the same; I'm here, and I asked, but it couldn't be "confirmed").
The S4 Pro is in quite a few smartphones, and it's a quad A15.
The Nexus 4 and the LG Optimus G are among the ones that have it. Battery life is fine, although there's not much performance difference from the quad A9 (Exynos 4412) in the S3 and Note 2.
Sent from my GT-N7100 using Tapatalk 2
It isn't A15, and it isn't based on A15. A15 is a core design that manufacturers license and use as-is. Qualcomm is an instruction set licensee, which means they design their own chip that can execute all of the ARM instructions an OS would use; in the case of the S4 Pro, that's the ARMv7 instruction set. There are pluses and minuses to their method. The big plus is that they can tailor their architecture more towards phones and tablets, while the generic A15 is a more general-purpose ARM core that can be used in anything. The big minus is that it's slower than a pure A15 chip. Qualcomm will say the difference is negligible, and I'm not going to argue that here (I've already done it in many other threads), but they also claim better battery life because the hardware is optimized for the specific use in phones/tablets. I can say that, owning a phone with the dual-core S4, the cores are really efficient and my battery life is great.
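For what it's worth, you can check this yourself on-device: a Krait core (what's actually in the S4 Pro) identifies itself with Qualcomm's own IDs rather than ARM's Cortex-A15 IDs. Here's a minimal Python sketch that reads /proc/cpuinfo on a Linux/Android device; the implementer and part numbers (0x41/0xC0F for Cortex-A15, 0x51/0x06F for Krait) are the published MIDR values, and everything else is just parsing.

[CODE]
# Minimal sketch: telling a Qualcomm Krait core (S4 Pro) apart from a
# true Cortex-A15 by reading /proc/cpuinfo on a Linux/Android device.
# Field names match what the Linux kernel prints on ARM.

IMPLEMENTERS = {0x41: "ARM", 0x51: "Qualcomm"}
PARTS = {(0x41, 0xC0F): "Cortex-A15", (0x51, 0x06F): "Krait"}

def identify_core(cpuinfo_path="/proc/cpuinfo"):
    fields = {}
    with open(cpuinfo_path) as f:
        for line in f:
            if ":" in line:
                key, _, value = line.partition(":")
                fields[key.strip()] = value.strip()
    imp = int(fields.get("CPU implementer", "0"), 16)
    part = int(fields.get("CPU part", "0"), 16)
    vendor = IMPLEMENTERS.get(imp, "unknown vendor")
    core = PARTS.get((imp, part), "unknown core")
    return f"{vendor} {core} (implementer {imp:#x}, part {part:#x})"

if __name__ == "__main__":
    print(identify_core())
[/CODE]

Run that on a Nexus 4 and you'll see Qualcomm/Krait, not ARM/Cortex-A15, which is the whole point: same instruction set, different core.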
What I want to see is battery life on Tegra 4. Improvements in this area are almost a necessity these days.
Project Shield was badass, and nVidia honestly stole the show with it. A couple of things that were impressive:
1) They were playing all those games at 4K resolution. It was smooth. SOOOO smooth. Impressive.
2) Streaming a game from your PC is cool. I didn't see any lag between what was done on the controls of the Shield and what happened on the screen. Remember, in this mode all the Shield is doing is funneling video; the real work is still happening on your PC (see the sketch after this list).
3) All of the video from the Shield to the TV was broadcast wirelessly. The HDMI cable was only used for Ethernet. Yes, HDMI 1.4a supports that (the HDMI Ethernet Channel).
4) The ports on the back are standard ports. No SlimPort or MHL here.
5) No, I didn't get to play with one. But word on the street is that they are just as impressive in person.
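On point 2, the thin-client part is worth spelling out: the handheld just ships controller input upstream and displays decoded video coming downstream. Here's a toy Python sketch of that loop. To be clear, the wire format (length-prefixed frames), the port, and the callbacks are all invented for illustration; the real GameStream protocol is proprietary and not what's shown here.

[CODE]
# Illustrative thin-client loop: the handheld does no game rendering,
# it only forwards input and displays video decoded from the PC's stream.
import socket
import struct

HOST, PORT = "192.168.1.50", 47989  # hypothetical PC address/port

def recv_exact(sock, n):
    """Read exactly n bytes or raise on disconnect."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream ended")
        buf += chunk
    return buf

def stream_loop(sock, read_input, decode_and_display):
    while True:
        # Upstream: controller state is tiny compared to the video.
        sock.sendall(struct.pack("!H", read_input()))
        # Downstream: one length-prefixed encoded video frame.
        (frame_len,) = struct.unpack("!I", recv_exact(sock, 4))
        decode_and_display(recv_exact(sock, frame_len))
[/CODE]

The design point is that latency is dominated by encode, network, and decode time, not by game rendering on the handheld, which is why the demo could feel so responsive on modest mobile hardware.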
Also, another note about Tegra 4: don't overlook the HDR processing they talked about. I can't tell you how many times we were walking around taking pictures and said "we really need Tegra 4 HDR to get that picture right". It's one of those things you don't think about, but once you see it done right you wish you had it immediately. And yeah, I also noticed it was Android running on what appeared to be a Windows tablet. No clue what that's about, but the hardware was an engineering sample designed to show off the platform, not represent a real product.
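If you're wondering what HDR processing actually computes, the usual idea is to capture bracketed exposures and blend them per-pixel by how well exposed each one is. Here's a toy exposure-fusion sketch of that idea in Python/NumPy; this is just the textbook concept, not nVidia's pipeline, and Tegra 4's pitch is doing its version in hardware fast enough to use on every shot.

[CODE]
# Toy HDR via exposure fusion: blend an under- and an over-exposed shot,
# weighting each pixel by how well exposed it is (closer to mid-gray is
# better). Real pipelines also align the frames and handle motion.
import numpy as np

def fuse(dark, bright, sigma=0.2):
    """dark, bright: float arrays in [0, 1], same shape (H, W, 3)."""
    def weight(img):
        # Gaussian weight around mid-gray, averaged over color channels.
        w = np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2)).mean(axis=-1)
        return w[..., None]
    w_d, w_b = weight(dark), weight(bright)
    return (w_d * dark + w_b * bright) / (w_d + w_b + 1e-8)
[/CODE]

The hardware win is speed: doing that blend (plus the alignment a real pipeline needs) quickly enough that you can leave HDR on and not miss the shot.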
Jen-Hsun also slipped and used the code-name Thor when he meant Shield, and tried to play it off like it was a mistake. I now want to know what Thor is.

(nVidia likes to use superhero codenames for stuff)
So yeah, I might be the Tegra Champ, but even if I wasn't I would be extremely impressed with what we saw from nVidia at CES. Now I just want to see it in the real world and see what it's really capable of.