starting to cave in towards the thunderbolt

Well, that's just it. Does showing it in About Phone mean that it truly is implemented in the kernel? Or is it merely a placeholder for future implementations? Or does it mean that the tag is supported and there's an implementation, but all the implementation does is copy the single-core implementation (i.e., it's not fully or optimally implemented)? I'm not sure a screen capture will answer these questions. It really takes investigation into the kernel, or Motorola (or whoever) stating that it is fully supported.

I would think that Motorola putting that in the kernel info is them saying it's supported. However, your point is valid in that we don't know how far that goes. It could be 10% implemented and they'd still tag it.

Going by some benchmarks, I suspect it's pretty far along.
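For what it's worth, one way to see whether the kernel has actually brought a second core online (rather than just advertising it in About Phone) is to count the `processor` stanzas in `/proc/cpuinfo`. Here's a minimal sketch of that check; the sample text below is a hypothetical excerpt, not output from a real device:

```python
# Hypothetical /proc/cpuinfo excerpt from a dual-core ARM device; on a real
# phone the kernel lists one "processor : N" stanza per online core.
SAMPLE_CPUINFO = """\
processor\t: 0
BogoMIPS\t: 1998.84
processor\t: 1
BogoMIPS\t: 1998.84
Hardware\t: NVIDIA Tegra 2
"""

def count_cores(cpuinfo_text: str) -> int:
    """Count 'processor' stanzas, one per core the kernel has brought online."""
    return sum(1 for line in cpuinfo_text.splitlines()
               if line.split(":")[0].strip() == "processor")

print(count_cores(SAMPLE_CPUINFO))  # → 2
```

On a rooted or adb-connected device you could feed this the output of `adb shell cat /proc/cpuinfo`. It only tells you the kernel sees both cores, of course, not how well the scheduler uses them.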
 
I agree, it's all about LTE on the TB. I don't imagine dual or quad cores are going to make my LTE any faster.
 
[image: stay-on-target.gif]


Stay on target!!!


I laughed at this picture as I'm a Star Wars fan. Too funny! :D
 
I agree, it's all about LTE on the TB. I don't imagine dual or quad cores are going to make my LTE any faster.

This brings up a good point. With computers, I've always said that you're only as fast as your slowest component, and I think the same holds true for mobile phones nowadays. While dual or quad cores won't make LTE faster, they will make processing the data received over LTE faster. Same goes for the memory -- the Bionic has faster memory than the Thunderbolt. So there's a faster processor and faster memory in the Bionic over the Thunderbolt, both of which should make the overall experience much better.
 
Dual core is for processor geeks. No offense, but the Thunderbolt is way snappier than any Moto phone I have used, including the DUAL CORE Atrix.

I beg to differ with you. I did the Quadrant benchmark test on the TB when I had it. The best it would do was probably 1500, and the graphics portion of the test was laggy. I did the same test on my Gtab, overclocked to 1.2 GHz, and I get anywhere from 1900 to 2000. The graphics are much smoother on it than on the TB. Of course I cleared the cache on the tab, etc., but out of the box the TB should be smokin'. I know some consider that test to be suspect, but I'm just sayin'.
I also did my research with YouTube searches for the AT&T Atrix vs. AT&T's version of the TB. The Atrix kicked arse. So if there is any reference point you are looking for, there you go.
Build quality is nice, the screen is nice, but it won't stand up to the Bionic.
If I consider anything other than the Bionic, it just might be the Droid Charge.
I know, I know.....
 
I beg to differ with you. I did the Quadrant benchmark test on the TB when I had it. The best it would do was probably 1500, and the graphics portion of the test was laggy. I did the same test on my Gtab, overclocked to 1.2 GHz, and I get anywhere from 1900 to 2000. The graphics are much smoother on it than on the TB. Of course I cleared the cache on the tab, etc., but out of the box the TB should be smokin'. I know some consider that test to be suspect, but I'm just sayin'.
I also did my research with YouTube searches for the AT&T Atrix vs. AT&T's version of the TB. The Atrix kicked arse. So if there is any reference point you are looking for, there you go.
Build quality is nice, the screen is nice, but it won't stand up to the Bionic.
If I consider anything other than the Bionic, it just might be the Droid Charge.
I know, I know.....

This may be where the Quadrant scores are a little misleading. All the benchmarks that I've seen have the Bionic and Atrix outperforming the Thunderbolt. From what I recall, the Quadrant scores were: Bionic around 2200, Atrix around 2400, and Thunderbolt around 1800. But some of the hands-on reviews seem to indicate that the Atrix was a little laggy as you move around the menus, etc. And reviews of the Thunderbolt had it moving rather smoothly (I've actually played with the Thunderbolt and it seemed rather smooth to me as well). I admit, though, that it seemed like reviewers of the Atrix were all dead-set against Blur to begin with, and may have been looking for anything to slam it.

The question then is, does Quadrant really capture the user experience with regards to typical usage (i.e., is it "laggy" doing normal menu operations, scrolling, etc.)? Or is it simply seeing how fast the memory, GPU, CPU, etc. are through timed reads/writes? If it's just testing hardware with simple scripts, it seems that it may be missing how the user experience is with simple everyday operations. I don't know how Quadrant works, so I don't know what the answer is or how applicable the scores are to the user experience. I also don't know if there are other tools that may better capture this. Just thinking out loud ... :)
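Here's a tiny Python sketch of why a synthetic score might miss perceived lag. It makes no claim about how Quadrant actually works; it just illustrates that an averaged throughput number can look fine while hiding the occasional long frame ("jank") that dominates what the user actually feels:

```python
# Purely illustrative: 95 smooth 16 ms frames plus 5 janky 100 ms frames.
# The averaged number looks healthy, but the 100 ms stalls are what you see.
frame_times_ms = [16] * 95 + [100] * 5

mean_ms = sum(frame_times_ms) / len(frame_times_ms)
worst_ms = max(frame_times_ms)

print(f"mean frame time:  {mean_ms:.1f} ms")  # averages out to ~20 ms
print(f"worst frame time: {worst_ms} ms")     # the stutter users notice
```

A benchmark that reports only the mean (or a composite score built from means) would rate both a perfectly even 20 ms device and this janky one about the same, which could explain a smooth-feeling phone scoring lower than a stuttery one.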
 
But, some of the hands-on reviews seem to indicate that the Atrix was a little laggy as you move around the menus, etc. And reviews on the Thunderbolt had it moving rather smoothly (I've actually played with the Thunderbolt and it seemed rather smooth to me as well).

I don't remember much commenting about the Atrix being laggy. Most of the (admittedly little) I have read in reviews of the Atrix commented that the menus were smooth and peppy, but had some funky things going on in a few areas, mostly due to the 800x480 to 960x540 resolution difference. The main lag issues reported had to do with the webtop/laptop thing, which of course is beside the point in reference to the Bionic.

I would agree, though, that the TB seemed smooth, at least for the 5 to 10 minutes I played with it at the store. If it weren't below freezing with 4" of new snow on the ground, I'd probably get myself over to an ATT store at lunchtime to play with an Atrix.

The question then is, does Quadrant really capture the user experience with regards to typical usage (i.e., is it "laggy" doing normal menu operations, scrolling, etc.)? Or is it simply seeing how fast the memory, GPU, CPU, etc. are through timed reads/writes? If it's just testing hardware with simple scripts, it seems that it may be missing how the user experience is with simple everyday operations. I don't know how Quadrant works, so I don't know what the answer is or how applicable the scores are to the user experience. I also don't know if there are other tools that may better capture this.

I don't really know much about either Blur or Sense -- I've never played with Blur and have only clocked 5 or so minutes playing with Sense -- but I would say that trusting a synthetic benchmark could be misleading. I would hazard to guess that anything short of plain Android should be taken with a grain of salt when comparing one to the other. In any case, benchmarks never substitute for a bit of hands-on time.

-Suntan
 
I don't know if I mentioned it in this thread or another, but I used Sense's launcher for all of 6 hours on the first day. It's like Fisher-Price: a gigantic "FRIEND STREAM" taking up a third of your screen on the FB widget, a GIGANTIC FLIP CLOCK wasting your screen real estate. Not to mention sucking your battery dry. I can't understand why anyone would want to use the Sense launcher/widgets over easily superior alternatives. I am assuming this goes for Blur as well, so I don't think comparing Sense/Blur is ever a valid factor with which to compare devices.
 
I don't remember much commenting about the Atrix being laggy. Most of the (admittedly little) I have read in reviews of the Atrix commented that the menus were smooth and peppy, but had some funky things going on in a few areas, mostly due to the 800x480 to 960x540 resolution difference. The main lag issues reported had to do with the webtop/laptop thing, which of course is beside the point in reference to the Bionic.

I would agree, though, that the TB seemed smooth, at least for the 5 to 10 minutes I played with it at the store. If it weren't below freezing with 4" of new snow on the ground, I'd probably get myself over to an ATT store at lunchtime to play with an Atrix.

The review I remember really slamming the Atrix for being laggy was the one done by PhoneDog. Again, the reviewer made it clear right out of the gate that he didn't like Blur.

Like you, I have only tested the Thunderbolt in the store for a few minutes, and haven't touched the Atrix. I don't have snow on the ground, so I don't have your excuse though. ;)

I don't really know much about either Blur or Sense -- I've never played with Blur and have only clocked 5 or so minutes playing with Sense -- but I would say that trusting a synthetic benchmark could be misleading. I would hazard to guess that anything short of plain Android should be taken with a grain of salt when comparing one to the other. In any case, benchmarks never substitute for a bit of hands-on time.

-Suntan

Yep. There's nothing better than your own hands-on review.
 
I went into an AT&T store the other day to play with the Atrix. It was hands down the snappiest phone I've ever used, almost even beating my OC'd Nexus S. I liked the screen too, it was bright, vibrant and resolution was awesome (just wished it was bigger ;) ).

Looking forward to the Droid Bionic even more now.
 
I really wanted the Bionic. I always loved Motorola; hell, I'm a Motorola Mobility stockholder. But the tbolt released first with unlimited LTE at no price difference above my 3G plan. If the Bionic proves itself that much better than the tbolt, I'll just buy a Bionic at retail and sell my tbolt. The tbolt will still be so new I probably won't lose much money. I'll save more by not having to be on tiered or premium 4G pricing (a la Sprint), whichever way Verizon decides to go. But I wish you guys luck and hope they keep unlimited in time for the Bionic. But with the release of so many more 4G phones coming up soon, it may end soon.

Yes, many tbolt owners are having battery issues. I somehow manage to have my battery last me all day, so it's a non-issue for me.
 
I'll save more by not having to be on tiered or premium 4G pricing (a la Sprint), whichever way Verizon decides to go.

This has nothing to do with any particular handset, but just on the notion of pricing for 4g.

I don't see how a phone company can expect to charge more for 4G vs. 3G.

I would have to think that if given a choice between 3G as is or 4G for $10 more a month, most everyone would choose to stick with 3G, thus making their 4G rollout just a lot of egg on their face. The only people who would care about 4G would then be people looking to spend a boatload on an unlimited 4G tethering or dedicated MiFi plan, which I have to think is a relatively small percentage of their customers.

I think that is one of the main reasons why you saw Sprint stick it to everybody with a $10 upcharge regardless of your 3G/4G usage. When everybody gets charged more, there is no reason to resist upgrading to the faster speeds.

-Suntan
 
As much as I hate Motorola, Motoblur, and their locked bootloaders... don't go with the Thunderbolt! Why? The Bionic is far superior to the Thunderbolt and just has better overall specs. ;)
 
This has nothing to do with any particular handset, but just on the notion of pricing for 4g.


-Suntan

It has to do with a particular handset when one is offered while pricing stays the same as 3G, and another handset may not be offered under the same pricing plan. I'm aware that it's a risk I took by not waiting to get the handset I originally wanted, not really knowing what the pricing structure is going to be like in the end. I just don't believe it will continue to be the same.

I guess I should have stated it this way: it's my belief that Verizon will change their 4G pricing structure, so there is a good chance that jumping on the TB now could either cost me less or let me continue to have unlimited 4G.
 