View Poll Results: Does Google collect too much personal data about you?

Voters: 13. You may not vote on this poll
  • Absolutely YES! Far too much for my comfort: 3 votes (23.08%)
  • Kinda, yeah.. I limit some of the data I share with them: 1 vote (7.69%)
  • Maybe, but I don't really think about it: 4 votes (30.77%)
  • No. I absolutely trust Google with my data: 5 votes (38.46%)
  1. DMP89145's Avatar
    Certainly. Please let me clarify as well that I'm in no way insinuating anyone with an opinion different from mine is ignorant. I'm only offering my perspective as it pertains to Google's data collection and the inherent risks associated with it. I appreciate the debate; it allows me to evolve as new information or perspectives are presented.
    Absolutely. The theme behind the post is to debate the issue. An alternative point of view from mine is always welcome, as far as I am concerned.

    One thing we can agree on, I believe, is that data will definitely play a bigger and bigger role in all of our daily lives. With that in mind, it's of paramount importance who we each choose to be the custodian of that information.
    05-19-2018 05:11 PM
  2. nof8butwhatwemake's Avatar
    If we're saying there is the potential for great abuse or risk of harm, I agree. If we're asking about the probability of Google betraying its customers, I think that probability is rather small, especially relative to other players in the industry.
    Would you not agree that in order for significant advancements to be made in the technologies Google has expressed interest in pursuing, extensive privacy sacrifices would have to be made to facilitate the collection of the data required to make those advances? If we can agree that to be the case, and you truly believe Google to be transparent in their practices, it would be logical to assume people will be given the right to choose whether or not they participate. So my next question to you is: if the level of consensual participation is inadequate to satisfy the amount of data needed to make the technological leaps Google decides to pursue, is it your assumption that Google will remain transparent even if it results in failure or stagnation? That's a tough one for me to swallow.
    05-19-2018 05:16 PM
  3. DMP89145's Avatar
    Would you not agree that in order for significant advancements to be made in the technologies Google has expressed interest in pursuing, extensive privacy sacrifices would have to be made to facilitate the collection of the data required to make those advances?
    Could you be a bit more specific here? I'm reading this with the assumption that the "Selfish Ledger" video is your reference point.
    05-19-2018 05:40 PM
  4. nof8butwhatwemake's Avatar
    Could you be a bit more specific here? I'm reading this with the assumption that the "Selfish Ledger" video is your reference point.
    To be specific: the development and implementation of AI in numerous applications, which is not only an admitted agenda but has already become a controversial one. The reality is that in order to build an effective AI, a massive amount of human data must be collected. Algorithms that have a deep understanding of human behavior, and can emulate and predict those behaviors as well, will be a necessary component. This is the end game, make no mistake; AI is the ultimate goal, and Google will likely be one of the first to get there... though Facebook and Amazon are certainly interested as well, and Apple and Samsung among others are surely looking at this.

    https://www.zdnet.com/article/google...drone-project/
    05-19-2018 05:51 PM
  5. DMP89145's Avatar
    To be specific: the development and implementation of AI in numerous applications, which is not only an admitted agenda but has already become a controversial one. The reality is that in order to build an effective AI, a massive amount of human data must be collected. Algorithms that have a deep understanding of human behavior, and can emulate and predict those behaviors as well, will be a necessary component. This is the end game, make no mistake; AI is the ultimate goal, and Google will likely be one of the first to get there... though Facebook and Amazon are certainly interested as well, and Apple and Samsung among others are surely looking at this.

    https://www.zdnet.com/article/google...drone-project/
    Ah... The Project Maven situation.

    Well, AI can be utilized for a LOT of different things, but let's stay with the grand "Skynet" version of the story. There are a lot of players in AI, and Google is certainly one of them, so I'll keep the discussion to just Google.

    The question then comes down to what you believe, as you alluded to earlier: whether or not Google is truly a good citizen and custodian of the information they collect. I believe they are.
    05-19-2018 06:02 PM
  6. Itsa_Me_Mario's Avatar
    Would you not agree that in order for significant advancements to be made in the technologies Google has expressed interest in pursuing, extensive privacy sacrifices would have to be made to facilitate the collection of the data required to make those advances? If we can agree that to be the case, and you truly believe Google to be transparent in their practices, it would be logical to assume people will be given the right to choose whether or not they participate. So my next question to you is: if the level of consensual participation is inadequate to satisfy the amount of data needed to make the technological leaps Google decides to pursue, is it your assumption that Google will remain transparent even if it results in failure or stagnation? That's a tough one for me to swallow.
    No, I don't agree with the first statement, so I'm unsure how to relate to the remainder.
    05-19-2018 06:06 PM
  7. nof8butwhatwemake's Avatar
    Ah... The Project Maven situation.

    Well, AI can be utilized for a LOT of different things, but let's stay with the grand "Skynet" version of the story. There are a lot of players in AI, and Google is certainly one of them, so I'll keep the discussion to just Google.

    The question then comes down to what you believe, as you alluded to earlier: whether or not Google is truly a good citizen and custodian of the information they collect. I believe they are.
    Yet it took the equivalent of an employee rebellion to bring this to light. Google was moving forward and still may be. So if they're willing to apply their technology to military applications that will directly result in the killing of human beings, exactly what line do you think Google will refuse to cross when it comes to your privacy?
    05-19-2018 06:09 PM
  8. nof8butwhatwemake's Avatar
    No, I don't agree with the first statement, so I'm unsure how to relate to the remainder.
    Easy way to avoid the debate IMO, but I'll respect the response.
    05-19-2018 06:10 PM
  9. Itsa_Me_Mario's Avatar
    Yet it took the equivalent of an employee rebellion to bring this to light. Google was moving forward and still may be. So if they're willing to apply their technology to military applications that will directly result in the killing of human beings, exactly what line do you think Google will refuse to cross when it comes to your privacy?
    How will it result in killing humans? They are using existing open source software to help operators better differentiate between targets in footage. Nothing in that software has anything to do with piloting or launching or firing anything. They're absolutely not crossing the line you're describing.
    DMP89145 likes this.
    05-19-2018 06:13 PM
  10. DMP89145's Avatar
    Yet it took the equivalent of an employee rebellion to bring this to light. Google was moving forward and still may be. So if they're willing to apply their technology to military applications that will directly result in the killing of human beings, exactly what line do you think Google will refuse to cross when it comes to your privacy?
    Well, the situation with Project Maven is more than what gets sensationalized. It first hit the news about three months ago. The software being used relates to the pictures and data that drones collect during reconnaissance. Currently, that data is processed and interpreted by humans. AI would help speed up the process and reduce human error. This, in turn, could save lives for both the military and the innocent if violence is involved.

    The software being used on the project is open source, so the military could've, and probably would've, moved forward with or without Google providing technical support. In my view, governments all around the world do business with the private sector. I see this as no different.

    So play the situation the other way: Google decides to yield to employees' concerns and abandons the project. With them no longer involved, the government may actually completely bastardize the initial intent and we'd never know anything about it. I see Google's role as more of a shepherd on that project.
    05-19-2018 06:21 PM
  11. nof8butwhatwemake's Avatar
    How will it result in killing humans? They are using existing open source software to help operators better differentiate between targets in footage. Nothing in that software has anything to do with piloting or launching or firing anything. They're absolutely not crossing the line you're describing.
    I respect your right to voice your opinion; however, reason dictates that the legitimacy of the concerns expressed by the employees directly involved with the project itself outweighs the significance of the opinion of an outsider with no actual knowledge of the program.

    These Google employees were alarmed enough to risk their careers by publicly protesting what they saw as a dangerous precedent being set by direct involvement in military applications. I'll take their concern much more seriously than your dismissal of it.
    05-19-2018 07:09 PM
  12. nof8butwhatwemake's Avatar
    Well, the situation with Project Maven is more than what gets sensationalized. It first hit the news about three months ago. The software being used relates to the pictures and data that drones collect during reconnaissance. Currently, that data is processed and interpreted by humans. AI would help speed up the process and reduce human error. This, in turn, could save lives for both the military and the innocent if violence is involved.

    The software being used on the project is open source, so the military could've, and probably would've, moved forward with or without Google providing technical support. In my view, governments all around the world do business with the private sector. I see this as no different.

    So play the situation the other way: Google decides to yield to employees' concerns and abandons the project. With them no longer involved, the government may actually completely bastardize the initial intent and we'd never know anything about it. I see Google's role as more of a shepherd on that project.
    What you see as a beneficial reduction in human error, I see as the alleviation of a necessary responsibility for accuracy. Once algorithms take over tasks such as determining hostile targets, who's responsible when mistakes are made? Who's shouldering the moral weight of those deaths? Once machines take over killing for us, how can we possibly apply the same level of humanity in the decision-making process? The gravity of creating casualties will never be the same as when a human is responsible for actually carrying out the task, rather than executing a program to do it for them.
    05-19-2018 07:18 PM
  13. nof8butwhatwemake's Avatar
    "A DoD statement from July announced that Project Maven aimed to "deploy computer algorithms to war zone by year's end", talking up an "AI arms race" and the fact that former Alphabet chairman Eric Schmidt now refers to Google as an AI company, not a data company."

    I'll just leave this here with one question.. if none of this alarms you, what would it take before you would be concerned?
    05-19-2018 07:34 PM
  14. DMP89145's Avatar
    What you see as a beneficial reduction in human error, I see as the alleviation of a necessary responsibility for accuracy. Once algorithms take over tasks such as determining hostile targets, who's responsible when mistakes are made? Who's shouldering the moral weight of those deaths? Once machines take over killing for us, how can we possibly apply the same level of humanity in the decision-making process? The gravity of creating casualties will never be the same as when a human is responsible for actually carrying out the task, rather than executing a program to do it for them.
    Well, I don't want to get off topic and debate the moral weight of war. But your very post is exactly why information needs to be as accurate as possible, not less, if those decisions become necessary. If there's technology that can spare erroneous loss of life, shouldn't it be implemented?

    Remember, Google's role is supporting already available open source software. They can either be there or not; either way the software is there, and if Google hypothetically stepped away it wouldn't prevent anything at all.
    05-19-2018 07:38 PM
  15. DMP89145's Avatar
    "A DoD statement from July announced that Project Maven aimed to "deploy computer algorithms to war zone by year's end", talking up an "AI arms race" and the fact that former Alphabet chairman Eric Schmidt now refers to Google as an AI company, not a data company."

    I'll just leave this here with one question.. if none of this alarms you, what would it take before you would be concerned?
    Not alarmed at all and I would have to see a direct malicious act in order for me to be alarmed. I fly all the time. I have no issues with Boeing or their military division either.
    05-19-2018 07:43 PM
  16. nof8butwhatwemake's Avatar
    Not alarmed at all and I would have to see a direct malicious act in order for me to be alarmed. I fly all the time. I have no issues with Boeing or their military division either.
    Well, then I guess it's a good thing malicious acts by large intelligence organizations are always done in front of civilians.. that way you'll know when to be concerned.

    Sorry, I just had to.. lol. Have a great rest of your weekend, enjoyed the debate.
    DMP89145 likes this.
    05-19-2018 08:01 PM
  17. nof8butwhatwemake's Avatar
    Well, I don't want to get off topic and debate the moral weight of war. But your very post is exactly why information needs to be as accurate as possible, not less, if those decisions become necessary. If there's technology that can spare erroneous loss of life, shouldn't it be implemented?

    Remember, Google's role is supporting already available open source software. They can either be there or not; either way the software is there, and if Google hypothetically stepped away it wouldn't prevent anything at all.
    I'm not against software that assists humans in their tasks, as long as humans are working directly with the results of those algorithms and still holding the final say on any action before it's taken. However, there are several defense companies currently working on projects that will bring forth completely autonomous warfare by combining AI with Unmanned Aerial Vehicles in the very near future. Do we really want the company with the entire world's civilian data also working in a military capacity? I don't.
    05-19-2018 08:09 PM
  18. tadpoles's Avatar
    Sorry for not reading all the above posts.

    This may be a rare sentiment, but I don't mind Google's knowledge of me. It gets me good, low- to no-cost products catered to my preferences. So far I've had nothing to complain about. Even the ads are catered to my interests. I have to be honest, I like Google... and that's not just me being naive.
    05-19-2018 08:16 PM
  19. Itsa_Me_Mario's Avatar
    I respect your right to voice your opinion; however, reason dictates that the legitimacy of the concerns expressed by the employees directly involved with the project itself outweighs the significance of the opinion of an outsider with no actual knowledge of the program.

    These Google employees were alarmed enough to risk their careers by publicly protesting what they saw as a dangerous precedent being set by direct involvement in military applications. I'll take their concern much more seriously than your dismissal of it.
    Their concerns are that they don't want Google involved with military customers, regardless of whether or not the application is benign.
    Laura Knotek and DMP89145 like this.
    05-19-2018 11:20 PM
  20. Chuck Finley69's Avatar
    We both know that argument is completely false. Data about you is being collected whether or not you use any company's products or agree to their terms. If someone takes your picture with their cell phone and posts a picture with you in it on social media, your data is being collected: location, facial recognition data, etc. Determinations about you are being made by algorithms based on your friends' habits/likes/events, and that information is used to build a profile about you whether you want it or not. You cannot avoid data collection at this point without withdrawing from societal norms.
    You have the right to ask people not to upload photos of you to the internet or social media.
    05-20-2018 06:09 AM
  21. nof8butwhatwemake's Avatar
    You have the right to ask people not to upload photos of you to the internet or social media.
    Correct. To ask.
    05-20-2018 11:00 AM
  22. Chuck Finley69's Avatar
    Correct. To ask.
    The point is that your personal data had been sold over and over long before the internet ever heard of you. There have been more improvements to privacy since the internet became prevalent than before, since no one really understood how data was collected in the pre-internet days.
    Laura Knotek likes this.
    05-20-2018 04:27 PM
