Does Google collect too much personal information?

Itsa_Me_Mario

¯\_(o_o)_/¯
Feb 19, 2018
Would you not agree that, in order for significant advancements to be made in the technologies Google has expressed interest in pursuing, extensive privacy sacrifices would have to be made to facilitate the collection of the data required to make those advances? If we can agree that to be the case, and you truly believe Google to be transparent in their practices, it would be logical to assume people will be given the right to choose whether or not they participate. So my next question to you is: if the level of consensual participation is inadequate to supply the amount of data needed to make the technological leaps Google decides to pursue, is it your assumption that Google will remain transparent even if it results in failure or stagnation? That's a tough one for me to swallow.
No, I don't agree with the first statement, so I'm unsure how to relate to the remainder.
 

nof8butwhatwemake

Well-known member
May 16, 2018
Ah.. The Project Maven situation.

Well AI can be utilized for a LOT of different things, but let's stay with the grand "Skynet" version of the story. There are a lot of players in AI, Google is for sure one of them so I'll keep the discussion on just Google.

The question then comes down to what you believe, as you alluded to earlier. Whether or not Google is truly a good citizen and custodian with the information they collect. I believe they are.

Yet it took the equivalent of an employee rebellion to bring this to light. Google was moving forward and still may be - so if they're willing to apply their technology to military applications that will directly result in the killing of human beings, exactly what line do you think Google will refuse to cross when it comes to your privacy?
 

Itsa_Me_Mario

¯\_(o_o)_/¯
Feb 19, 2018
Yet it took the equivalent of an employee rebellion to bring this to light. Google was moving forward and still may be - so if they're willing to apply their technology to military applications that will directly result in the killing of human beings, exactly what line do you think Google will refuse to cross when it comes to your privacy?

How will it result in killing humans? They are using existing open source software to help operators better differentiate between targets in footage. Nothing in that software has anything to do with piloting or launching or firing anything. They're absolutely not crossing the line you're describing.
 

anon(10092459)

Well-known member
Nov 25, 2016
Yet it took the equivalent of an employee rebellion to bring this to light. Google was moving forward and still may be - so if they're willing to apply their technology to military applications that will directly result in the killing of human beings, exactly what line do you think Google will refuse to cross when it comes to your privacy?

Well, the situation with Project Maven is more than what gets sensationalized. It first hit the news about three months ago. The software being used relates to the pictures and data that drones collect during reconnaissance. Currently that data is processed and interpreted by humans. AI would help speed up the process and reduce human error. This, in turn, could save lives for both the military and the innocent if violence is involved.

The software being used on the project is open source, so the military could've, and probably would've, moved forward with or without Google providing technical support. In my view, governments all around the world do business with the private sector, and I see this as no different.

So play the situation the other way: Google decides to yield to employees' concerns and abandons the project. With Google no longer involved, the government may completely bastardize the initial intent and we'd never know anything about it. I see Google's role as more of a shepherd on that project.
 

nof8butwhatwemake

Well-known member
May 16, 2018
How will it result in killing humans? They are using existing open source software to help operators better differentiate between targets in footage. Nothing in that software has anything to do with piloting or launching or firing anything. They're absolutely not crossing the line you're describing.

I respect your right to voice your opinion; however, reason dictates that the legitimacy of the concerns expressed by the employees directly involved with the project outweighs the significance of an opinion from an outsider with no actual knowledge of the program.

These Google employees were alarmed enough to risk their careers by publicly protesting what they saw as a dangerous precedent being set by direct involvement in military applications. I'll take their concern much more seriously than your dismissal of it.
 

nof8butwhatwemake

Well-known member
May 16, 2018
Well, the situation with Project Maven is more than what gets sensationalized. It first hit the news about three months ago. The software being used relates to the pictures and data that drones collect during reconnaissance. Currently that data is processed and interpreted by humans. AI would help speed up the process and reduce human error. This, in turn, could save lives for both the military and the innocent if violence is involved.

The software being used on the project is open source, so the military could've, and probably would've, moved forward with or without Google providing technical support. In my view, governments all around the world do business with the private sector, and I see this as no different.

So play the situation the other way: Google decides to yield to employees' concerns and abandons the project. With Google no longer involved, the government may completely bastardize the initial intent and we'd never know anything about it. I see Google's role as more of a shepherd on that project.

What you see as a beneficial reduction in human error I see as the alleviation of a necessary responsibility for accuracy. Once algorithms take over tasks such as determining hostile targets, who's responsible when mistakes are made? Who's shouldering the moral weight of those deaths? Once machines take over killing for us, how can we possibly apply the same level of humanity in the decision making process? The gravity of creating casualties will never be the same as when a human is responsible for actually carrying out the task rather than executing a program to do it for them.
 

nof8butwhatwemake

Well-known member
May 16, 2018
"A DoD statement from July announced that Project Maven aimed to "deploy computer algorithms to war zone by year's end", talking up an "AI arms race" and the fact that former Alphabet chairman Eric Schmidt now refers to Google as an AI company, not a data company."

I'll just leave this here with one question.. if none of this alarms you, what would it take before you would be concerned?
 

anon(10092459)

Well-known member
Nov 25, 2016
What you see as a beneficial reduction in human error I see as the alleviation of a necessary responsibility for accuracy. Once algorithms take over tasks such as determining hostile targets, who's responsible when mistakes are made? Who's shouldering the moral weight of those deaths? Once machines take over killing for us, how can we possibly apply the same level of humanity in the decision making process? The gravity of creating casualties will never be the same as when a human is responsible for actually carrying out the task rather than executing a program to do it for them.

Well, I don't want to get off topic and debate the moral weight of war. But your very post is exactly why information needs to be as accurate as possible, not less, if those decisions become necessary. If there's technology that can spare erroneous loss of life, shouldn't it be implemented?

Remember, Google's role is supporting already available open-source software. They can either be there or not; either way the software exists, and if Google hypothetically stepped away it wouldn't prevent anything at all.
 

anon(10092459)

Well-known member
Nov 25, 2016
"A DoD statement from July announced that Project Maven aimed to "deploy computer algorithms to war zone by year's end", talking up an "AI arms race" and the fact that former Alphabet chairman Eric Schmidt now refers to Google as an AI company, not a data company."

I'll just leave this here with one question.. if none of this alarms you, what would it take before you would be concerned?

Not alarmed at all; I would have to see a direct malicious act in order to be alarmed. I fly all the time, and I have no issues with Boeing or their military division either.
 

nof8butwhatwemake

Well-known member
May 16, 2018
Not alarmed at all; I would have to see a direct malicious act in order to be alarmed. I fly all the time, and I have no issues with Boeing or their military division either.

Well, then I guess it's a good thing malicious acts by large intelligence organizations are always done in front of civilians - that way you'll know when to be concerned.

Sorry, I just had to.. lol. Have a great rest of your weekend, enjoyed the debate.
 

nof8butwhatwemake

Well-known member
May 16, 2018
Well, I don't want to get off topic and debate the moral weight of war. But your very post is exactly why information needs to be as accurate as possible, not less, if those decisions become necessary. If there's technology that can spare erroneous loss of life, shouldn't it be implemented?

Remember, Google's role is supporting already available open-source software. They can either be there or not; either way the software exists, and if Google hypothetically stepped away it wouldn't prevent anything at all.

I'm not against software that assists humans in their tasks, so long as humans work directly with the results of those algorithms and still hold the final say on any action before it's taken. However, several defense companies are currently working on projects that will bring about completely autonomous warfare by combining AI with unmanned aerial vehicles in the very near future. Do we really want the company holding the entire world's civilian data also working in a military capacity? I don't.
 

tadpoles

Well-known member
Jul 20, 2015
Sorry for not reading all the above posts.

This may be a rare sentiment, but I don't mind Google's knowledge of me. It gets me good products, at low to no cost, catered to my preferences. So far I've had nothing to complain about; even the ads are catered to my interests. I have to be honest, I like Google... and that's not just me being naive.
 

Itsa_Me_Mario

¯\_(o_o)_/¯
Feb 19, 2018
I respect your right to voice your opinion; however, reason dictates that the legitimacy of the concerns expressed by the employees directly involved with the project outweighs the significance of an opinion from an outsider with no actual knowledge of the program.

These Google employees were alarmed enough to risk their careers by publicly protesting what they saw as a dangerous precedent being set by direct involvement in military applications. I'll take their concern much more seriously than your dismissal of it.
Their concerns are that they don't want Google involved with military customers, regardless of whether or not the application is benign.
 

Chuck Finley69

Trusted Member
Feb 22, 2015
We both know that argument is completely false. Data about you is being collected whether or not you use any company's products or agree to their terms. If someone takes your picture with their cell phone and posts it on social media, your data is being collected: location, facial recognition data, etc. Determinations about you are made by algorithms based on your friends' habits/likes/events, and that information is used to build a profile about you whether you want it or not. You cannot avoid data collection at this point without withdrawing from societal norms.

You have the right to ask people not to upload photos of you to the internet or social media.