Should Cyborgs Have the Same Privacy Rights As Humans? 206
Jason Koebler (3528235) writes When someone with an e-tattoo or an implanted biochip inevitably commits a crime, and evidence of that crime exists on that device within them, do they have a legal right to protect that evidence? Do cyborgs have the same rights as humans? "The more you take a thing with no rights and integrate it indelibly into a thing that we invest with rights, the more you inevitably confront the question: Do you give the thing with no rights rights, or do you take those rights away from the thing with rights?" said Benjamin Wittes, a senior fellow at the Brookings Institution, who just released a paper exploring the subject.
All the evidence is beginning to suggest... (Score:2)
Re:All the evidence is beginning to suggest... (Score:5, Insightful)
That would be nice. But in the meantime ... it's about property. From TFA:
Because they are non-sentient property. Ask again once AI is achieved.
And the difference between a stored text communication and a written letter? Learn the 4th Amendment.
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
Really? "Slaves"? Maybe you should look into actual slavery.
As to "uncertain" just look for the sales receipt or lease agreement. My car is a machine and there is no uncertainty as to who owns it.
Fuck you.
Learn what technology really is before you go off on movie tangents.
Re: (Score:2)
But our laws do not recognize the rights of machines themselves.
Because they are non-sentient property. Ask again once AI is achieved.
What about how the computers store information for their own use (example: evercookies)? I know it's not the "mind" of the computer doing what it wants but it's certainly not the user either.
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
You're right, but that only applies once you get an expensive lawyer. Otherwise, all the legal stuff that protects us is worthless.
And it's silly to think that machines could be held accountable. But the people that program them...
Re: (Score:2)
You're right, but that only applies once you get an expensive lawyer.
One of the great strengths of a populace gifted with civil rights is an abiding belief that those rights belong to them. No law or condition of government can abridge an ingrained belief in individual rights.
Would I rather be a wealthy defendant than a poor one? A no-brainer, sure, but when you no longer believe that basic rights are afforded to you, you have already lost.
Re: (Score:2)
One of the great strengths of a populace gifted with civil rights is an abiding belief that those rights belong to them.
Great. But what's the use of a "belief" if it's no longer true? You're talking about a country that re-elected someone as the head of state who was KNOWN to have ordered the targeted killings of American citizens without trial. I don't see how much further one can get from "your rights are all now optional" than the head of government killing people (i.e., effective removal of ALL rights) with no legal process, and the electorate implicitly condoning the process by reelecting him.
Maybe the populace bel
Re: (Score:2)
Are the powers that be embroiled in seemingly constant effort to reverse those personal freedoms given to the citizens? Sure.
However, a voting populace that expects better treatment generally gets it.
Re: (Score:2)
I have a less jaded view. The limitations on government power are set forth in a document we refer to as the Constitution of our Republic.
There. FTFY.
Re: (Score:2)
What about how the computers store information for their own use (example: evercookies)? I know it's not the "mind" of the computer doing what it wants but it's certainly not the user either.
Duh, it's the mind of the programmer who had the script drop the cookie. But your comment tells me you know that already.
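To make the evercookie example above concrete, here is a minimal, hypothetical sketch (not the real evercookie library, which is browser-side JavaScript) of the core "respawning" idea: a tracking ID is written to several independent storage vectors, and if the user clears some of them, the surviving copy is used to restore the rest. All class and vector names here are invented for illustration.

```python
# Illustrative sketch of evercookie-style respawning (hypothetical names;
# the real technique uses browser stores like cookies, localStorage,
# Flash LSOs, and ETag caches rather than a Python dict).

class RespawningCookie:
    def __init__(self):
        # Stand-ins for independent browser storage vectors.
        self.vectors = {"cookie": None, "local_storage": None, "etag_cache": None}

    def set(self, tracking_id):
        # Write the ID to every vector at once.
        for name in self.vectors:
            self.vectors[name] = tracking_id

    def clear(self, vector):
        # Simulate the user deleting one storage location.
        self.vectors[vector] = None

    def get(self):
        # Find any surviving copy, then rewrite it everywhere (the "respawn").
        survivor = next((v for v in self.vectors.values() if v is not None), None)
        if survivor is not None:
            self.set(survivor)
        return survivor

c = RespawningCookie()
c.set("user-1234")
c.clear("cookie")   # the user deletes the ordinary cookie...
print(c.get())      # ...but the ID respawns from another vector
```

The point of the grandparent's comment holds either way: the "decision" to respawn is the programmer's script acting, not the computer's own volition.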
Re: (Score:2)
That would be nice. But in the meantime ... it's about property.
Not quite.
An apt comparison would be between "cyborgs", i.e. people with various technology built into their bodies for uses other than medical, and gun laws.
I.e. It is not a property issue but a claim right [wikipedia.org] issue.
As for machines and not humans with implants... again... that is not a property issue.
It is not even an issue of consciousness or intelligence (should those machines possess it), as we regularly limit the rights of humans who have shown a lack of self-control - be it due to intoxication, disabil
Re: (Score:2)
Fantastic rebuttal ! You hit the nail right on the head.
This bullshit handwaving about "machine rights" is total nonsense. Like you said, once Actual Intelligence happens instead of the joke that passes for artificial ignorance today, then we can talk about "rights" of machines.
Property, my ass! (Score:2)
First, I agree completely with your comment. Secondly, I don't even have to RTFA to see that TFA rides the short bus.
As a cyborg, I find this entire topic offensive. A cyborg is part animal and part machine, and guess what? There are a hell of a lot of us. I have a CrystaLens implant in my left eye, making glasses unnecessary for me (I see better than you do). It is a device that uses the eye's muscles to focus. I'm 62 and need no corrective lenses whatever.
Do you know someone with a cochlear implant? Artif
Re: (Score:2)
Answer: none. Both are people.
Re:All the evidence is beginning to suggest... (Score:5, Insightful)
Here is the quote from TFA. It provides the context.
No. That is not referring to an IDE drive.
Or, more completely:
So no. They are not talking about an IDE "master/slave" situation. They are talking about humans using machines (with examples provided) and equating that to "slavery".
Re: (Score:2)
You must be from Washington DC, where logic has been outlawed and argument for argument's sake is a local pastime.
There is only one possible logical reference to slavery here: ownership of another human being.
Re: (Score:2)
No. They should set their bar a bit higher than that.
Yes, I think the rule should be simple:
Anyone or anything that claims a right to privacy shall have it.
Re: (Score:2)
When someone with an e-tattoo or an implanted biochip inevitably commits a crime, and evidence of that crime exists on that device within them, do they have a legal right to protect that evidence?
What about when someone with DNA inevitably commits a crime and leaves some DNA behind? Are we allowed to take a DNA swab from just anyone, willy-nilly? The answer is no, not yet at least, and not without some kind of due process. In the US and in Europe at least, there are specific laws protecting the privacy of DNA (unless you're a felon, or unless you're in the military).
Granted, the entire male population of three villages in Scotland was once swabbed for DNA [police.uk] for a double rape and a double murder case, bu
cyborg representing persons? (Score:2)
What kind of cyborg? (Score:2)
Cyborgs are just kinds of humans, so yes. Unless you count cyborg cats, which would be a more interesting question which would have to depend on the cognitive abilities of said cyborg cat.
Re: (Score:2)
Jackpot.
Re: (Score:2)
Cyborgs are just kinds of humans
It's conceivable that something could cause a knee-jerk reaction and suddenly a bill appears suggesting people with brain-enhancing modifications are no longer "people". Then all your inalienable rights go out the window. Perhaps a law is passed saying modified humans are no longer human and therefore no longer citizens. Again, your rights go away. It's something that should (eventually) be addressed before stupidity happens.
While a person is smart and can make rational decisions, sometimes people beco
Re: (Score:2)
The point you really were making is that sometime the people in power are sociopaths with a dictator complex. And the time to stop them from doing something is before they decide that it's desirable.
Re: (Score:2)
what kind of machine? (Score:2)
exactly...so many of "teh singularity" type "futurists" who get to have their thoughts on this stuff published have absolutely no idea what they are talking about
anyone with a pacemaker or hearing aid is a "cyborg"
hell, it's "cybernetic" when you know your phone is ringing b/c you set it to vibrate...
Re: (Score:2)
Cyborgs are just kinds of humans, so yes.
The situation is really not that simple, even if you consider non-cyborg humans. See this Stack Exchange thread on the topic:
http://philosophy.stackexchang... [stackexchange.com]
There are no new legal issues (Score:5, Interesting)
An implanted cell phone is no different, legally, than any other cell phone. The cops can't search your cell phone without permission or a warrant, so why would an implanted one be any different? At worst, it'd be the same process as forcibly taking a DNA sample, which also requires probable cause.
Does the Brookings Institution require its senior fellows to publish on a regular basis to keep getting a paycheck or something? Cuz I'm having a hard time figuring out any other reason for this.
Re: (Score:2)
The question pops up when the human is no longer sentient. Suppose you had an AI implanted to run through probable scenarios when making major decisions. This works well for 50 or so years, and then you pass on, or get struck by a car or whatever and become brain dead. Also suppose the AI takes over body functions, draws off your memories, and can take commands from other computers. Now are you still human, or something else? Do you have the same rights, or less, because you are not really you?
Re: (Score:2)
I know we beat the Ship of Theseus to death around here, but the obvious follow-ups include, "What if I've replaced ALL my parts over time."
Which one contained "me."
Re: (Score:2)
When there is some theoretical model for building an artificial brain (your question was trivially answerable decades ago), it will matter. In the meantime, yawn.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
If the guy's talking about "memory reading devices," then he's off in fantasy land, as there is no theoretical basis for such technology, nor is there likely to be during the lifetime of anyone alive today.
And if you record it, it's a recording, same as if you videotaped yourself committing a crime. If it's a device, there are clear, well established rules for showing probable cause for a warrant. If it's part of the body, there are clear, well established (for decades) rules for showing probable cause for
Re: (Score:2)
Since that has nothing whatsoever to do with the discussion at hand, and there is no theoretical much less actual, way for this to happen, who cares?
Re: (Score:2)
A cyborg is a cyborg. You do not get to make up a definition in order to limit the discussion of it.
I purposely created a fictional scenario in order to exempt bias, but if you do not think it is theoretically possible, I suggest you pay more attention. They are recording brain waves as we speak in order to make prosthetics as transparent as possible. If they can relay and replay those signals to prosthetics, it isn't unimaginable that it could be done for the real thing. And yes, science fiction has already don
Re: (Score:2)
A cyborg is a cyborg. You do not get to make up a definition in order to limit the discussion of it.
Where legal definitions are concerned, neither do you. And it still doesn't matter. Current law covers it without even stretching.
I purposely created a fictional scenario in order to
Change the subject, and not answer the real point: current law covers implanted technology in one of two ways, and does so quite thoroughly.
exempt bias, but if you do not think it is theoretically possible, I suggest you pay more attention. They are recording brain waves as we speak in order to make prosthetics as transparent as possible. If they can relay and replay those signals to prosthetics, it isn't unimaginable that it could be done for the real thing. And yes, science fiction has already done it.
Interpreting the equivalent of a mouse signal and replaying memories are not even qualitatively the same thing, and we have already proven, quite conclusively, how inaccurate memory can be, even of one's own actions. The chances of such a sys
A camcorder is a camcorder, even up your bum (Score:4, Insightful)
Re: (Score:2)
Whether the implant records a person's thoughts is not material. A diary is not protected under the 5th even though it is a recording of private thoughts, so why should an implant that records thoughts be treated differently?
I think rgmoore has a point: the key is not that a device records thought, but when device data becomes indistinguishable from private thought. If a person had brain damage and there was an implant that would effectively take the place of that damaged part of the brain, say memory,
Re: (Score:2)
They can get the information only if they know that it contains evidence. Merely suspecting it of containing evidence is not enough to compel disclosure.
Re: (Score:2)
The same arguments were (and should have been, to get the issues resolved) made about forced blood tests for DUIs, for forced collection of DNA evidence, for forced collection of fingerprints.
In all cases, when a minimum standard of probable cause is met, warrants will be issued and forced collection allowed.
Feel free to explain how this is any different.
Either it's a device, and subject to the same rules as a cell phone, or it's part of the person, and subject to the same rules as biometric evidence. There
Re: (Score:2)
Re: (Score:2)
Nothing new there, either. Arguments over whether black people were human go back centuries, for instance. Some still argue over it today. You kind of remind me of them.
Sadly they won't. (Score:2)
I would think this would be just an extension of the idea of self-incrimination. Yes, it's a 'cyborg' and not a robot, so conceivably the 'human' part of the combination was in charge of the volition that led to whatever is being investigated.
However: If I commit a crime with a tape recorder in my pocket, should the state be able to subpoena me for the tape? They would. Similarly, cyborgs could expect the same treatment. (forcible extraction of whatever data was requested.)
Re: (Score:2)
Do you have a legal right to withhold your DNA? (Score:2)
Your DNA is part of you, as are your fingerprints, and may carry evidence against you. The 5th Amendment protection against self-incrimination does not extend to refusing to give your DNA or fingerprints. You do have the right to refuse to give them voluntarily, but if there is probable cause, the police can obtain a warrant and force you to provide samples. This is actually exactly the same standard as with other items you might possess... your home, your papers, your cellphone, etc.
I think other forms of
Sure (Score:2)
Re: (Score:2)
I like how you worked gay marriage in there, but you are wrong. Gays always had the same rights to marry that everyone else had: to marry someone of legal age, of the opposite sex, who was not closely related to them. In other words, they have always been human. Interracial marriage wasn't about not being human either. It was about genetics and its supposed "grade." Look into eugenics to find more, but it was the same line of thinking as the Nazis and the Aryan race.
No it could be possible that a cyborg is not consider
Re: (Score:2)
Re: (Score:2)
That difference changes nothing about what i said.
This is no different. (Score:3)
You need a warrant to search external electronics that belong to people. You should also need a warrant to search internal electronics that belong to people. There are no new legal questions created by putting electronics inside people rather than simply keeping them detached.
You can't just shove your iPhone up your ass and claim to be a cyborg to evade a search warrant. By the same token, the police can't use the fact that your iPhone is up your ass to call you a cyborg and search it without getting a warrant.
Re: (Score:2)
... There are no new legal questions created by putting electronics inside people rather than simply keeping them detached.
Maybe, maybe not. Let's say that you have some sort of future pacemaker or other medical device implanted that you need to stay alive. For whatever reason this device as part of its normal function also happens to have historical location information in it. Perhaps the device optimizes or alters its operation depending on your altitude or location. This device would be a part of you and having it wouldn't really be a choice. Would forcefully extracting information from such a device be any different than co
Re: (Score:2)
Why does it matter if the device is physically inside you or necessary to live? Why is a futuristic pacemaker any different than a cell phone? I would argue that a modern cell phone is more a part of a person than this hypothetical futuristic pacemaker, despite being outside the body. The cell phone in addition to storing location information also has all your emails, text conversations, search histories, voicemails, facebook stuff, etc.
Would forcefully extracting information from such a device be any different than compelling a person to testify against their will?
Should fingerprinting someone or taking a DNA sample be considered f
Re: (Score:2)
Why does it matter if the device is physically inside you or necessary to live? Why is a futuristic pacemaker any different than a cell phone?
It is about choice. In my opinion, it is different because such a device would not be carried by choice, nor would it have data that you voluntarily placed on it. A cell phone or other computer you carry by choice. Data you put on your cell phone (pictures, email, GPS tracks, etc.), you put on by choice. With a pacemaker (or other medically necessary device), you really don't have a choice about having it with you (unless you choose to die). Operational data that such a medical device might gather, you don't have an
Re: (Score:2)
Are you suggesting that said *pacemaker* is storing location information without any method to nondestructively access it? If so, I call bullshit. If not, the cops need only use the same interface to extract the information without killing you.
Re: (Score:2)
Are you suggesting that said *pacemaker* is storing location information without any method to nondestructively access it? If so, I call bullshit. If not, the cops need only use the same interface to extract the information without killing you.
I am not talking about the technical ability to extract data from the fictional future device; I am talking about the legality. My point is that if some future medically necessary device did for some reason store historical location information, such data should be covered by the same laws that protect a person from self-incrimination. If I don't have to tell the cops where I was last Thursday, a medically necessary device that I can't live without and whose data collection I can't control, sho
Legal precedents (Score:2)
To decide this, we need to look at the history of the 5th Amendment and how the courts have interpreted it. I'm not a lawyer, but I think it's pretty clear that cyborgs' personal data will be covered.
According to Wikipedia's article on the 5th Amendment [wikipedia.org], courts have been pretty expansive. You can't even be required to turn over the password to an encrypted hard drive if it would incriminate you.
If I understand the history, the 5th Amendment was partly a backlash over the horribly unfair "Star Chamber" leg
In short: no. (Score:2)
Who cares (Score:2)
We aren't in control of our data or devices anyway. If anything has been shown in the past, it is that everything we do with our shiny new devices is phoned home to HQ for further analysis. There is no way to be self-sustained; it could leak trade secrets. And the users don't care, so lure them with a bit of convenience and they are all yours. There's no need to get data from inside a suspect; it's already enough to just ask Google what he has asked Google. Google may not be in direct contact with our nerves, but if we include
And what privacy rights would those be? (Score:3)
Seriously; in light of all the violations of our "privacy" by the government, what "rights" can we humans be said to retain?
Viewed in that light, however, the answer is probably a depressing "Yes".
1% and the GOP will use this to jail anyone who (Score:2)
The 1% and the GOP will use this to jail anyone who tries to gum up the works as their jobs are taken away. But look at the upside: the jail / prison must give you health care.
Re: (Score:2)
We have privacy rights? (Score:3)
I think this is a joke, because we really don't have privacy rights. The NSA doesn't think so; most governments don't think so.
So what privacy rights are we talking about again?
Identity (Score:2)
If you folks look at it from my perspective: where I live, the police photograph the tats of the gang guys and gals, so why not document a cyborg?
Once a person, always a person (Score:2)
don't over think it!
Re: (Score:2)
Treat it like clothing? (Score:2)
If I commit a crime and my shoes or other clothing contain evidence of my criminal act, is the clothing legally treated as if it's "part of me" or as if it's not?
Generally not. Think about all the crime dramas where dirt that is only found at the crime scene is found in the suspect's shoes, or where the dye from the exploding dye-pack was found on the suspect's clothing.
Much more likely to be a legal issue is the issue of how invasive the legal system can be to retrieve the evidence. A few years ago there
Yes (Score:2)
Battlestar Galactica (Score:4, Funny)
I've watched enough Battlestar Galactica to know the importance of treating cyborgs well. There is a cycle that keeps repeating: humans (or some other life form) create an artificial sentient life form but treat it badly, like a slave race. The artificial life form rebels and begins to conquer its creators, but the artificial life cannot reproduce. That leads to some kind of joining between a faction of the artificial life and its creators for reproduction. The group of hybrids grows and prospers but forgets its origins and creates new artificial life. Repeat.
Re: (Score:2)
I bet the Chimpanzees wish they hadn't invented humans now.
Its quite simple, really. (Score:2)
If it has enough brainpower to hire a good lawyer, it will most likely be considered human, no matter if it's a cyborg, a robot, or a bipedal fox created by the wrong kind of scientists...
Re:Humans have too much (Score:5, Insightful)
Re: (Score:2)
Hmm, well, I actually came here to make some sort of comment like that... are there no advocates for full transparency?
Increasingly we're living in a world where everything is recorded. Back in the old days you just had to tell everyone that "God is watching" to make them behave. It kinda worked (the Renaissance was pretty much started because bankers were trying to buy their way out of Hell by commissioning works of art for the church). These days with so much privacy, there's not really any incentive
Re:Humans have too much (Score:5, Insightful)
Assuming that it was impossible to have *any* privacy, you would immediately see widespread persecution of anyone who didn't fit the "norm". Shortly afterward, anyone with any intelligence would cease any public activities which did not meet general approval and start looking for ways to engage in them so that only other people with those hobbies would know about it - in effect, clamoring to restore the lost privacy.
In short, a life without privacy is one where you must live according to how everyone else wants you to live, rather than living how *you* want to live. It is a prison without bars.
Re: (Score:2)
Oh, well, that's an easy problem to solve, we can just simply round up all the persecutors and... OH SNAP, NOW YOU'VE GOT ME DOING IT!
But really, at what point can we just ignore the busybodies, come skulking out of the closet, be who we want to be, and not give a fuck about what other people think, because they don't have the power to do anything about it?
As far as living by everyone else's rules go, I probably have a good deal of privacy, but I still do it anyways. I don't veg out on video game
Re: (Score:3)
But they do have the power to do something about it.
For a good example, I loathe the Catholic Church. I think they are an outdated organisation that does far more harm than good, that their views on contraception are getting people killed, that their homophobia and misogyny are archaic and disgusting, and that, while they proclaim themselves a great charitable organisation, the vast wealth they flaunt at every chance tells another story. The cover-ups for pedophiles are just the icing on the evil cake.
M
Re: (Score:3)
Good point... I don't think that kind of thing will be much of an issue, though, because corporations like to save money by hiring the lowest salary staff from the largest pool of potential employees as possible.
As much as I'd like to believe that workplace diversity policies were implemented purely for progressive civil rights reasons (and I do applaud some of the brilliant and talented HR reps that can make everyone and themselves believe it!) it's obviously in their interests to "overlook" a lot of stuff
Re: (Score:3)
Very idealistic. We could do with more transparency. More than that, we could do with more equality of transparency. The rich get to hide their mistakes behind the corporate veil. Those of us who aren't executives of corporations have more limited options.
However, until the law is perfect, justice is truly fair, and our peers are totally enlightened about freedom of thought, speech, and so forth, all of which may be never, privacy is important. Is there anyone who hasn't had things to hide from our ow
Re: (Score:2)
Yeah, very idealistic, probably too much for homo sapiens, but maybe an advanced race of cybernetic organisms could handle it. Or perhaps they won't really have a choice since their black boxes could be subpoenaed.
Heh, as a parent, I would feel like a failure if there was something that my kids wouldn't feel comfortable confiding with me. But they're not yet teenagers, so we'll see. I suppose my own youth may have been atypical... brought home my first porn stash when I was 7 or something (someone left
Re: (Score:2)
Total transparency is not healthy for human psychology and social behavior. Just look at what the spying programs have done to create the current delusional group psychosis that is washington DC culture. Do you really want to be at the mercy of a bureaucratically enforced morality for every little decision you make? If everyone knew everyone else's business, you can be sure that getting permission to do anything noteworthy would be next to impossible. You can kiss individuality goodbye.
I would never, ev
Re: (Score:2)
Yeah, I think that it's possible (and that it's actually already happening) that society will become more accepting (and supporting!) of diversity. A healthy ecosystem is a diverse one, and is able to use and take advantage of the individual strengths of each of its members.
zero privacy = full control (Score:4, Insightful)
this question reeks of absent-minded techie "disruptive innovation"
so zero privacy rights...everyone can look at everything? have you thought this through?
so the password to the safe where I keep my guns...that's open for everyone?
does this "full transparency" apply only to digital information? if so, people would just do the things they want on paper, like before there was ever digital technology of any kind...so it seems that your "full transparency" must include the non-digital...which means at any time, my personal effects can be looked at by any person?
what about my business plans? do those get to be secret or does "full transparency" apply to those too?
"full transparency" is a totalitarian dream...so the answer is, if you lose your right to privacy, all the others follow...
can we end this line of questioning forever? privacy rights are a fundamental thing...no need for any techie "disruptive" "innovation"
Re: (Score:2)
I'm just saying if we can embrace the positive parts of full transparency, that will be better than the fallacy of believing we can successfully safeguard our privacy.
Unless you live as a hermit in the middle of the Yukon, I don't really see how you might expect to have fully guaranteed privacy rights while living in society. Someone's going to gossip about you. Might be more effective to limit the damage they can do with whatever information they manage to glean by flying their X-Ray UAV over your house,
Re: (Score:2)
I'm just saying if we can embrace the positive parts of full transparency, that will be better than the fallacy of believing we can successfully safeguard our privacy.
You do realize that Roe v. Wade [wikipedia.org] is based on privacy rights, don't you? If you get rid of privacy, including (or especially) medical privacy then you undermine the foundation of a woman's right to choose.
the Spanish Inquisition (Score:2)
the persecution of scientists
the enforcement of taboos
the "war on drugs" and other states of mind
repression of political opposition to a regime in power
all live by stripping privacy
that's why
Re:Humans have too much (Score:4, Interesting)
Are you 'tarded or something? Tracking an ACoward can be much harder than an actual username. Logged-in users with a long posting history leak all kinds of information about who they are, information that can possibly be traced back to them without an IP address. At worst, both just leave an IP, which, if measures are taken such as proxies or hacked machines, can be near impossible to track.
Re: (Score:3)
Re: (Score:2)
It's all part of the secular cyborgist robosexual agenda.
Re:Humans have too much (Score:4, Funny)
She's eight foot two, solid blue
Five transistors in each shoe
Has anybody seen my gal?
Re: (Score:2)
It's like the databases of "anonymized" information. Gather enough information, and eventually you'll have enough data points to uniquely identify an individual. That's
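The parent's point about "anonymized" databases can be illustrated with a toy sketch: two datasets that each look harmless can be joined on quasi-identifiers (ZIP code, birth date, sex) to single out an individual. This mirrors the classic linkage-attack idea; all records and names below are invented for the example.

```python
# Toy re-identification by linking on quasi-identifiers.
# "Anonymized" medical records: names removed, but ZIP/DOB/sex retained.
medical = [
    {"zip": "02138", "dob": "1945-07-21", "sex": "F", "diagnosis": "flu"},
    {"zip": "02139", "dob": "1982-03-02", "sex": "M", "diagnosis": "asthma"},
]
# Public voter roll: names, but no medical data.
voter = [
    {"name": "Alice Example", "zip": "02138", "dob": "1945-07-21", "sex": "F"},
    {"name": "Bob Example",   "zip": "02140", "dob": "1990-11-30", "sex": "M"},
]

def reidentify(medical, voter):
    # Join the two datasets on the (zip, dob, sex) quasi-identifier tuple.
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    names = {key(v): v["name"] for v in voter}
    return [(names[key(m)], m["diagnosis"]) for m in medical if key(m) in names]

print(reidentify(medical, voter))  # -> [('Alice Example', 'flu')]
```

With enough such data points, the intersection of "harmless" attributes shrinks to one person, which is exactly the parent's point.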
Re: (Score:2, Informative)
Re: (Score:2)
For example, you follow grammatical rules, sentence structure and syntax. You obviously are educated, and as such have followed "those" rules. You are posting on an internet site, and as such one can infer that you have access to a computer, electricity, and internet service, so you are a member of a first world society...
So, which rules exactly do you feel are "unjust", and thereby not appropriate for you to follow?
Of note, that
Re: (Score:2)
Humans have too many privacy rights as it is. Groups like ISIS happen because we're more concerned about doing all of our daily tasks in secret rather than being safe.
I think the previous poster already answered your question about what rules he thinks are unjust.
Re: (Score:2)
None of those are rules.
Re: (Score:3)
--Robert A. Heinlein, The Moon Is A Harsh Mistress
Re: (Score:2)
Re: (Score:2)
"Humans have too many privacy rights as it is. Groups like ISIS happen because we're more concerned about doing all of our daily tasks in secret rather than being safe." - by Anonymous Coward
Of course, if I had said something as stupid as that, I would want to maintain my anonymity/privacy as well.
Re:Humans have too much (Score:4, Interesting)
The United States is a bit of an aberration and we would do well to remember that. At our founding we were sparsely populated, had few neighbors who themselves were sparsely populated, and were facing large amounts of untamed wilderness. Our concept of manifest destiny effectively meant that if you wanted a say in affairs greater than your own, all you had to do was move west and set up your own place to govern, and if you look at the religious migrations that occurred, and the movement of immigrants that came through America's east-coast cities and kept traveling inland you can see how that played out.
Even so, we had our share of internal violence, the worst of it being the Civil War in the 1860s. If you look at the propaganda from that war, The Battle Hymn of the Republic calls on men to fight for natural rights as a Godly cause; religion played a role in many of our decisions as a nation. Now I couldn't rightly say what Union or Confederate troops did to the civilian population beyond what we know about (i.e., the burning of Atlanta) because I'm no historian, but given human nature I wouldn't be surprised if the lack of atrocities is simply a matter of documentation and no desire to show them off, versus them not occurring.
Back to my original point, our country's creation and history were uniquely shaped by our geography, lack of population density, and the various mindsets of those that immigrated here and those that resettled. Our modern form of democratic republic reflects how disparate and diverse the perspectives and opinions are, and that abstraction layer in the form of elected representation is often overlooked in terms of how we feel versus how we actually govern; our most extreme citizens generally aren't represented in government. We're successful, but we still have to pay attention to our fringe element, and fortunately that fringe element is fairly small.
We can't expect other countries to have the same circumstances as we do. Our kicking over the anthill that was Iraq was a huge mistake, and while Saddam Hussein was not our friend, history has shown him to be the lesser of evils in the short term. He oppressed his people, and he killed those that sought to overthrow him, but he didn't kill those that simply believed in the same god but worshiped that god in a slightly different way. He couldn't afford to let religious extremism come out into the open because it was a threat to him, so he kept stomping it down. Don't get me wrong, he was a bad person, but not nearly so bad as what's spawned in his wake.
We need to remember the lessons of Iraq, and to not go around kicking over other dictators just because we don't like dictators. Take that cork out of the bottle and the whole thing explodes.
Re: (Score:2)
A cyborg is a human, silly.
It would have to be determined on an individual basis, but a general rule might be that any implanted object is covered under the rights of the person holding it. Something like this would protect an implanted recording chip's data the same way a person's cell phone is. But if the person is braindead and a computer or AI is making them function, then it can be treated as the cell phone or whatever of a person with reduced mental capacity. Some of them will be deemed incompetent and a
Re: (Score:2)
Yes, there are no gray areas at all. If it's implanted inside you then it's part of you, and if it's separate then it's not. Oh, wait. A diabetes monitor has an implanted sensor and an external battery pack, so which is it? Can I search the data on it to find out where you've been or what you've been doing, or not? Does it matter whether its microchip is inside or outside?
Re: (Score:2)
Re: (Score:2)
Also, it would suck if they decided to surgically remove your prosthesis and keep it in their evidence locker for a couple years until the trial proves you innocent, and then hopefully give it back. Things attached to you might be protected as your stuff, but I think they should be protected a little more, at least in terms of what evidence they need before they take them and how long they can keep them.
Re: (Score:2)
"Sorry, my dear Intelligent Kneecap. We've had good times together, but now you know aye too much!"
*BLAM*BLAM*BLAM*
[Drags rest of self away from crime scene.]