Advances in technology may open the door to uploading — transferring a human’s intellect to a computer. This is the technology currently being pushed by the Avatar Project, which I wrote about on Estate Dispatch recently. Uploads could theoretically take possession of robotic bodies, control holographic projections in the real world, or interact with virtual environments. (If spending the rest of your life in Second Life version 27 sounds awesome to you, you are probably really excited about this.)
Nick Bostrom sums up the relevant philosophical conclusions (as of 2003, anyway) in § 2.6 of his Transhumanist FAQ:
Many philosophers who have studied the problem think that at least under some conditions, an upload of your brain would be you. A widely accepted position is that you survive so long as certain information patterns are conserved, such as your memories, values, attitudes, and emotional dispositions, and so long as there is causal continuity so that earlier states of yourself help determine later stages of yourself. Views differ on the relative importance of these two criteria, but they can both be satisfied in the case of uploading. For the continuation of personhood, on this view, it matters little whether you are implemented on a silicon chip inside a computer or in that gray, cheesy lump inside your skull, assuming both implementations are conscious.
Of course, philosophy is not law. Would courts recognize uploads as human beings? Honestly, that’s not the interesting question. The interesting question is how we deal with copies.
Assuming we gain the ability to transfer a human intellect to a computer, we may have two copies immediately: the physical human (if he or she survives the uploading process) and the upload. But it would be foolish to transfer your intellect to a computer without reliable backup. Copying an upload should be no more difficult than backing up your family photos to an external hard drive.1
If more than one copy of an upload is conscious (allowed to access memory and processing power), I assume they would be like identical twins: similarly structured, separate intellects. In other words, although the copies may be identical at the moment of copying, they will immediately begin to acquire different experiences, think different thoughts, and probably make different decisions, all of which will make each copy a different “person.”
So, could an upload vote? What if the source human still lived? If there are other conscious copies of the upload, could they all vote?2
You see the problems.
For starters, would any upload be entitled to vote in the first place? Only U.S. citizens have the right to vote in U.S. elections, and there are two ways to become a citizen: birthright and naturalization.
It certainly might be possible for an upload to be naturalized, but could uploads inherit the citizenship of the human from whom they are uploaded (or does citizenship simply transfer at the moment the upload takes place)? In the case of destructive uploading (when the human does not survive), this could make sense, because there was one voter before the upload and one voter after. But in the case of non-destructive uploading, there could be many copies. In other words, there may be 2 or 11 or 57 copies of you floating around the ether. Some might even be Republican.
So if all uploads were citizens (and therefore entitled to vote) as long as they were uploaded from a U.S. citizen, a motivated political candidate might be able to replicate a swarm of like-minded (literally) voters. This seems like an extremely undesirable outcome.
But if uploads cannot vote individually, should they be allowed to vote together? Maybe uploads should have a “primary” with their copies to decide how the community of copies will vote.
Should uploads have to wait 18 years from the date of upload, even if the source of the upload was 18+ when the upload took place?
Another possibility entirely is that uploads could be treated as mere simulations of the original intellect, and therefore entitled to neither citizenship nor the right to vote. This seems like a very likely initial stance, but it also seems likely (to me, anyway) that uploads will eventually get the right to vote.
Which means that elections may look a whole lot different in 50 years than they do right now. We may have an electorate many times the size of the current one, and much of the electorate may be uploads who rarely — if ever — engage in the “real world.”
A very large one, that is. The human brain is estimated to have about 2.5 PB of storage capacity. Still, storage capacity is cheap and getting cheaper all the time. Backing up a few petabytes on CrashPlan may be trivial in a few years. ↩
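For a sense of scale, here is a back-of-the-envelope calculation of how long moving 2.5 PB over the network would take today. The 1 Gbps link speed is purely an illustrative assumption (a fast consumer connection), not a figure from any uploading proposal:

```python
# Rough estimate: time to transfer a ~2.5 PB "brain backup" over a
# hypothetical 1 Gbps connection. Both figures are assumptions for
# illustration only.
BRAIN_BYTES = 2.5 * 10**15        # ~2.5 PB, the estimate cited above
LINK_BITS_PER_SEC = 1 * 10**9     # assumed 1 gigabit per second

bytes_per_second = LINK_BITS_PER_SEC / 8
seconds = BRAIN_BYTES / bytes_per_second
days = seconds / (60 * 60 * 24)

print(f"{days:.0f} days of continuous transfer")  # ~231 days
```

At current consumer speeds, in other words, a single backup would run for the better part of a year, which is why the footnote hedges with “in a few years.”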
Since uploads can be copied and need not die, the “population” is likely to explode, leading to a massive increase in the electorate. ↩