Should Humans Get an Identity Chip Implanted Under Their Skin?

Imagine a chaotic scene at some date in the foreseeable future. The sky is darkened by thick, looming clouds and sheets of rain as busloads of disheveled citizens are dropped off and quickly herded into orderly lines. The lines lead to the sliding double doors of a 25-story building with a makeshift sign out front that reads “Chipping Center”. Hordes of people of every nationality stand single file for hours, waiting to be “chipped”. All because the collective governments of the world have mandated that every person receive an identity chip implant immediately, “for the betterment of society”, or so they claim.

This dystopian future has been depicted in Sci-Fi movies and TV dramas since the 1950s. One early example was 1962’s “The Manchurian Candidate”, a movie that did much to convince a whole new generation that the government would eventually use technology to monitor its citizens. And with every new advance in technology, this frightening prospect has come to seem less like an entertaining movie trope and more like an imminent reality.

Should we fear this technology, or should we welcome it? What about identity chips in children as a safety precaution against kidnapping? Or how about a chip that will identify who has gotten the COVID-19 vaccine and who has not? Or one that can identify who has the COVID-19 virus? Perhaps we all should be looking forward to being “chipped” since so many sources believe it is inevitable… But is it? Will chip implants ever become mandatory? What are the facts on this issue?

While headlines and viral commentary are eye-catching, there is very little evidence that every person in the world will be forced to get an identity chip implanted under their skin. However, the technology is in fact growing in popularity with each passing decade. Perhaps everyone should at the very least consider it. Here are a few important facts about the technology.

5 important facts about identity chip implants

  1. Identity chips are not “new” technology.

Identity chips operate using Radio Frequency Identification (RFID) technology. RFID uses low-frequency radio waves to collect information from a tag, or “host”, and relay that information back to a system that can read it. RFID has been in commercial use since the early 1970s, and you likely encounter it every day: it powers contactless payment cards, retail anti-theft tags, pet microchips, and automatic bridge tolls, to name a few uses. The fact that RFID is 50-year-old technology should lessen the anxiety that it would suddenly be used for malicious or deceptive purposes in the 21st century.

  2. Identity chips under the skin are already in use.

These chips have already been implanted under the skin of willing participants in different parts of the world. In Sweden, for example, identity chips were implanted in the hands of roughly 4,000 volunteers, and the program was widely considered a success. Participants reported quick access to computers and to automated locks in homes, gyms, offices, and other areas where authorization is required. Identity chips are also already used to store pertinent medical records and banking information for those who have been “chipped”. Participants are largely satisfied with the added convenience the chips provide. And who wouldn’t want to open their front door with a wave of the hand?

  3. Identity chips cannot track your movements.

One might suspect that a tiny chip the size of a grain of rice in the hand would be the ideal way to track a person’s movements. But this is not so. While identity chips can store a fair amount of data, they have no GPS capability. Their low-frequency signal can travel only a short distance to a receiver, and the saltwater content of the human body reduces the potential range even further. The most efficient way to read data from an identity chip is to bring the reader nearly into contact with the surface of the skin. As a result, identity chips are a poor choice for tracking a person’s movements, since the tracking receiver would have to remain inappropriately close to the host.

  4. Identity chips cannot be “unknowingly” implanted under the skin.

The idea that a government agency might covertly implant identity chips into every citizen makes for good Sci-Fi, but it would not be very practical. For one thing, implantation requires a small yet invasive procedure that is difficult to conceal. And because of the chips’ limited transmission range, every citizen would at some point have to come into near-physical contact with a receiver for the data on the chip to be read. To date, there are no reported instances of involuntary identity chip implantation. And a number of U.S. states have made it illegal to require an identity chip implant, so mandating one in those jurisdictions would be a crime.

  5. Identity chips are not the largest security risk to our personal data.

We all knowingly and voluntarily carry around devices that pose a far greater risk to our personal data than an identity chip under the skin ever could. Even the most basic smartphone has GPS tracking and enough storage to hold vast amounts of our most personal data. Smartphones also record our online activity, which, in a market-driven world, is highly sought-after information. Plus, every smartphone contains an identity chip of its own, along with other microchips and an array of technology from a multitude of manufacturers. If a government really wanted to record where every citizen goes and what they do, it could violate our privacy far more easily through our smartphones than through our identity chips. Lastly, a smartphone can be stolen relatively easily, while an identity chip cannot.
