Human microchipping has moved from science fiction to everyday reality faster than expected. People can now pay for products, unlock doors, and store or retrieve data with a chip, roughly the size of a grain of rice, implanted in their hands. Today, over 50,000 people worldwide have a microchip in their bodies, and that number is rising rapidly.
Although microchipping may seem like an idea reserved for the near future, it originated over two decades ago, in 1998, when engineer Kevin Warwick became the first human ever to get chipped. Now known as “Captain Cyborg,” Warwick was the first of many willing to take on the risks of getting chipped. Technology has advanced considerably in the decades since, and today it is easier than ever to receive a microchip.
There have been many debates about the ethics of human microchipping: is this really the beginning of international robotic domination? One 2021 study conducted by Yong Zeng suggests that although microchipping is now part of our reality, it is not ethical. One reason is data leakage, a potential risk of microchipping.
To function, microchips use either radio frequency identification (RFID) or near-field communication (NFC) technology, a system composed of two parts: the tag and the reader. These are the same technologies behind Apple Pay and contactless credit cards, and they can be hacked. A reader can scan the chip from a few inches away, so all someone needs to access another individual’s data is a similar reader at hand.
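The tag-and-reader exchange described above can be sketched as a toy model. This is purely illustrative (the class names are hypothetical, and real implanted tags follow standards such as ISO/IEC 14443, sometimes with cryptographic protections), but it shows the core worry: a simple passive tag answers any reader in range, legitimate or not.

```python
# Toy model of a passive RFID/NFC read. Illustrative only: real chips use
# standardized protocols, not this simplified logic.

class ImplantedTag:
    """A passive tag that replies with its stored data when energized."""
    def __init__(self, payload):
        self.payload = payload

    def respond(self):
        # A simple passive tag cannot verify WHO is asking; it answers
        # any reader that powers it up.
        return self.payload

class Reader:
    """A reader that queries any tag within its short range."""
    RANGE_CM = 10  # typical short-range read distance, a few inches

    def scan(self, tag, distance_cm):
        # Proximity is the only "security" in this toy model.
        if distance_cm <= self.RANGE_CM:
            return tag.respond()
        return None

tag = ImplantedTag(payload={"id": "A1B2", "door_code": "4421"})
legitimate = Reader()
attacker = Reader()  # any similar reader works; the tag cannot tell them apart

print(legitimate.scan(tag, distance_cm=3))  # the intended use
print(attacker.scan(tag, distance_cm=5))    # same data, no authentication
```

The design point of the sketch: because the tag authenticates nothing, the attacker’s reader receives exactly the same data as the legitimate one, which is why proximity reads are the risk the article describes.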
Furthermore, these microchips have not been fully proven to be safe for the human body. If the chip were to break down inside an individual, the impact it would have on the human nervous system is unknown.
This, in turn, poses another potential threat, one more dangerous than stolen data. Hackers could threaten an individual’s life by implanting viruses into human chips. Such a virus could cause the chip to malfunction, and those malfunctions might cause severe damage to the human body.
Physics teacher Mr. Cimilluca said, “I would definitely be cautious when it comes to chipping and programming myself.”
The microchip also raises concerns about fostering laziness. With so many technologies already shortening attention spans and causing people to lose touch with basic skills, the microchip might make things worse. Sophomore Eliav Sehati said, “I don’t think humans getting microchipped is a very good idea.” Sehati continued, “People might become more reliant on their chips and it will just make us lazier than we already are.”
Microchips, like all new inventions, are made to accomplish one main goal: to make society run faster and create a more convenient, less chaotic life. The introduction of microchips takes this goal one step further, dismantling the need for cash currency and files of printed or online data. Still, critics worry that such convenience can lead to a dependency on technology, and some studies suggest that heavy reliance on technology has reduced analytical thinking and cognitive skills.
Senior Ethan Dayani said, “Almost everyone nowadays carries around cellphones that can do everything that a microchip can.” He continued, “I personally would not get a microchip because any potential risks outweigh the slight advantage microchips may give.”
Still, there are many advantages that come with microchips. For example, the technology makes it much easier to communicate with other devices and people. This could mean having first responders at the site of an emergency within minutes, or providing real-time data to individuals and healthcare providers by monitoring vital signs and other health metrics.
Microchips can also control access to buildings or homes, eliminating the need for traditional keys or passwords. This is one reason microchipping has become increasingly popular among employees, with several companies in Sweden encouraging staff to get chipped in hopes of replacing keys and badges entirely.
While human microchipping offers advantages for convenience and healthcare, it also comes with drawbacks. Some suggest that wearable devices, such as a wristband carrying the same technology as a microchip, may serve as a safer alternative to implanted chips. Others counter that wristbands defeat the entire point of implanted microchips: their convenience.
Whether human microchips are truly ethical remains an open question, and the debates will continue for years to come. As humans gain more knowledge about all aspects of life, technology will only advance further, potentially fulfilling Stephen Hawking’s warning that artificial intelligence “could spell the end of the human race.”