
Science Behind the Fiction: The cyborgs of Alita are on the way

By Cassidy Ward

Originally published in 1990, Battle Angel Alita tells the story of a cyborg left in a trash heap without any memory and missing most of her body. Rescued and rebuilt by Daisuke Ido, Alita becomes a bounty hunter and a competitor in the blood sport Motorball.

Now the basis of this weekend's blockbuster Alita: Battle Angel, the story is set in the 26th century, in a world filled with cyborgs of varying degrees, class warfare, civilization in decline, and cities in the sky. While, on the surface, it's a flashy smash 'em up with enhanced humans, it raises some interesting questions about what it means to be a person when the lines that define us are blurred.

The relationship between us and our machines has become more intense and more intimate. Nearly every living human enjoys the benefits of enhanced abilities via technology. Cars allow us to travel at incredible speeds. Airplanes allow us to fly. We take rockets to space and communicate instantaneously with others all over the globe. We are a species engaged in a love affair with our inventions. And while most of this relationship has been increasingly aggressive flirting, a marriage of humans and machines is on the horizon. Mark your calendars and save the date, because full integration is on its way. In fact, some people have already made the jump.

Replacement Parts

In fiction, characters swap out their existing parts for artificial ones in an attempt to improve their physical form, and researchers from the Johns Hopkins Applied Physics Lab have taken this same philosophy and applied it to replacement prostheses. Utilizing more than $120 million in funds from DARPA, the team at Johns Hopkins developed the Modular Prosthetic Limb (MPL), a device which is controlled by the user's mind.

The MPL is outfitted with more than 100 sensors, which send sensory data back through the prosthesis to the user's brain. It allows for the sense of touch, something that has been lacking in prosthetics of the past.

The MPL is still in the testing phases, which are mostly happening in a laboratory setting, but Johns Hopkins has begun to send these devices into the real world to see what they're capable of in everyday settings. The results are promising and, if successful, represent a massive leap forward in the realm of machine integration.

While the team at Johns Hopkins is hard at work returning sensation to missing limbs, researchers in another field are working to return the sense of sight.

Blindness can be the result of any number of factors, ranging from degenerative disease to trauma, but it all amounts to the same thing: the organ responsible for transmitting external signals to your brain is no longer functioning. The internal systems, the brain's software, are still running but they've got no signal. It's sort of like turning on a video camera with the lens cap on; everything is working as it should be, but it can't receive an image. For a long time that was the end of the story. But scientists are working on some unusual solutions to this problem. Rather than fixing the body's hardware, they're circumventing it.

Devices like the Argus II work by bypassing the damaged parts of the eye. Utilizing an external camera, a small computer, and an electrode array implanted on the retina, the device captures images and converts them into electrical pulses that stimulate the retina's surviving cells, which relay the signal along the optic nerve to the vision centers of the brain.

What the Argus II delivers isn't vision that would be recognizable to a sighted person — it's more pixelated than that. But it does allow users to differentiate objects and sense contrast between objects. In some cases, participants in the Argus studies have even been able to read large text.
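One way to picture why that vision is so pixelated: the Argus II's implant has only 60 electrodes, arranged in a 6-by-10 grid, so every camera frame has to be crushed down to 60 brightness levels. The sketch below is purely illustrative — the function name, grid handling, and scaling are assumptions, not the device's actual processing:

```python
# Illustrative sketch of an image-to-electrode pipeline like the one a
# retinal prosthesis uses: a camera frame is reduced to a coarse grid of
# stimulation levels, one per electrode. Parameters here are made up for
# demonstration, except the 6x10 electrode count of the Argus II.

GRID_ROWS, GRID_COLS = 6, 10  # the Argus II array has 60 electrodes

def frame_to_stimulation(frame):
    """Downsample a grayscale frame (rows of 0-255 pixel values)
    into a 6x10 grid of average brightness per electrode."""
    h, w = len(frame), len(frame[0])
    cell_h, cell_w = h // GRID_ROWS, w // GRID_COLS
    grid = []
    for r in range(GRID_ROWS):
        row = []
        for c in range(GRID_COLS):
            block = [frame[y][x]
                     for y in range(r * cell_h, (r + 1) * cell_h)
                     for x in range(c * cell_w, (c + 1) * cell_w)]
            row.append(sum(block) // len(block))  # average brightness
        grid.append(row)
    return grid

# A 60x100 frame: bright on the left half, dark on the right.
frame = [[255] * 50 + [0] * 50 for _ in range(60)]
grid = frame_to_stimulation(frame)
```

Even a high-resolution camera image collapses to a few dozen bright-or-dark patches, which is why users can detect contrast and shapes but not fine detail.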

And the team behind the Argus II isn't the only player in the game. Right now, the roadways between external imaging tech and the vision centers of the brain are being paved, and the race is on to see who can improve resolution and user experience. Now that the proof of concept is in, all sorts of speculation is waiting in the wings. If we're able to take electrical data, translate it, and send it directly to the brain, why stop at natural human vision? What's to prevent us from expanding beyond the spectrum of visible light, or transmitting other types of information? Could you hook a person to the internet and have data displayed directly in their mind? We'll just have to wait and see.

Enhanced Humans

Some individuals, not content to let the careful process of science progress at its own pace, have taken the field of cybernetics into their own hands. Whether they'll be remembered as innovators or madmen remains to be seen, but they are, at the very least, interesting.

Neil Harbisson, a New York-based artist, was born with achromatopsia, a severe form of color blindness which he describes as a total lack of color perception. According to Harbisson, the world he was born into was rendered entirely in shades of gray.

To work around the gap in his perception, Harbisson had an antenna permanently attached to his skull that takes in visual data and translates it into vibrations. In essence, Harbisson hears color. At first, he said, he had to memorize the notes and their associated colors, but over time the association became automatic.
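The core idea — a fixed mapping from light to sound — can be imagined as a frequency transposition: halve the frequency of visible light about 40 times (40 octaves) and it lands in the audible range. The sketch below is an assumption-laden illustration of that idea, not Harbisson's actual sonochromatic scale:

```python
# Hypothetical sketch of "hearing color": transpose a light frequency
# down by 40 octaves so it falls in the range of human hearing.
# Harbisson's real device uses its own scale; this only shows the
# principle of a fixed, repeatable color-to-tone mapping.

OCTAVE_SHIFT = 40  # 2**40 ~ 1.1e12; brings ~5e14 Hz light near ~450 Hz sound

def light_to_tone(light_thz):
    """Map a light frequency in terahertz to an audible frequency in hertz."""
    return light_thz * 1e12 / 2 ** OCTAVE_SHIFT

# Red light sits around ~450 THz, violet around ~750 THz.
red_tone = light_to_tone(450)
violet_tone = light_to_tone(750)
```

Because the mapping is fixed, the same color always produces the same tone — which is what makes the kind of automatic association Harbisson describes possible.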

As a consequence of this relationship between color and sound, Harbisson states he senses an array of colors when listening to music and creates paintings which represent the feel of particular songs or voices.

Kevin Warwick has been on a mission to make himself a cyborg for more than two decades. In 1998 he had an RFID device implanted in his arm, which allowed him to control devices like doors and lights via proximity. But Warwick didn't stop there.

In 2002, a team of surgeons implanted an electrode array into Warwick's body with the intent of allowing him to interface directly with a computer. The operation was, by all accounts, successful, and the array, known as BrainGate, allowed Warwick to connect to the internet and control devices on the other side of the Atlantic. Warwick took it one step further when a similar device was implanted in his wife, allowing the two to send signals between their nervous systems by way of the internet.

There's something sweet about the notion of two individuals engaging in so intimate an experiment. There's something a little uncomfortable about it, too. It's difficult to tell if what Warwick is doing amounts to genius ahead of its time or flies in the face of respectable science. Maybe it's a bit of both.

One thing is certain: the interfacing of human biology and advanced machinery is here, and it will only become more commonplace and more sophisticated as time goes on.

Alita: Battle Angel opens this week.