Science Fiction or Science Fact?
BY SHAWN WHITNEY
At its best, science fiction is a genre that uses technology to tell us something about the world we live in today. The best science fiction – and you can’t help but think of the great, and tragic, modern master Philip K. Dick – uses the allegorical power of technology to tell us about ourselves. That’s why Dick was not bound to a particular technological world, and neither was Heinlein. His stories didn’t depend upon the elaborate rules set by, for instance, Star Trek or Star Wars. The bizarre construction of Ubik (being adapted for the screen by Michel Gondry) has nothing in common with The Man In The High Castle (being adapted by Ridley Scott – which I found out after I contacted the agent who represents Dick’s estate to enquire about optioning it) except the power to explore our most pressing questions.
Nonetheless, for technology to work as allegory, we need to project far enough into the future that the technology doesn’t yet feasibly exist. To give a concrete example, Jules Verne wrote about Captain Nemo’s submarine fifty years before anything like it was possible. The submarine became a stand-in for a dangerous attempt to dominate nature. Shelley’s Frankenstein represented the same arrogance.
But we now live on the constant brink of science fiction, which creates new challenges for writers. To give one example, the 1981 film Outland, with Sean Connery, was set on an asteroid mining colony and was a remake of the western High Noon. Just a few weeks ago Planetary Resources, a company founded by Peter Diamandis (also the founder of the X Prize and Singularity University), announced its intention to establish asteroid mining by 2020 – less than forty years after the Connery movie. That is a technological leap much bigger than sealing up a boat to go under the water.
Space travel is perhaps the most challenging of these scientific advances, but consider the humbler smartphone. When I was in primary school in the 1970s, our first introduction to computers was with punch cards that we had to fill out, rather like lottery tickets, marking ones and zeroes to perform simple math formulas. You made the marks and got back the result a week later, usually wrong. Personal computers were just a twinkle in Steve Jobs’ eye. Now, as the futurist Ray Kurzweil endlessly repeats, less than forty years later I carry in my pocket a smartphone with more computing power than a supercomputer of that era and a user interface that requires no knowledge of programming. And, with the internet, I can search all of human history and culture. Do you remember when you’d make a bet with friends about some fact and it would take a week to settle it, if ever? Probably not, if you’re under thirty. I gave away my venerable hardcover dictionary just the other day; if I need a spelling or a definition, I just Google it. And since getting Netflix and iTunes I haven’t rented more than two or three DVDs a year – now I stream. It was only in the 1980s that VHS players and the Sony Walkman were seen as revolutionary.
Nor are these advances confined to computing. A revolution is also taking place in the biological sciences that is potentially as earth-shaking as the arrival of personal computing thirty-five years ago. Just look at the last ten years. The Human Genome Project spent billions of dollars over the course of a decade to sequence a single human genome, finishing around 2002. We are now within spitting distance of sequencing a human genome for under one thousand dollars, and doing it in a few hours; by 2020 it may cost as little as a standard blood test. The full impact of the genomics revolution won’t be realized for many years, but we are beginning to see some of it now. Databases have been established that compile the genetic variations behind various cancers and cross-reference them with their drug resistances and vulnerabilities. On the basis of this data – along with the genome of the patient – oncologists can work out the best course of chemotherapy, while immune therapy, viral therapy, gene therapy and others are already in testing phases. And there are human trials under way using gene therapy to correct the genetic errors that lead to diseases like muscular dystrophy.
Beyond biology, we see rapid advances in robotics with the arrival of advanced AI (Siri, anyone?), machine vision and more. Heartland Robotics, started by the scientist who launched iRobot, makers of the Roomba, is expected to announce this year a highly versatile, dexterous, low-cost industrial robot that will challenge the use of humans for repetitive, low-skilled labour. And robots are merging with medical care and industrial needs with the release of the Ekso exoskeleton in the USA and the HAL exoskeleton by Cyberdyne in Japan, which reads the electrical signals sent to the wearer’s muscles and tells the wearable robot to mimic limb movements, adding strength and endurance to the frail or to those who need it on the job. Single-limb prosthetics are also undergoing an epochal transformation, with technology similar to the exoskeletons and, in the near future, direct wireless communication with the brain. Brain-computer interfaces are an exploding area of research.
The list goes on in every field of high-tech research. We are witnessing the convergence of a series of quantitative technological advances – from materials science to bioinformatics to computing and beyond – that is producing a qualitative technological and cultural transformation. As user interfaces and software undergo their own concurrent advances, the miniaturization of technology and its increasingly user-friendly character (think 3D printing with design software as simple as Photoshop) are leading us to a world where our interaction with our tools – for that is what technology is – is being fundamentally reshaped.
Coming back full circle, what does all this mean for science fiction? There will certainly still be room for space operas and big-budget epics of the Prometheus variety. But it also opens up vistas at the lower end, as we become able to credibly explore the near-term possibilities of advanced technologies in “ordinary spaces” and, more importantly, use them the way genre devices have traditionally been used – as allegories for the human stories underneath. We can imagine genetic engineering taking place in a hacker’s garage, because some is happening already – witness the DIY Bio movement and the BioCurious lab co-op in the San Francisco Bay Area. Or stories that explore the possibility of ending aging (check out the research by the SENS Foundation) or regaining the use of disabled body parts. None of these necessarily requires a large budget to bring to the screen, thanks in part to advances in photo sensors and computing, particularly HDSLRs and even smartphone cameras that shoot HD. The revolution in real science is creating the potential for a revolution in the science fiction stories that can be brought to the screen. It’s also creating challenges for science fiction writers. I want to explore some of these things further through this blog and as we embark on our own microbudget science fiction film.
In what ways do you think that science fiction will be forced to change as we live through a scientific revolution?