Shad Callister

sci-fi and techno-thriller author


Nightblind crowdfunding campaign at 268% and counting

I’m very excited to see that the Kickstarter campaign for the sci-fi short film Nightblind is going extremely well. We were just hoping to raise $1200 to enter the film into festivals and boost our VFX budget, but now we’ve got over $3000 and have hit all our stretch goals. Very cool.

I cowrote the film, produced a background story, and acted as a long-distance producer of sorts. It was a fun project, and I’m very proud to see it gaining backers. Check it out at Kickstarter.


Google’s Self-Driving Car

I am so excited for everyone to start riding around in self-driving vehicles… while I still drive my own. It will eliminate the near-constant worry I have of getting hit by some idiot who shouldn’t be behind the wheel, a worry reinforced daily by close calls and near misses.

Seriously, autonomous and semi-autonomous vehicles will be a much-needed revolution in travel. Of course, the current versions only work in urban areas with nice weather and good roads, and of course they aren’t suited to some tasks. But for a whole lot of people in a whole lot of cities, these will change the way they live.

Imagine how much money and time this will free up. No gasoline bill, no car insurance, no oil changes, no car washes, no speeding tickets. Instead, just one small fee, probably billed monthly to your account, for the use of the cars you rode in.

I hope each of those car-related industries is thinking ahead about how to pivot its business model, most likely toward contracting with the self-driving car companies for these services and charging an arm and a leg to those who still need or want to drive their own. And I hope the rest of us are starting to think of productive ways to spend our extra time and money.

 


Sci-fi and thriller novels moving drone discussion forward

Thriller novels such as last year’s Drone and this month’s Sting of the Drone are moving the discussion around remote killing and lethal autonomy into the living room. This is a good thing. Science fiction has a long and proud history of bringing current and future ethical and technological dilemmas to the general populace in a meaningful way. Sometimes it takes more than a few senators in a closed-door meeting on Capitol Hill to arrive at decisions humanity can live with.


In an interview, Richard Clarke, author of Sting of the Drone, made a chilling and profound statement. Asked whether another country developing and using drone capabilities would finally force the U.S. to confront the underlying issues of drone warfare, Clarke says, “I think what will really get the debate going in this country is drones in this country.” And he points out that once our own privacy is invaded, we’ll be forced to figure it out. Already we’ve got farm drones, fire drones, search-and-rescue drones, and hobbyist/voyeur drones. We’re staring the future right in the face, or rather, it’s staring at us from the tiny camera guiding the drone hovering outside your window right now.

I’m glad, in a way, that we have drones as a first step to get us thinking about automated warfare. Because once we go beyond a fleet of drones with missiles and start fielding the rest of the crazy machines I’m sure we’ll someday produce, a single misstep could endanger entire populations overnight.


Human Rights Watch fears “Killer Robots”

Human Rights Watch just released a 50-page report warning of some of the dangers autonomous combat machines may pose (http://www.hrw.org/news/2012/11/19/ban-killer-robots-it-s-too-late). It was swiftly rebutted by a number of newspapers and sites pointing out where HRW gets the issue wrong. I’ll just say that yes, there are serious ethical, technological, and philosophical issues with machines fighting wars for us, and no, it’s not as simple as “killer robots are bad and should be banned!” I’m glad the discussion on this subject is heating up.

One interesting point that emerged from the discussion, which I’d like to echo: not only do we already have fully autonomous weapons in the field now, but we’ve had them for many, many years if you count land mines. These are machines that use a simple sensor to kill (nearly indiscriminately), and by extension I suppose that any booby trap used since the dawn of time fits the description. While these examples clearly aren’t the same as the computerized robots we’re talking about here, they might be useful in forming thought experiments to help us understand machine-warrior ethics.

Robots are capable of discriminating among targets very carefully. Does that make them an improvement over land mines, or even more deplorable?

Can we think of autonomous killing machines simply as highly advanced traps? If we compare them to the deadfall traps or spiked pits used since primitive times to kill animals and humans (and sometimes the wrong ones), it’s pretty clear that the man who dug the pit is completely responsible for what falls into it. No one would take seriously the trapper who, confronted with the body of an unintended victim of his deadfall trap, tried to claim “it’s the boulder’s fault!” Perhaps this tells us exactly where to place the responsibility for robotic homicides: with the human who most directly issued the command that led the robot to kill.

As for removing thinking, moral humans from their acts of war, robots are just another tick mark on the slider scale that began at Wooden Club and progressed through Spear, Arrow, Bullet, and Tactical Ballistic Missile. We’ve been distancing ourselves from the act of killing for a long time now. Part of that distancing was the creation of a warrior class, the delegation of fighting to a man who kills so that the rest of his clan doesn’t have to. It seems we have a new class now: the warrior machine.

 
