Small. Silent. Deadly. The perfect assassin or security system for the budget conscious. Send a few after your enemy. Have a few lurking about in security areas. Make your enemies afraid. Why drop a bundle on a bug, when you can have a Tarantula?
-Adrek Robotics Mini-Cyberform Model A-2 “Tarantula” sales blurb, Chromebook Volume 3.
Remote controlled or autonomous mechanical assassins are a staple of science fiction. The first one I read about was the hunter-seeker in Frank Herbert’s Dune. This murder machine was guided to a target to kill them with a poison needle. This idea stuck with me and, when I was making Ramen noodle money writing for role-playing games, I came up with (and sold) the idea for three remote controlled killers produced by my evil, but entirely imaginary, company called Adrek Robotics. These included the spider-like Tarantula, the aptly named Centipede and the unpleasant Beetle. These killers were refined versions of machines I had deployed, much to the horror of my players, in various Traveller campaigns in the 1980s. To this day, one player carefully checks toilets before using them.
These machines, in my fictional worlds, work in a straightforward manner. They are relatively small robots armed with compact, but lethal and vicious, weapon systems such as poison-injecting needles. These machines can operate autonomously or, as the description in Chromebook Volume 3 notes, be remotely controlled by a human or an AI. Their small size allows them to infiltrate and kill or spy. Not surprisingly, clever ways were thought up to get them to their targets, ranging from mailing them in a shipment of parts to hiding them in baked goods (the murder muffin).
While, as far as I know, no real company is cranking out actual Tarantulas, the technology does exist to create a basic model of my beloved killer spider. As might be imagined, such little assassins raise some concerns.
Some concerns are practical in nature and relate to law enforcement, safety and military operations. Such little assassins would be easy to deploy against specific targets, or against random targets when used as weapons of terror. Imagine knowing that a killer machine could pop out of your cake or be waiting in your toilet, and that it could be difficult or impossible to trace. Presumably governments, criminals and terrorists would not include serial numbers or other identifying marks on their killers, unless they wanted to take credit.
Obviously enough, people can already easily kill each other. What such machines would change is that they would allow anonymous killing from a distance at very low cost. It is the anonymous and low-cost aspects that are the most worrisome regarding safety. After all, what often deters people from bad behavior is fear of being caught and punished. What also deters people is the cost of doing bad things. Using a terrorism example, sending people to the United States to engage in terrorism could be costly and risky. Putting some little assassins, perhaps equipped to distribute a highly infectious disease, in a shipping container would be cheap and without much risk to the terrorist.
There are also moral concerns. In general, the ethics of using little assassins to murder people is clear, as it falls under the established ethics of murder and assassination. That is, such killings are generally wrong. There are, of course, the stock moral arguments for assassination. Or, as some prefer to call it, targeted killing.
One moral argument in favor of states using little assassins is based on their potential for precision. At this time, the United States usually assassinates targets with missiles fired from drones. While this is morally superior to bombing an area, a little assassin would be even better. After all, a little assassin would kill only the target, thus avoiding collateral damage and collateral killing. Of course, there is still the broader ethical concern about states engaging in assassination. But this issue is distinct from the specific ethics of little assassins.
Somewhat oddly, the same argument can be advanced in favor of using little assassins in criminal activities. While such activities would (usually) still be wrong, a precise kill is morally preferable to, for example, firing bullets into a crowd to hit a target.
In addition to the ethics of using such machines, there is also the ethics of producing them. Drones can easily be modified for lethal purposes. For example, a hobby drone could have a homemade bomb attached. In such cases, the manufacturer would be no more morally culpable than a car manufacturer whose car was used to run someone over. And, of course, weaponized drones are already in production.
While civilians can buy weapons, it is hard to justify civilian sales of lethal drones. After all, they do not seem to be needed for legitimate self-defense, hunting or legitimate recreational activity. Although piloting a drone in a recreational dogfight would be fun. However, being a science fiction writer, I can easily imagine the NRA pushing hard against laws restricting the ownership of lethal drones. After all, the only thing that can stop a bad guy with a drone is a good guy with a drone. Or so it might be claimed.
Although I do dearly love my little assassins, I would prefer them to remain in the realm of fiction. However, if they are not already being deployed, it is but a matter of time. So, check your toilet. And your baked goods.

On what had been a pleasant morning run, I saw a man with a machete emerge from the woods. He yelled at me, then started sprinting in my direction. I felt an instant of fear, for I know the damage a machete can do to the human body. Then cold clarity took over, as it always does in times of danger. I have faith in my speed and endurance, but my speed failed me that day: the man caught up to me with shocking speed. I spun to face him, crazily hearing the line from One Piece that “scars on the back are a swordsman’s shame.” More rationally, I knew that death was almost certain if he was able to hack at my back.
The scene is a bakery in a small town in Indiana. Ralph and Sally, a married couple, run the Straight Bakery with the aid of the pretty young Ruth. Dr. Janet and her fiancé Andrea enter the shop, looking to buy a cake.
While the notion of punishing machines for misdeeds has received some attention in science fiction, it seems worthwhile to take a brief philosophical look at this matter. This is because the future, or so some rather smart people claim, will see the rise of intelligent machines: machines that do things that would be misdeeds or crimes if committed by a human.
Philosophers have long speculated about autonomy and agency, but the development of autonomous systems has made such speculation even more important. Keeping things simple, an autonomous system is capable of operating independently of direct human control. Autonomy comes in degrees of independence and complexity. It is the capacity for independent operation that distinguishes autonomous systems from those controlled externally.
In my previous essay I set the stage for discussing the concern about people switching competition categories to gain something. It is to this matter that I now turn.
Upon taking office, Joe Biden signed an executive order requiring that schools receiving federal funding allow people who self-identify as female onto female sports teams.
Three Confederate veterans, who fought against the United States of America, were nominated for admission to Florida’s Veterans’ Hall of Fame.