Category Archives: Science Fiction Roundup
I decided to take a break from the heavy stuff this week and point you all to some interesting articles in a much-overdue Science Fiction Roundup.
I read dozens of tech-related blogs and websites, often finding inspiration for science fiction stories. Every now and then, I find some great articles to share, and round ’em up for your convenience.
This week’s post is all about our favorite future overlords and their current progress toward global domination (with a little help from us humans, of course).
Crushing Things With Your Robot Claws is Now a More Satisfying Experience
Here’s something us humans can benefit from: tactile response in prosthetics. We take for granted the ability to feel pressure and texture through the skin, an incredibly intricate and amazingly complex mechanism. For people who need prosthetic limbs, tactile response is sorely missing, making for a constant reminder of the “substitute” nature of the device. First, you need a sensor that can detect texture and pressure (check). But after you have a way of generating data that mimics the human sense of touch, you need a way to deliver that data to the human brain.
Researchers at the University of Chicago have managed to get close to that goal with monkeys, which is a step closer to development for humans. They trained the monkeys to respond to specific touch sensations, then stimulated the monkeys’ brains in a way that elicited the same response. This helped the researchers map stimulation points and define specific voltages to mimic the desired sensations, which will be necessary to make sure the touch sensations from the prosthesis translate to the brain properly.
Check out the article at 33rd Square for more information.
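If you’re curious what that calibration step might look like in the abstract, here’s a tiny Python sketch of the principle. To be clear, this is my own toy illustration, not the researchers’ method; the sites, voltages, and sensation labels are all invented.

```python
# Toy model of sensory calibration: during "training," each touch
# sensation gets paired with the stimulation parameters that elicited
# the same behavioral response. Afterward, a prosthetic sensor reading
# can be translated into a stimulation command by simple lookup.
# (All sites and voltages below are made up for illustration.)

calibration = {
    ("pressure", "light"): ("site_A", 0.5),
    ("pressure", "firm"):  ("site_A", 1.2),
    ("texture", "rough"):  ("site_B", 0.8),
}

def stimulus_for(sensation, intensity):
    """Translate a sensor reading into a stimulation command."""
    site, volts = calibration[(sensation, intensity)]
    return {"site": site, "volts": volts}

print(stimulus_for("pressure", "firm"))
```

The real problem, of course, is building that table in the first place, which is exactly what the monkey experiments are for.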
Boston Dynamics Brings More Options to Skynet’s Table
I’ve mentioned several of Boston Dynamics’ projects before, especially their very impressive PETMAN humanoid terminator testing platform. Recently, their zippy little cheetah-like WildCat platform has graduated from the lab to cord-less operation. The robot can gallop at up to 15 miles per hour, as you can see in the video. While it might seem a little awkward (it faceplants in part of the video, if it had a face, that is), this is an incredible development for rugged robot mobility. I can’t quite work up the nightmare-inducing terror over this one, but I’m sure it will prove to be a valuable tool in our eventual extinction.
PETMAN’s successor ATLAS is also in development, expanding on PETMAN’s unsettlingly human movements with a bipedal balance system and a less terrifying form. It currently lacks a head, but it hardly looks threatening at all as it gingerly tip-toes through the boulders, does it?
Too Cute to Be Threatening… Right?
Just to show some contrast, here’s a new dancing owl robot. Because kids want those, I guess? ixi-Play is a new interactive robot designed by Witty Worx to play educational games with children using its 720p camera, multiple touch sensors, and internal dual-core processor. I do question how many parents will see this as yet another excuse to leave their young children unsupervised for extended periods of time, but look at how much fun this kid’s having:
Check the article on 33rd Square for more information.
That’s it for this week! Next week I’ll wrap up my discussion of William Gibson’s social criticism. Until then, what do you think about these robots? Cool? Terrifying? Let me know in the comments below!
Hello everyone! Last week I started examining the historical context of Cyberpunk within the history of Science Fiction, but I decided to put that on hold until next week while I reexamine my direction for the series. It’s been a month since my last round of posts, and there have been loads of articles that I have wanted to talk about, so I was going to do a Round-up at some point anyway. Might as well do it now! If you’re not familiar with these posts, my rule for them is this: find articles about science and technology that would make a cool or interesting story idea. Enjoy!
The Navy Really Does Get All the Coolest Toys
If you haven’t been following my Roundups for a while, you might assume that Naval technology has basically stayed the same since the World Wars, other than the addition of more nuclear subs and sophisticated jet fighters. As cool as cannons the size of trees are, the Navy has been trying to find effective replacements for their outdated systems and gradually retrofit their ships. This development has gone in two major and surprising directions: lasers and railguns.
LaWS (Laser Weapons System)
The Navy has been looking at lasers for a while, and I’ve taken a look at many projects, some of which are now defunct. But the idea of lasers on battleships has stuck around, and one current solution is being installed aboard the U.S.S. Ponce (commissioned in 1971), giving the ship a shiny new laser to shoot down drones and enemy speedboats. The weapon is actually much cheaper to operate than conventional weapons, and much faster and more reliable than the typical cannon solutions.
GA Blitzer Railgun
It’s not hard to imagine this on a spacecraft or a future tank. I’ve been following this particular weapon system for a while, watching it go from a warehouse-sized machine to something looking much more like a cannon. The railgun fires its projectile with staged electromagnetic pulses, hitting air, sea, and land targets up to 200 nautical miles away. It’s so effective that there’s almost no application it isn’t good for, making it a solution for naval combat, artillery strikes, anti-air, and even missile defense.
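For a sense of why staged pulses matter, here’s a back-of-the-envelope Python sketch. The stage count, per-stage energy, and projectile mass are round numbers I made up, not GA’s actual specs; the point is simply that muzzle velocity grows with the square root of the accumulated energy.

```python
# Back-of-the-envelope staged electromagnetic acceleration: each stage
# dumps a pulse of energy into the projectile, so kinetic energy adds up
# stage by stage and velocity follows v = sqrt(2 * E_total / m).
# (All figures are invented; this assumes lossless energy transfer.)
from math import sqrt

def muzzle_velocity(stages, energy_per_stage_j, projectile_mass_kg):
    """Velocity after all stages, assuming every joule becomes kinetic energy."""
    total_energy = stages * energy_per_stage_j
    return sqrt(2 * total_energy / projectile_mass_kg)

# e.g. 30 stages of 300 kJ into a 10 kg projectile
v = muzzle_velocity(30, 300_000, 10)
print(f"{v:.0f} m/s")
```

Note the square root: quadrupling the number of stages only doubles the muzzle velocity, which is part of why these machines started out warehouse-sized.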
PETMAN 2: The Return
I’ve covered this particular robot several times now due to its high publicity and coolness factor. PETMAN is another project by Boston Dynamics, the creators of BigDog, and was developed for DARPA, the research and development arm of the US military. They still claim this is just for testing new chemical suit designs, but I don’t buy it.
Source: 33rd Square
Virtual Reality Glove? Please.
Virtual reality is one area of science fiction and technology that I get really excited about. It’s all about immersion: how do you make the user feel like they’re really present in the simulated environment? The visual aspect of this is quickly being solved; the method of control, however, is very difficult. Sure, you can give someone a controller and leave it at that, but some companies are vying for more elegant solutions. Thalmic Labs’ MYO has the advantage of being useful for many applications (I don’t think I’d mind at all using that armband for every computer interface, if they found a way to make it easily transferable). Reading actual muscle activity, rather than trying to teach a computer to see and interpret gestures, is probably a much more direct approach than the motion control technologies we see in other industries (the current generation of video game consoles, for example). Extrapolate this to the entire body, and you’ve got yourself a full-fledged VR suit.
Source: Singularity Hub
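To illustrate why reading muscle activity directly is so appealing, here’s a minimal Python sketch of the general EMG idea: rectify the raw signal, smooth it, and treat a threshold crossing as “the muscle is doing something.” This is a stand-in of my own, not Thalmic Labs’ actual algorithm, and the signals and threshold are synthetic.

```python
# Minimal EMG-style activation detection: the raw electrical signal from
# a muscle oscillates around zero, so we rectify it (absolute value),
# smooth it with a moving average, and compare against a threshold.

def emg_envelope(samples, window=4):
    """Rectify and smooth a raw EMG trace with a moving average."""
    rectified = [abs(s) for s in samples]
    return [
        sum(rectified[max(0, i - window + 1): i + 1]) /
        len(rectified[max(0, i - window + 1): i + 1])
        for i in range(len(rectified))
    ]

def detect_activation(samples, threshold=0.6):
    """Return True if the muscle is 'squeezing' hard enough."""
    return max(emg_envelope(samples)) > threshold

resting = [0.05, -0.04, 0.06, -0.05, 0.04, -0.06]  # low-amplitude noise
clench = [0.1, -0.8, 0.9, -0.85, 0.95, -0.7]       # strong contraction
print(detect_activation(resting), detect_activation(clench))  # False True
```

No cameras, no line of sight, no gesture library to mistrain: the body’s own wiring does the work.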
Virtual Reality, Meet Virtual Exertion
Sorry for the Vimeo link; it looks like there isn’t a YouTube equivalent. This project, from the University of Wisconsin-Madison, struck me as very closely related to the MYO project above. I’ve talked about virtual reality at length, especially the problems with attempting to replicate the holodeck from Star Trek, and it hadn’t occurred to me to go about it this way, at least as a stopgap. I remember quite clearly straining my muscles when pretending to lift some heavy, imaginary thing when I was little. Even today, when playing a video game where something has to be lifted or pushed, I often strain my own muscles when immersed in the task. Using this sort of feedback, where the computer measures your strain to determine how much force to apply to an object, seems like a very effective way to integrate force feedback into virtual reality without requiring couch potatoes to actually perform strenuous tasks.
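As a rough sketch of how that feedback loop could work, here’s the idea in a few lines of Python. The gain constant and masses are invented numbers, and this is my guess at the general principle rather than the Wisconsin team’s actual implementation.

```python
# Strain-driven virtual force: measure how hard the player is actually
# straining, scale it into a virtual force, and compare against the
# virtual object's weight to decide whether it budges.

GAIN = 50.0  # newtons of virtual force per unit of measured strain (made up)

def virtual_lift(strain, object_mass_kg, g=9.8):
    """Return True if the player's measured strain can lift the object."""
    applied_force = GAIN * strain
    weight = object_mass_kg * g
    return applied_force > weight

# Half-heartedly tensing (strain 0.2) won't budge a 30 kg crate;
# genuinely straining (strain 8.0) will.
print(virtual_lift(0.2, 30))
print(virtual_lift(8.0, 30))
```

The trick is that the effort is real but brief: you tense for a moment, the simulation responds, and nobody has to actually haul a crate across the living room.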
That’s it for this week! I hope these ideas will come in handy for your science fiction stories. Until next week, do you think that motion controls like the MYO will become the norm? Why or why not? Let me know in the comments below!
Hello everyone! Last week I went on about media and important differences between them (etc. etc.). It was a thorough and in-depth article, and this week I was supposed to dive right back into it.
Well, I think we need a quick break from that. So instead, this week, I’ve compiled another Science Fiction Round-up for your reading pleasure. If you’re new, this is a series of articles I scrounge from my own vast surveys of SF-related news sites and blogs (because I’m into that sort of thing) and pull together for your inspiration. I get ideas from reading these things all the time, so I thought I’d pass them on to you. You’re welcome!
As I’ve outlined before, there is a unique, chicken-and-egg relationship between science fiction and technology. Many ideas in science fiction literature come from actual science (obviously); conversely, many technologies are developed after being imagined by SF writers. Similarly, a lot of technological development is spurred by video games, either because the tech is useful to the gaming industry, or as a consequence of the games themselves.
Oculus Rift: Affordable Virtual Reality (Finally)
One example of this is the ever-sought (but rarely successful) niche of Virtual Reality. It’s interesting to note just how popular the idea of virtual reality is in popular media, and yet how unsuccessful virtual reality products have been historically. From Morton Heilig’s Sensorama to Nintendo’s Virtual Boy (which I actually own), virtual reality products have never really taken off. But of course, a certain quality vs. cost calculation has always been its bane (these products are always really expensive and never deliver much).
This is all changing, however, with the Oculus Rift, widely being touted as one of the first viable attempts to mainstream virtual reality. The system is intended as a display for video games, which have been the focus and driving force behind the technology in the consumer market, whereas simulations (such as aircraft and parachuting trainers) drive it in military and corporate markets.
In any case, this is an exciting development, but it mostly feels weird to me that we will actually have to integrate this technology into our lives in the near future. For instance, this article deals with how to politely get someone’s attention while they are using the Oculus Rift, since you wouldn’t want to freak them out too badly in the process.
Soon You May Have An Excuse to Scarf That Snickers Bar
I can’t tell from reading this article if the creators of this project got the idea from Deus Ex: Human Revolution or not (the chicken-or-egg scenario again), but the connection is clear. In the game, protagonist Adam Jensen, a cyborg, has to eat power bars to replenish his strength after he hits someone in the face. In the game, it’s a mechanic that keeps the player from pile-driving every single enemy. In real life, it could be a legitimate way to power internal implants. The idea is to use the resources already present in the body to generate electricity for devices in lieu of a battery. In this case, the biocells take oxygen and glucose (sugar) from the bloodstream and break them down to create a charge.
It’s unlikely that the researchers responsible for the biocells actually got the idea from Deus Ex, but the idea has been floating around in SF and video games for years. It’s interesting to see that this appears to be not only possible, but a very practical technology.
Finally, Prosthetics for Quadruple Flipper Amputees
To end on a completely different note, apparently some Japanese researchers found it necessary to give artificial flippers to a loggerhead turtle. After being caught and mangled in a fishing net in 2008, the turtle lost its flippers. Scientists have been trying ever since to design prosthetic flippers that let the critter swim normally. It’s taken a lot of tries (this is their 27th iteration), but I think they’ve come pretty close:
I’m not quite sure about the general applications of this research, but I’m finding it hard to care. D’aww.
Well, that’ll be it for this week. Next week I’ll get into the details of elements that transcend media. Until then, how much is too much for virtual reality? Since the push for mainstream VR is coming soon, do you think you’ll buy into it? Let me know in the comments below!
Hello everyone. ’Tis I, Erik the Reddest, back on rotation and ready to go. I read a lot of science articles for research (and because I actually do find this stuff interesting), and I am constantly amazed by emergent technologies. However, sometimes I am not only awed, but totally weirded out by what I read. Quite a few strange stories have come around recently, so I thought I’d share them with you for my first post of the month to give you some inspiration for your writing.
Apparently Fish Do Have Thoughts, Just Really Simple Ones.
Oh, Japan. Not only does your culture consistently bewilder us Westerners, but your scientists get in on the fun too. Apparently, at the Japanese National Institute of Genetics, this means asking the question: do fish think? Well, the answer, surprisingly, is yes. Not about much, mind you. Just things like “That looks good to eat.” The purpose of this research was to begin developing methods of mapping neural activity, but the whole idea for the project still brings a smile to my face.
Source: 33rd Square
Welcome to My Giggly Nightmare
Also coming out of Japan (no surprises there) is a recent project to create realistic facial expressions for robots, with the hope that they will someday be able to interact with humans on an emotional level. “Diego-san” is modeled to look and act like a one-year-old boy and has 27 moving parts in his face to create expressions. You tell me: creepy or adorable? I’m going with creepy.
Source: Singularity Blog
Clearly the Term “Microscope” Is No Longer Adequate
I would have thought this was impossible, but apparently IBM’s new microscope technique (“micro” seems the wrong prefix) has captured an image of a hexabenzocoronene molecule at 100x the resolution of an atom, officially confirming its shape and organization to match its theoretical models. It was news to me that they could get an image of an atom at all, but that was achieved in 2009, believe it or not. To the right is an image of the HBC molecule. Oh, the world we live in…
Source: 33rd Square
Doctors Give Vet New Arms (Not Cyborg Ones)
Speaking of things I thought were impossible, doctors recently performed an incredible feat of surgery, giving 26-year-old Sergeant Brendan Marrocco two new (human) arms. I honestly thought that giving someone replacement limbs would either have to be done by cloning spare parts, or else advanced prosthetic limbs would be used. I was amazed to hear that this surgery was not only attempted, but successful. Just look at that diagram: they had to connect bone, muscle, arteries, even individual nerves to make this work. Truly incredible.
Source: Singularity Blog
At Least We Don’t Have to Worry About A Strong Iranian Air Force
I thought I’d end on a humorous note, at a dictator’s expense. As much saber-rattling as Iran is known to do, they’re still (thankfully) lacking the technology to pull off most of their threats. While that could quickly change if they obtain a nuclear weapon, which most analysts believe they are keen to do, at the moment it’s clear they have no idea how to build a proper stealth bomber. Mere weeks after their triumphant launch of a rhesus monkey into orbit that certainly didn’t die in the vacuum of space, Iran proudly displayed this little piece of engineering:
This is the Qaher-313. For those of you (like me) who don’t have an aerospace engineering degree, head to the linked gallery at the source and enjoy the slide-by-slide takedown of exactly how ridiculous this is. It may look kind of cool at first glance, but this is likely a poorly executed hoax (like the monkey) that the Iranian brass don’t realize is embarrassingly inadequate to fool the first world’s educated public. A few things to note are the Mason-jar glass cockpit, the shiny plastic, and the copy-cat wing designs that probably don’t actually let it fly, if it even has an engine. Check out the last few slides for a few actual stealth bombers to compare.
Source: Ars Technica
Well, that’s all for now. Lots of things to write about now, eh? Good! So, what weirded you out the most, or just made you laugh? Let me know in the comments below!
Hello everyone, I hope you had a wonderful Christmas! Seeing as this is after Christmas and Dr. Williams already covered this holiday with a lot more class than I’m sure I could muster, I’m just going to stick with my bread and butter and dish out some sci-fi related content. Today, I’ve found some cool articles about technologies present that reflect potential directions for the future. Without further ado, here’s this week’s Science Fiction Roundup.
I Suppose Now, After They Conquer Humans, They Can Entertain Themselves
Part I of the “Computers Making Things” category comes in the form of ANGELINA, aka “A Novel Game-Evolving Labrat I’ve Named ANGELINA.” Michael Cook devised this program in the spirit of Computational Creativity, a branch of Artificial Intelligence research that focuses on teaching computers to be creative in their designs; in this specific case, by creating a platforming video game featuring Santa Claus. Given that the levels and powers were created by a computer, the feat is rather impressive. It’s pretty fun, in fact, so give it a try for free! For more information about Cook’s project and its place in the development of Artificial Intelligence, take a look at the full article on 33rd Square.
Source: 33rd Square
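For flavor, here’s a toy Python version of the evolutionary approach behind systems like ANGELINA (emphatically not Cook’s actual code): represent a level as a string of tiles, score its “playability,” and let mutation plus selection do the designing. The tile set and fitness rule are entirely my invention.

```python
# Toy evolutionary level designer: hill-climb a string of tiles toward
# higher "playability." The computer, not a human, decides the layout.
import random

random.seed(0)
TILES = "._#^"  # gap, ground, platform, hazard (invented tile set)

def fitness(level):
    """Reward walkable surfaces, punish hazards and long gaps."""
    score = level.count("_") + level.count("#")
    score -= 2 * level.count("^")
    score -= 3 * ("..." in level)
    return score

def mutate(level):
    """Swap one random tile for another random tile."""
    i = random.randrange(len(level))
    return level[:i] + random.choice(TILES) + level[i + 1:]

level = "".join(random.choice(TILES) for _ in range(20))
for _ in range(500):  # keep any mutation that scores at least as well
    candidate = mutate(level)
    if fitness(candidate) >= fitness(level):
        level = candidate
print(level, fitness(level))
```

Real computational creativity research goes much further (ANGELINA evolves mechanics and powers, not just tile strings), but the loop of generate, evaluate, and keep-the-better-one is the common skeleton.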
Apparently Computers Can Also Write Books, But They’re Really Boring
The second article in the “Computers Making Things” category covers Professor Philip M. Parker’s patented “Authorship Title Material Authorship” program, which for some reason does not appear to have a snappy acronym (for shame, Professor Parker). The system utilizes massive databases of information on specific topics, such as huge archives of reports on global pinto bean production and many other horrifically dull subjects, using this raw data to build meticulously organized documents that exhaustively discuss the subject. So, no novels or anything yet (although apparently he’s working on that, starting with the Romance genre), but if you ever need The 2007-2012 Outlook for Grapes, you can get it on Amazon.com for the low, low price of $795.00! Don’t get too excited, though. For this particular item, one apparently dissatisfied but snarky customer gave it a 1-star rating, saying:
Is $795 too much to spend on a “fake book” that consists of nonsensical computer generated charts and tables? If you are dying to know the hidden, mysterious truth behind the latent demand for grapes between the years 2007-2012, then go for it. Otherwise, caveat emptor, baby!
Realistically, maybe one firm or company, total, will buy a book like this. However, most of the other 106,476 titles created by the system, which include everything from dictionaries to sudoku collections in Polish, are priced closer to $11.99. You can imagine that even if very few people ever touch any one book, this has got to be making some decent money for practically no effort. I, for one, welcome this odd form of artificial intelligence, if only because it means no human being will ever be forced to compile such brain-implodingly boring material into book form.
Source: Singularity Hub
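The template-plus-database principle is easy to caricature in a few lines of Python. Parker’s real patented system is vastly more elaborate; this toy just shows how one boilerplate skeleton and a table of data rows can mass-produce “books” on demand (the commodities and figures below are invented).

```python
# Toy automated-authorship system: one template, many data rows,
# a "book" per row. The content is only as interesting as the data.

TEMPLATE = (
    "The {start}-{end} Outlook for {commodity}\n"
    "World demand for {commodity} is projected at {demand} tons by {end}."
)

database = [
    {"commodity": "grapes",      "start": 2007, "end": 2012, "demand": 68_000_000},
    {"commodity": "pinto beans", "start": 2007, "end": 2012, "demand": 4_500_000},
]

def generate_book(row):
    """Fill the boilerplate skeleton with one row of data."""
    return TEMPLATE.format(**row)

for row in database:
    print(generate_book(row), end="\n\n")
```

Scale the database to thousands of topics and you can see how 106,476 titles appear with practically no marginal effort, and also why none of them are page-turners.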
Quadriplegic Woman Feeds Herself Chocolate Via Robotic Arm
It seems that I have been chronicling this technology for the last few years, measuring the progress of Brain-Computer Interfacing as it develops into a reality that science fiction writers have only dreamed of. Some day we may be able to give people back their sight, or their limbs, or their livelihood through the technology demonstrated here. Jan Scheuermann, once a vital and prolific murder mystery writer, lost the use of her arms and legs to spinocerebellar degeneration and now relies on an electric wheelchair. Her story is tragic, but hopeful: she is also a participant in research at the University of Pittsburgh School of Medicine. By implanting two 96-electrode arrays in her motor cortex, researchers have enabled Scheuermann to control a robotic hand to perform several tests, as well as to feed herself a chocolate bar, something she hasn’t been able to do for 10 years. The translation of neural activity into motor control of a limb is a maddeningly difficult task, working from a backwards and frustratingly primitive method of reading the brain’s command signals. But these scientists have made enormous strides, creating a smooth connection that comes close to actual limb movement. Scheuermann thinks about moving the candy bar to her mouth, and the limb responds just like her own would have. We’re still really far from being able to replace missing or non-functional limbs with equivalent prosthetics, but we’ve come a long way even since I’ve been watching.
Source: Singularity Hub
That’s it for today! I hope you’ve had as much fun as I’ve had this week, and have a happy New Year too. I have next month off, but I’ll be back in February. For now, what do you think of this trend of teaching computers how to do things like write books and make video games? Do you think computers could ever be considered “creative?” Let me know in the comments below!