This past weekend I went home and grabbed my old tube TV that I had for most of college. It’s a 27″ Samsung flat screen. Not a flat panel. Just a flat screen.
I fell down the RGB rabbit hole some years back and have never fully emerged. I almost bought a Sony PVM a few times, but never pulled the trigger (yet).
This is my replacement, for now. I bought an RGB-to-component converter. I also had to extract the audio, so I got another little device that does that. Here is the outcome. It’s so sharp!
I listen to a number of video game and tech podcasts, and I read a slew of related sites. I can safely say that VR has been trumpeted as the second coming. It seems everyone who has tried it is gaga over the prospect of a VR future. I’ve heard it so much it started to become a little irritating (the irony is not lost on me). So, when I was at the gym yesterday and noticed that HTC had set up a booth right outside the check-in, I was intrigued. It was pretty late when I finished up and they were already closing up shop, so I looked online and saw they were still holding sessions the next day, with sign-ups at 9 AM.
I waffled back and forth because I was pretty apathetic about the whole prospect, but, to set the tone of this post: I’m glad I went.
I’ll give a little overview of the hardware and then go demo by demo. This should hopefully capture everything I want to say about it.
Hardware
The Vive looks much like the other VR contenders, the PlayStation VR and the Oculus Rift: a headset that covers your entire field of vision. But the Vive has additional hardware that the Rift and PSVR do not, and that is maybe what makes it more exciting than the rest. The Vive has two controllers, one for each hand, each with a trigger on the back and a thumb pad on top. The thumb pads are similar to the Steam Controller’s, which makes sense because Valve is a partner on the Vive. These are how you interact with the virtual world, and they are far more immersive than an Xbox or PS4 controller because they are a better analog to how we traditionally interact with the world.
The other, and perhaps most important, pieces of hardware are the positional pylons. I’m not sure what the official term for these is, but they surround the play area on tripods and feed your absolute position in the room to the headset. This, this right here, is why the Vive is amazing. Being able to actually walk around in a virtual space is what turned my opinion on its head, in concert with one-to-one controls. It’s an experience, to say the least.
The last thing HTC provided was a pair of generic over-the-ear cans for sound. They are not part of the Vive hardware itself, but they did help with immersion.
Starting Area
When you first put on the headset you are in a pretty large white room with little informational podiums about various upcoming VR projects. The first thing I did, naturally, was look around and then down at my hands. I could see the two controllers, and they moved and tracked like they would in real life. Then I started messing around with the buttons and thumb pads. A balloon started to inflate as I rolled my thumb around the pad. As soon as it was full it started to fall, and I instinctively went to bump it back up, like you would in the real world. It totally worked! The woman running my demo said, “Oh, I see you’ve discovered the balloons.” Oh, I had discovered much more than balloons. (I’m sorry, I’m still giggling at how god-awful cheesy that line is. It’s perfect.)
theBlu
The white room went blank, and when the lights came up I was underwater. I started looking around, trying to get my bearings, and was immediately taken aback by the fidelity of what I was seeing. I know graphics have gotten good. My card is a little old, but I’ve seen things at 4K on the beefiest of rigs. I can’t tell whether you are more or less sensitive to screen tearing, aliasing, artifacting, and every other graphical anomaly in VR; perhaps because it’s so immersive, I didn’t notice any irregularities. It looked incredible. I didn’t have to think, “OK, let’s try moving around.” My body instinctively went exploring around the ship. I looked all around, amazed at the detail; the artists captured how the sea had started to reclaim the vessel. Schools of fish swam by. Refracted light shimmered above. I even walked over and looked off the bow of the ship at the sea floor disappearing below. Then I heard something out in the distance, and it started to come toward me: a whale, swimming right up to me. It didn’t seem intimidating at first, but as it got closer I saw the enormity of the thing. VR lets you sense depth, too, using stereoscopic vision, and that in concert with the field of view makes the scale of things feel incredibly real. So I felt that rush of adrenaline you get when things are about to turn sideways, because this whale was coming right at me. I feel like I must state that this isn’t me embellishing for the sake of writing. It was a visceral reaction. I was even telling myself, “This isn’t real; there is nothing coming toward me.” As the whale’s fin approached, I physically moved out of the way to avoid being hit by it. Then it just stopped, looking at me with its giant eye. As it moved on I turned to my left, and there was a manta ray right beside me that startled me, and I again dodged out of the way. It’s pretty incredible how completely your brain is tricked by this illusion. Then the lights went down and the white room came back up. That is one hell of an introduction.
Job Simulator
Next up was Job Simulator. The primary function of this demo, I would learn, is to get you used to interacting with objects in the digital space. This is where the controllers really shine. You start in a cubicle in a nondescript, cartoony office. Everything is stylized to be brightly colored and blocky. I looked around and then down at my hands; in keeping with the cartoony theme, they looked like big Mickey Mouse gloves. Then “my boss” came in. He was a floating CRT monitor wearing a tie. In a manner echoing Portal’s GLaDOS, he started informing me that consuming caffeine has been shown to improve worker productivity and that my first task was to consume some. A cart came into my cubicle with coffee mugs and a box of donuts. As before, it was no problem to reach over and pick up a mug, using the trigger on the back of the controller to mimic grip, and take it to the coffee machine. I placed the mug, pushed the button, then brought the mug to my face to drink. The incredible part is that I didn’t have to think, “do this, then do that.” It felt natural. I think that is the most telling part of all this. It’s instinctual in a way a 15-button, 2-stick Xbox 360 controller isn’t.
On a side note, I honestly think the 360 controller is the peak of navigating a 3D space projected in 2D. The issue is the learning curve. I’ve been playing games my whole life, so it feels like second nature to me. But I’m sure you’ve seen someone picking up an FPS for the first time, running into a wall and spinning around. It’s not intuitive.
Going back: he then had me eat a donut and move on to turning on my computer. The computer was unplugged, so I had to plug everything back in. What was pretty neat is that I had to look under the desk, which I did like I would look under a real desk, careful not to bump my head (are you sensing a theme?). The next part was cool for a wholly different reason. You had to use the computer, albeit in a simplified manner, and the thing was, using the mouse didn’t feel totally unlike using a real mouse. This could mean that familiar workflows could translate into VR, which is pretty awesome! He then had me fire some people by stamping “FIRED” on their files, and the demo was over.
Tilt Brush
This was probably the biggest “whoa” demo for me. It was made by Google, for whatever that is worth. The premise is simple: it’s a drawing program, like the MS Paint of 3D. It’s dead simple to use, and the one crucial aspect is that it’s in three whole Ds! I started just making squiggles in the air. I noticed the tools were controlled by the thumb pads, so I started messing around with color and changed the background to “space.” Up popped a tiny moon in front of me! So cool. I was drawing all around the moon, something that is really hard to do in a modern 3D modeling environment because of the limitations of 3D projected into 2D; you would have to be constantly zooming, panning, and tilting. Not only that, but my squiggles cast shadows on the moon. The lighting tech doesn’t directly relate to VR, but it does a great job of making the whole thing more convincing. I then changed my brush and color and started making twists around my squiggles, weaving in and out of them, again with incredible lighting. As I walked around this mass of nonsense “art,” looking at it from different angles, I thought to myself: there is NO PARALLEL to this in current computing. This is the demo that was the game changer for me. It opens up all-new possibilities for artists in a way that modern tools don’t. I’m not saying this will supplant old ways of doing things; I’m saying it will enhance them and create better end products. And just as soon as it started, this demo was through.
Aperture Robot Repair
This was the most polished of the experiences, a complete start-to-finish demo. It took place in the Aperture Science facility from Portal, and all the humor was there as well. What struck me about this demo was that it was a contained room, and it was so detailed. It was ostensibly a repair shop, albeit with unmistakable touches of Portal. I just walked around looking at everything in it: imperfections in the walls, tchotchkes on the desk. It was incredible. The voiceover wanted you to do various things, like open some drawers, and in typical Portal fashion, you never did anything correctly. There was a moldy piece of cake in one of them as a nod to the long-running, overused “the cake is a lie” joke from the original Portal. I chuckled. Another drawer even held a tiny universe that made me its god, then was promptly incinerated. I was told to open the garage door to let a malfunctioning robot come in, and as he sparked and sputtered in, I found myself with quite a sense of unease. Again, it’s the scale of everything. I backed up against the wall to avoid it. You were then told to explode the robot, not in the boom sense, but in the sense of seeing all its parts floating in the air. You were then instructed to repair certain areas of the robot and had only 30 seconds before it exploded, this time the boom kind. Of course, you were set up to fail. At that point the room starts to fall apart to reveal the larger facility, and the spinning death grinders below. Then GLaDOS herself comes down and peeks in with her single eye. I’ve played plenty of Portal, and let me tell you, she wasn’t that scary until then. She is HUGE! I found my heart racing as she moved around and berated me. I was then informed I was not qualified for this position and was subsequently stamped out of existence, literally. That’s where this demo, and the session at large, ended.
Conclusion
In a well-trafficked portion of a gym full of people, I flailed my arms and looked around and under things that weren’t there while people gawked from a distance, and I didn’t give a damn. In fact, I didn’t even think about any of that while in the headset. That should be a ringing endorsement in and of itself, but to take it further: it was a visceral experience that nothing has replicated to this point. It is everything those podcast people said, and more. I am more than excited to see where all this goes. This is a game changer. I do see some issues with it, though. Cost is one of the chief concerns: at $800, plus another $1,000 for a top-end PC to run it, those costs are hard to justify. I think costs will come down, but like any hardware, software needs to follow or it’s dead in the water; if there are not enough users to justify software development costs, it could all wash away. Secondly, the Vive requires ample empty space, a dedicated VR room, as some have been planning in their houses. Most people don’t have that kind of space in their domicile, and being able to walk around in the world is really what clinched it for me; the experience misses something without it. Finally, and most importantly, you need to try it. You can watch the videos I have here, you can read my entry, you can listen to impressions, but there just isn’t a good proxy for trying it out. That is a problem, because if it’s anything like what I had to do, you are only reaching a small number of people per day. It’s going to need more demos, and a lot of people’s friends forking over the cash to buy it and letting them try it.
I am changed, though. There was pre-VR, and now there is post-VR. I can see myself with one of these in the future, and I hope it overcomes its hurdles.
Oh hey. Here, let me clear out these cobwebs. It sure has been a while, hasn’t it? I sure like your new sweater. It’s a bit chilly out there, isn’t it? Snow in November! Crazy, right? Wow! Really?! That is great to hear! Oh, my life? How kind of you to ask. Well, let me tell you!
I finally started senior design for my engineering degree. Yes, that means I will be graduating soon! Finally! Amazing! Fin-mazing! On the other hand, this also means I have almost no free time. In the few moments I do manage to carve out, I am usually rendered useless by the combination of work, school, and improv. These blips are usually spent watching Murphy Brown, because if there is one thing I can do during my downtime, it is watch old, irrelevant sitcoms (Cheers and Frasier: check!).
It sounds like I am complaining, but that’s only because I am. It’s not all without an upside, though. The project I selected/was chosen for is engaging, challenging, and combines just about everything I’ve learned in school and then some.
We are trying to assemble a team for the Robotic Football competition put on by the University of Notre Dame. Basically: mobile robots that play a modified version of football. Here is a sample video:
We have six members in total, split between two groups of three. Notre Dame has quite a lot of people who participate in designing, building, testing, and programming these robots, so we are at a slight disadvantage. One group is focusing on the quarterback, the other on the receiver. Due to the sheer size of the project we are focusing on those two positions in particular, and if time and funding allow, we will expand from there. The ultimate goal is to compete with a full team in April.
Surprisingly, the hardest part right now is location tracking, i.e., how far have the robots traveled, and where are they? There are many different methods: infrared, ultrasonic trilateration, odometry, accelerometers, gyroscopes, and every combination in between. All methods have inherent flaws, and many require prohibitively expensive equipment. The problem is that location tracking is absolutely crucial to performing reliable passes between the QB and the receiver. As of this writing we are going with odometry using rotary encoders, combined with some advanced mathematical techniques and an initial calibration, to hopefully yield a reliable, cost-effective solution. A rough sketch of the idea is below.
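To make that concrete, here is a minimal sketch of the dead-reckoning math for a differential-drive base, the kind of thing our encoder approach boils down to. All the physical constants below (wheel radius, wheel separation, encoder resolution) are made-up placeholders for illustration, not our actual design values:

```java
// Minimal differential-drive odometry sketch. All constants are
// illustrative placeholders, not our real calibration values.
public class Odometry {
    static final double WHEEL_RADIUS_M = 0.05; // assumed wheel radius (m)
    static final double WHEEL_BASE_M   = 0.30; // assumed wheel separation (m)
    static final int    TICKS_PER_REV  = 360;  // assumed encoder resolution

    private double x = 0, y = 0, heading = 0;  // pose: meters, radians

    /** Update the pose from the encoder tick deltas since the last call. */
    public void update(int leftTicks, int rightTicks) {
        double metersPerTick = 2 * Math.PI * WHEEL_RADIUS_M / TICKS_PER_REV;
        double dLeft  = leftTicks  * metersPerTick;
        double dRight = rightTicks * metersPerTick;

        double dCenter = (dLeft + dRight) / 2.0;          // distance traveled
        double dTheta  = (dRight - dLeft) / WHEEL_BASE_M; // heading change

        // Integrate along the average heading of this short segment.
        x += dCenter * Math.cos(heading + dTheta / 2.0);
        y += dCenter * Math.sin(heading + dTheta / 2.0);
        heading += dTheta;
    }
}
```

The catch, and the reason the initial calibration matters so much, is that every update accumulates error from wheel slip and tick quantization, so pure odometry drifts over the course of a drive.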
We have accomplished much as a group, though, and have a little to show for it now. The base has been designed to the specifications put out by Notre Dame. The parts have also been selected and ordered. Preliminary software tests for communication have been constructed and, well, tested. Since the group is made up of so few, we are forced to wear many hats; as of now I have been the 3D modeler as well as the BeagleBone champion, due to my Linux background and the fact that I learned network programming this semester.
Here is a 3D model of our base with the top lid removed. The parts have been ordered, and construction begins in mid-to-late December.
Since I have been the person pushing the group to use the BeagleBone Black as the microcontroller instead of an Arduino or Raspberry Pi, I took it upon myself to learn it. I am quite familiar with Linux: I run it on my servers and try, whenever possible, to use it as my main OS so I can keep my nerd cred (the primary reason I don’t use it all the time is software constraints; Adobe software doesn’t run well on Linux, even in Wine). Instead of going with a proprietary system like XBee, which is what most teams use because it has Arduino libraries, we chose Wi-Fi: it’s relatively ubiquitous and the parts are cheap. The plan is to use IP (the same protocol you use for the Internet) with UDP for speed, and to fall back to TCP if the connection becomes unreliable.
A test I created showcases the network in working order. The idea: the BeagleBone runs a server, written in Java, and waits for packets. A client, also written in Java and running on a Windows machine, transmits numbers to the server. The server then lights up four LEDs showing a binary representation of that number. All data is transferred over the network using UDP/IP. Unfortunately, the Angstrom (the BeagleBone’s Linux distro) repository was down at the time of the recording, so the USB Wi-Fi adapter’s drivers and firmware could not be installed; standard 802.3 Ethernet over Cat 6 was used in its stead.
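For the curious, the server side of that test boils down to something like the sketch below. This is a simplified reconstruction rather than our exact code: the port number is arbitrary, and the sysfs GPIO paths are hypothetical, since the real pin numbers depend on how the LEDs are wired to the BeagleBone’s headers.

```java
import java.io.FileWriter;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

// Simplified sketch of the LED server. The port and GPIO paths are
// placeholders; the real pins depend on the board wiring.
public class LedServer {
    static final int PORT = 5005; // arbitrary port for illustration
    static final String[] LED_PATHS = { // hypothetical exported GPIO pins
        "/sys/class/gpio/gpio66/value",
        "/sys/class/gpio/gpio67/value",
        "/sys/class/gpio/gpio68/value",
        "/sys/class/gpio/gpio69/value"
    };

    public static void main(String[] args) throws Exception {
        try (DatagramSocket socket = new DatagramSocket(PORT)) {
            byte[] buf = new byte[16];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet); // block until a datagram arrives

                int value = Integer.parseInt(
                        new String(packet.getData(), 0, packet.getLength()).trim());

                // Show the low four bits of the number on the LEDs.
                for (int i = 0; i < 4; i++) {
                    try (FileWriter fw = new FileWriter(LED_PATHS[i])) {
                        fw.write(((value >> i) & 1) == 1 ? "1" : "0");
                    }
                }
            }
        }
    }
}
```

The client is just the mirror image: a DatagramSocket that sends the number as text to the BeagleBone’s address with send(). Part of UDP’s appeal is visible even here: there is no connection state, so a dropped packet never stalls the loop; the server simply handles whatever arrives next.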
I know this all sounds very technical, and that’s because it is. Although this will be compiled into several papers outlining our exact reasoning, I wanted to get it out in my own words, and I will continue to do that as we go along, time permitting. Hopefully by the next update we will have a real, working robot to showcase, but for now this is what I can provide. I hope you found it fascinating despite the overly technical nomenclature. It’s all-consuming, but it sure is fun. I am really lucky to be doing what I enjoy.