I tend to be one of the more distractible folks out there. So, while waiting on my next paycheck to order more parts, I got distracted for a bit by a game I picked up recently. It goes by the name of Space Engineers, and if you enjoy creation/space games and don't know about it already, you SHOULD, and here's why.
All of my expectations for space-based games are unfairly based upon two things: the TV series Firefly and Microsoft's game Freelancer. Everything in a space game should either satisfy the feeling of flying around the 'verse smuggling stuff and just barely making a living on the outer rims, or fall somewhere in the incredibly open-ended yet structured playground of Freelancer, where you are, quite literally, building your own life around your little polygon pilot. Space Engineers, even in its early alpha stage, does both.
Not only can you really, truly build your own ship, but there's everything from mining to (potential) combat. Ships can have guns and missile launchers, individual astronauts can carry assault rifles, even thrusters can do a decent amount of damage by melting steel plates on ships and stations. Creative mode allows you to build unfettered, making gargantuan pieces of blocky technical delight. But my favorite so far has to be Survival mode, where you start with a certain scenario and have to actually weld, grind, drill, and mine yourself enough supplies to build stuff.
I started with the "crashed ship" scenario, in which you're the only survivor of a large ship that's crashed into an asteroid. You then have to build yourself a home out of parts from the wreckage, and get yourself back on your feet.
I also enabled Cargo Ships, in which random derelict ships will float through your sector at a distance. You can attempt to catch them, but it's not entirely risk-free. Not only do you have to build a ship to go out there and actually catch the derelicts, but you can also very easily bang it to pieces trying to grab onto salvage, and end up floating thousands of kilometers away from home.
After a fantastic few hours, I managed to create a cozy little home on the opposite side of the asteroid from the crash site.
Resources are acquired by grinding or drilling; construction is done by welding. All power tools and movement use suit energy, which must be recharged in a working ship's cockpit or at a medical bay. Power can travel through any type of block, and is provided by reactors, which run on uranium ingots. Reactors do consume fuel, and if it runs out, your entire structure loses power.
They've recently added working turrets, collectors and emitters for inventory transfer, and a bunch more (wheels!) with some new updates I haven't tried out yet. It still has a few bugs to work out, but I'd highly recommend it. It's available on Steam here.
Play on,
~Lexikitty
Monday, April 28, 2014
Please give a warm welcome...
...to the newest tool in my arsenal, er, shop: the Printrbot Simple Metal. She arrived in considerably more pieces than she is in now, as I got the kit version. Since I was listening to Still Untitled for most of the build and knoll constantly anyway, I decided to knoll all the tools I used in the build in front of the finished printer, in Adam's honor.
A few notes about the kit, if any of you are considering it:
- The pieces are extremely solid. The entire thing is made up of very high-quality parts that will definitely last a good long time.
- The photo-based instructions are vague and not really all that helpful. In fact, if you only follow the instructions given, the hot end never actually gets installed - you have to figure out for yourself that it might be important, being the BUSINESS END OF THE PRINTER AND ALL.
- While they've cut far fewer corners than with the wooden Simple, the belts are still secured with zip ties, and I've already had at least one issue with them in just two days of printing.
- If this is your first 3D printer, I'd recommend you read up a bit on GCode and Slic3r before ordering even the fully assembled version. I got through it okay, but I'm pretty familiar with microcontrollers, and that helped in understanding what the heck they were asking me to do during initial setup.
- It's fiddly. I had to rebuild the Z axis assembly twice before it ran smoothly. The extruder jammed once, and the X axis lost tension while running their provided GCode for a fan shroud - the carriage slammed into the frame and pulled the zip tie off the belt. A MakerBot it is not, but it's also not $2K.
Below are some random shots of the build in progress, followed by the obligatory "it works!" video printing a heart-shaped pencil grip I spent all of 30 seconds on (Art it isn't, but it worked as a good SketchUp-to-print test). I'll play around with printing some of my Cinema 4D models later, once printing is stable. I'll probably also share my print settings in a later post, once I've had a chance to play around and get them just right.
And here, you can watch it merrily singing to itself:
Now go build something awesome!
~Lexikitty
New Parts!
New parts arrived! Some of these were just restocking runs (headers, DC jack switches), but something really awesome is also in the mix - a 10-DOF IMU for the HUD's AHRS system (you can see it in the bottom right corner). I also got some cheap tilt sensors, in the hope that they may be useful in repulsor warm-up/down sequences. The idea is to mount two in series at a specific angle (each is just a ball that closes a connection, as far as I can tell) so that the repulsor warms up ONLY when my arm is fully extended.
~Lexikitty
Saturday, April 12, 2014
The Iron Girl Project - HUD, Part 1
When I was looking into the Iron Girl project, the most intriguing - and scary - part by far was the HUD. The HUD on the Mark II is magnificent, and it only gets better with each suit in the movie series. If we skip the holographic and multi-layer effects (again, tech not financially feasible), we’re left with the concept alone:
A video feed from the outside world, as well as:
- Numerical and sensory data about the suit
- Corresponding visual references to video feed images (heading, bearing, horizon line)
So originally, I wanted to use some sort of USB webcam > OpenCV (computer vision) pipeline, and have the Pi do all the processing. Turns out both of the webcams I had lying around had issues with the Pi - one needed additional power, and the other gave me about 4FPS tops (without any image processing). So I ordered the standalone Raspberry Pi camera board, hoping that would have a better framerate. It definitely did, but the issue there was that it injects video directly into the framebuffer - meaning that the operating system/Python/X has no way of knowing what the heck the camera is seeing. The camera feed just fills the entire screen right over whatever it was that you were doing. Rude, but useful in some circumstances.
Using the raspivid command-line arguments, you can specify x and y coordinates for where you want the video. In doing this, you can kind of fake an X window at specific coordinates or use a non-resizable GUI to get the effect. However, the mouse pointer will always disappear behind this, and for the HUD purposes it didn’t make sense to shrink the video to the center of the screen to display edge-HUD elements.
There is, however, a raspivid command-line option labeled -op for “opacity”. This gives you a range of transparency (0-255) on the video itself, so you can actually have highly visible elements BEHIND the video feed and they’ll show up okay. Color mixing is a bit off, and it’s not the prettiest solution, but short of delving deep into OpenGL and MMAL just for the sake of this one project, it was good enough for me.
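For the curious, here's a minimal sketch of kicking this off from Python instead of a terminal (the window geometry and opacity are just example values; -t, -p, and -op are raspivid's timeout, preview-window, and opacity options):

    import subprocess

    # Launch raspivid with no timeout (-t 0) so the preview runs until killed,
    # a preview window at given x,y,width,height (-p), and a partially
    # transparent feed (-op, 0-255) so HUD elements can show through.
    preview = subprocess.Popen([
        "raspivid",
        "-t", "0",
        "-p", "0,0,1280,720",
        "-op", "128",
    ])

    # ...draw HUD elements underneath, then shut the preview down when done:
    # preview.terminate()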
Booting into Wheezy, I pulled up a sample HUD image in the picture viewer, made it full-screen and zoomed in till I got the edges to touch the screen. I then started raspivid and played for a while with the command-line. At an opacity level of about 110, I managed to get something reasonable:
So that worked well enough for my proof of concept. I ended up adjusting the settings a bit, since I would be working with a pair of video glasses on the composite output of the Pi, not the HDMI port with a nice monitor. The glasses only have a resolution of 720x480. The specific ones I used were based on the Olympus Eye-Trek FMD-150W, but actually removed from an old Enhanced Visions JORDY unit that had cracked on the side and lost both earpiece hinges:
I switched the output to composite on the RPi and had to adjust the contrast and brightness settings to make it look anywhere near decent, but I managed to get something workable with my test HUD image and the video feed overlay:
So far, so good.
For anybody looking for cheapish HMDs, the JORDY might not be an entirely bad option. Just make sure you look for a version 2 if you need widescreen. The version 1 has a black cord linking the controller unit and the glasses, and a notch between the eyepieces (right above the nosepiece). The version 2 has a clear/braided cord and no notch above the nosepiece. The only real difference is the resolution of the internal screens - the version 1 is based on the FMD-150 and has a resolution of 640x480 (as best I can tell), while the V2 is the FMD-150W with the 720x480 resolution. I managed to find a V1 for about 60 bucks on eBay at the time of this writing. Adafruit.com also has 320x240 HMDs for $109, and you won't have to do any salvage.
For the rest of the overlay, I decided to use Pygame to load all the data off the I2C bus, but that’s for another post.
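As a rough teaser of the shape it takes, here's a minimal sketch - the slave address (0x26) and the single status byte are made up for illustration; the real layout will come in that post:

    import pygame
    import smbus

    REPULSOR_ADDR = 0x26  # hypothetical Trinket slave address - yours will differ

    pygame.init()
    screen = pygame.display.set_mode((720, 480))  # match the composite resolution
    font = pygame.font.Font(None, 36)
    clock = pygame.time.Clock()
    bus = smbus.SMBus(1)  # I2C bus 1 on rev 2 Pi boards (bus 0 on rev 1)

    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False

        # Poll one status byte from the slave; shrug off any bus hiccups
        try:
            status = bus.read_byte(REPULSOR_ADDR)
        except IOError:
            status = None

        screen.fill((0, 0, 0))  # black background sits behind the video overlay
        label = font.render("REPULSOR: %s" % status, True, (0, 255, 255))
        screen.blit(label, (20, 420))
        pygame.display.flip()
        clock.tick(30)

    pygame.quit()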
Cheers for now,
~Lexikitty
The Iron Girl Project - Repulsors, Part 2
For the second part of the repulsor project, I wanted to get data fed back to the main computer while also fading the repulsors in a reasonable likeness to the actual repulsor firing pattern from the movie. So, after playing around with the Trinket, I eventually came up with something like this.
The Trinket fades a 1W LED through a MOSFET in conjunction with the NeoPixel Ring. Here's my first attempt at getting the audio to work: the Trinket fires directly and then notifies the RPi via I2C that it's been fired, and Python plays the pre-loaded sound. This method ended up being scrapped, since the delay from the I2C check was too long; instead, I made the Raspberry Pi issue the I2C command to fire the Trinket, which made everything virtually instant.
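Here's a minimal sketch of the Pi side of that final arrangement (the address and command byte are placeholders - the Trinket sketch just has to watch for the same values):

    import time
    import pygame
    import smbus

    REPULSOR_ADDR = 0x26  # hypothetical Trinket address
    CMD_FIRE = 0x01       # made-up command byte the Trinket listens for

    pygame.mixer.init()
    fire_sound = pygame.mixer.Sound("repulsor.wav")  # pre-loaded sound effect
    bus = smbus.SMBus(1)

    def fire_repulsor():
        # Send the fire command and start the sound in the same breath -
        # no polling delay, which is why this direction won out.
        bus.write_byte(REPULSOR_ADDR, CMD_FIRE)
        fire_sound.play()

    fire_repulsor()
    time.sleep(2)  # give the sound time to finish before the script exits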
Below is the Trinket wiring diagram. It’s slightly inaccurate - I’m using a Small Mint Tin Perma-Proto from Adafruit, but they don’t have that in their Fritzing library.
A few notes:
- The center LED is a 1W BL-HP20A Cool White LED. It tops out at 3.8V, but if you use NiMH batteries, it seems to be okay.
- The switchable AA battery pack will feed both repulsors and be mounted on my back. It's got rechargeable AAs to make sure the LED doesn't get too much voltage. I may play around with the center LED power later on, but it works for now.
- The MOSFET I'm currently using (IRF510) isn't a logic-level part - it wants around 10V on the gate to turn fully on, far more than a piddly HIGH at 3.3v. Thus, the center light is a bit dim, but it works. I'm still looking for a more suitable choice, and will update in a later post with what I've found.
- This also leaves me with one I/O pin, which I may or may not use for a tilt sensor (power-up, power-down) or some other physical sensor to feed back to the main computer.
Next steps? Mount the PCB on a glove-like structure, and figure out proper cable routing up the arm. I'm also alternating working on this with working on the HUD.
Cheers for now,
~Lexikitty
The Iron Girl Project - Repulsors, Part 1
The repulsor units are obvious necessities for any Iron Man-themed cosplay, costume, or suit. This was one of those situations where, even though the tech doesn't technically exist, I still had to incorporate the component into the design. To make up for the lack of actual functionality, I decided I required two things from it:
- The light coming out of the palm would have to be substantially bright ("enough to blind a camera momentarily" became my benchmark)
- The repulsor would have to let me know it successfully fired, so I could relay that data back up to the HUD.
To satisfy all this, I ended up using the Adafruit Trinket. Not only did it come in a native 3.3v variant, but it supported I2C (via the TinyWire library) as well as the NeoPixel library, which was of interest.
For the light source, I used a 12-LED NeoPixel Ring in conjunction with a BL-HP20A 1W LED. The LED, when mounted on the heatsink, just fit inside the NeoPixel Ring. I got some test code onto the Trinket and the end result was this:
Which I thought didn’t look half bad. But it still didn’t flash or do anything all that exciting. Which is what I made Part 2 is for!
See you after the jump,
~Lexikitty
Iron Girl Project - Concept
The Iron Girl concept was simple. Build a functional Iron Man-like suit, omitting technologies that:
- don’t exist yet
- weren’t financially feasible or reasonable*
- required special licensing/storage/care (there won’t be tank missiles)
The original Iron Girl framework design was garbage and I’m not posting it. However, below is the most complete sketch I have of the current framework (though it is constantly evolving).
I’ll explain all systems in their own detailed posts, but for now, here’s the system overview:
The HUD is (for now) an HMD from my scrap pile running on a composite RCA connection. I’m overlaying a raspivid feed onto a Pygame user interface.
The system relies heavily on the I2C bus to get data moved around. Repulsor firing is also controlled over I2C (a decision I really didn't want to make, but ended up having to - see the Repulsor notes), and the Raspberry Pi is right in the center of it all. Most of the I2C slaves are 3.3v Adafruit Trinkets, with the exception of a 16-channel servo controller.
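To give a flavor of that, here's a quick-and-dirty bus scan from the Pi side (same idea as the i2cdetect utility) - a sketch for illustration, not part of the suit code itself:

    import smbus

    bus = smbus.SMBus(1)  # bus 1 on rev 2 Pi boards

    # Walk the valid 7-bit address range; anything that answers a read
    # is one of the Trinket slaves or the servo controller.
    found = []
    for addr in range(0x03, 0x78):
        try:
            bus.read_byte(addr)
            found.append(hex(addr))
        except IOError:
            pass  # nothing listening at this address

    print("I2C devices found: " + ", ".join(found))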
Firing is controlled using buttons hidden along the left side of the right-hand index finger, accessed by the thumb. The repulsor arms are “warmed up” by their position - I’m still testing whether a flex sensor or a tilt sensor is better for this.
Now to rummage through my notes to make decent explanations of the individual subsystems.
Cheers,
~Lexikitty
*Even though this is one of my biggest dreams ever, it is still, at best, a weekend warrior project. Thus the amount of resources I can justifiably pour into it is limited - but I'm kind of glad of that; it's forced me to be more creative than usual with what I have. In the future, I might implement some sort of donation system if people actually want to give money to make this better, but for now, I'll keep everything in my court in case I muck it up.
- don’t exist yet
- weren’t financially feasible or reasonable*
- required special licensing/storage/care (there won’t be tank missiles)
The original Iron Girl framework design was garbage and I’m not posting it. However, below is the most complete sketch I have of the current framework (though it is constantly evolving).
I’ll explain all systems in their own detailed posts, but for now, here’s the system overview:
The HUD is (for now) an HMD from my scrap pile running on a composite RCA connection. I’m overlaying a raspivid feed onto a Pygame user interface.
The system relies heavily on the I2C bus to get data moved around. Repulsor firing is also controlled by the I2C bus (a decision I really didn’t want to do, but ended up having too - see Repulsor notes), and the Raspberry Pi is right in the center of it all.Most of the I2C slaves are 3.3v Adafruit Trinkets, with the exception of a 16-channel servo controller.
Firing is controlled using buttons hidden along the left side of the right-hand index finger, accessed by the thumb. The repulsor arms are “warmed up” by their position - I’m still testing whether a flex sensor or a tilt sensor is better for this.
Now to rummage through my notes to make decent explanations of the individual subsystems.
Cheers,
~Lexikitty
*Even though this is one of my biggest dreams ever, it is still, at best, a weekend warrior project. Thus the amount of resources I can justifiably pour into it is limited - however, I’m kind of glad of that - it’s forced me to be more creative than usual with what I have. In the future, I might implement some sort of donation system if people want to actually give money to make this better, but for now, I’ll keep everything on my court in case I muck it up.
The Iron Girl Project - History
Once upon a time, in a galaxy far, far away….no, wait, that’s a different story.
Once upon a time, I saw the movie Iron Man (2008). I’d never been a huge comic book girl (working on it), and I wouldn’t have known Marvel from direct current. All I knew was that I needed to build a suit like that. Desperately.
So I started researching how much of it could be an actual possibility. I'd had some experience with robotics and code, so I figured that would help. But I ran into a brick wall at actually fabricating parts. I lived in a college dorm at the time, and I was already getting away with soldering and drilling - I wasn't about to push my luck doing castings or metalworking. So I closed the books on it for a while, and put it on the shelf.
I built another robot, some computers, and started gradually acquiring parts for a workshop. I got my hands on a drill press last year, and did some basic metalworking. Found it wasn’t quite for me, but I could at least do it in a pinch if I needed to. I started getting out the Iron Man suit notes again and riffling through them.
And then the Raspberry Pi came out.
At first I didn’t much care about it - I investigated the GPIO pins on it, and realized everything was 3.3v, and to top it off, there were no Analog pins. I scoffed and went back to tinkering on my Arduino.
A week or so later, while trying to run two pieces of servo code simultaneously, I realized how limited the Arduino was. I'd been tinkering with the idea of a War Machine turret-thingy, but had no clear idea of what I wanted it to do - and I ran into its processing limitations rather quickly. So I took my Raspberry Pi back out and started (grudgingly) to figure out the GPIO side of it. Thanks to Adafruit's amazing Learning System pages, I got analog data from an Arduino fed into the Raspberry Pi, and even got Python to react to it. I was hooked.
From there, things got sort of intense. Within a few weeks I had I2C up and working, and a repulsor palm being run by a Trinket controlled by Python, which would also handle sound effects. The next week, I had a Raspberry Pi camera module, and by the time Sunday rolled around, I had a vague HUD working. And here we are.
So now you’ve had your boring history lesson. Go do something awesome and exciting.
~Lexikitty
Welcome!
Hi there!
So I needed a place to dump posts of my projects and personal work, and my Facebook feed just wasn't the place. After finding Tumblr to be a bit inept, I've moved on to Blogger. It should be noted that while I'm a techie, I don't hold a ton of interest in web design or blogging. So this won't be fancy. Hope you enjoy!
Cheers,
~Lexikitty