Here are three spins. I intentionally recovered the first one poorly; the other two were more “textbook” recoveries.
I’m currently working on detecting loss of control (LOC) on an overshoot of base to final. I’m using the flight models in X-Plane, which, depending on the aircraft model, are known to be pretty accurate (there’s a certified commercial version suitable for flight training). It’s not practical to do this in an actual airplane, for obvious reasons like crashing and fireballs.
Anyway, here’s a screenshot of data gathering. I’m collecting about 50 data points 10 times per second and feeding them into a learning algorithm to teach it to predict which actions may lead to an accelerated stall and/or spin.
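As a rough sketch of how samples like these could be framed for a learning algorithm, here’s a minimal sliding-window builder. The feature names (airspeed, AOA, bank, pitch) and the two-second window are illustrative assumptions, not my actual feature set:

```python
from collections import deque

WINDOW = 20  # 2 seconds of history at 10 Hz (assumed window size)

def make_windows(samples, window=WINDOW):
    """Turn a stream of per-tick sample dicts into fixed-length windows.

    Each window is a flat list of feature values covering `window`
    consecutive ticks, suitable as one training row.
    """
    buf = deque(maxlen=window)
    rows = []
    for s in samples:
        buf.append((s["airspeed"], s["aoa"], s["bank"], s["pitch"]))
        if len(buf) == window:
            rows.append([v for tick in buf for v in tick])
    return rows

# Synthetic example: 30 ticks of fake data with a slowly increasing AOA.
fake = [{"airspeed": 65.0, "aoa": 4.0 + i * 0.1, "bank": 30.0, "pitch": 2.0}
        for i in range(30)]
windows = make_windows(fake)  # 11 overlapping windows of 80 values each
```

Each row would then get a label derived from later ticks (e.g. “stall/spin within the next N seconds”) so the model learns to predict trouble before it happens.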
Next step is to move the pitot/AOA probe to the other wing and see what difference this makes.
I’ve got Google Glass for the next couple weeks. What should I be doing with it?
I love it when someone takes the time to teach a broad audience something that most people who work on computer systems should know but probably don’t understand very well. Thanks, SparkFun!
Not only did Curiosity land safely last night (and it’s already sending images), but I have 5 Raspberry Pis on my desk at work right now.
I’m using an infrared range sensor for the real-life Hal: the Sharp GP2Y0A02YK0F, a long-range infrared proximity sensor.
While testing his motions, I noticed some pretty erratic behavior, so I decided to collect some data on the sensor values against a fixed object while Hal was not moving.
This is a graph of about a minute of Hal moving the sensor from 45 to 135 degrees in front of him and then taking a reading when the motion of the sensor stops. It’s pointed at a wall about 45 centimeters away, at a slight angle so the graph lines aren’t on top of each other.
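For reference, here’s a hedged sketch of converting the sensor’s analog output to centimeters. The GP2Y0A02YK0F’s output voltage falls off nonlinearly with distance, and a power-law fit is a common approximation; the constants below are placeholder assumptions that should be calibrated against the datasheet curve (or your own measurements):

```python
# Assumed power-law fit: distance_cm ~= A * volts**B.
# A and B are illustrative placeholders, not calibrated values.
A = 60.0
B = -1.16

def adc_to_cm(adc, vref=5.0, adc_max=1023):
    """Convert a raw ADC count (e.g. Arduino analogRead) to centimeters."""
    volts = adc * vref / adc_max
    if volts <= 0:
        return float("inf")  # no reflection detected
    return A * volts ** B

# Lower voltage means farther away, so distance drops as the ADC count rises.
near = adc_to_cm(400)
far = adc_to_cm(200)
```

Note the sensor’s useful band is roughly 20–150 cm; below the minimum range the voltage curve folds back on itself, so a naive fit like this will report a too-far distance for a too-close object.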
My results from this sensor are not within the tolerances I’d need to program him to do anything really interesting in an environment that isn’t really simple.
The good thing is that only once in this data would an “is this way further than that way” check be incorrect. Then again, he isn’t even changing position here, and the data is still pretty erratic over a one-minute period.
In reality, this is maybe not the correct sensor for this application. Maybe an ultrasonic would be better. Hal may have to have eye surgery.
Hal IS supposed to be a very inexpensive robot based solely on Arduino. I don’t expect him to be able to do very complicated things, but I was hoping for somewhat better readings on a fixed object than what I’m getting.
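One cheap thing to try before scheduling eye surgery: take a quick burst of readings at each stop and keep the median, which throws away the occasional wild outlier that a single read (or even an average) would let through. `read_sensor` here is a hypothetical stand-in for whatever returns one raw reading, and the demo sensor is simulated:

```python
import random
import statistics

def median_read(read_sensor, n=9):
    """Take n quick readings and return the median value."""
    return statistics.median(read_sensor() for _ in range(n))

# Simulated demo: a true distance of 45 cm with noise plus rare wild spikes.
def noisy_sensor():
    if random.random() < 0.1:            # occasional glitch reading
        return random.uniform(5, 150)
    return 45 + random.gauss(0, 1.5)

random.seed(1)
d = median_read(noisy_sensor)
```

The same nine-reads-and-median idea ports directly to the Arduino sketch; the cost is just a slightly longer pause at each stop before Hal trusts the number.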