Corrections/comments are very welcome, especially if I get the technical details wrong. Please email flippy.qa76 AT gmail.com.
If you are using servos with Arduino, sometimes the results can be...not what you expect. My example is that I was using continuous-rotation servos as the drive motors for a mini-sumo 'bot. The (greatly simplified) control loop for the motors looked like this:
- take distance measurement from ultrasonic sensor
- adjust servos' speeds accordingly (using Servo.write())
And I'd get all kinds of wonky behavior, mostly in that it didn't appear the servos were getting any kind of speed updates at all. It was *supposed* to move straight forward when the sensor detected another object in front, but in most cases it would act as if the opponent wasn't visible. (And yes, I checked and double-checked that the sensors were working...)
It turns out this is a timing problem. Before I cut to the punchline, let's look at the issues in play here.
Arduino loop() function timing
The first thing to look at is how often the loop() function is called. For the sake of simplicity, and to get a ballpark figure, we'll assume loop() is empty, i.e., it is running about as fast as possible. I'm too lazy to run my own tests on a modern Arduino, but this thread post gives us a starting point of roughly 260 kHz for an empty loop. That means loop() gets called roughly 260,000 times a second, or about 260 times every millisecond (ms). I'm going to make a wild guess and say that, loaded with some actual executable code, loop() runs on the order of once every millisecond or so. Keep this in mind.
For background, a standard RC servo expects a control pulse about once every 20 ms. Now compare this to the loop() function timing above. The ratio is 20:1, meaning loop() executes about 20 times for every time a servo expects a command. Obviously the actual frequency of loop() depends on the code in it, but it's fair to say it can send commands to a servo faster than the servo can react to them.
Next, consider what actually happens when the Servo::write() method is called. By way of background, the Servo library uses one of the ATmega's built-in timers to create the pulse signals sent to a servo connected to an Arduino. This timer is a piece of hardware that periodically generates an interrupt -- a signal that pauses the normal Arduino program to run a special interrupt service routine, or ISR. When the ISR is finished, the Arduino program picks back up wherever it left off. When you use the Servo library, it uses the ISR to send the appropriate control pulses to all the attached servos.
The thing to understand here is that Servo::write() does not immediately affect the attached servo. Instead, it simply records the position (or speed) the servo should have when the ISR runs again. In other words, you can write() a value to a servo and have other things in your program happen before the servo actually responds. Worse, loop() could run a bunch of times before this happens, so you could actually write() to a servo a bunch of times, with different values, before it responds (and only the most recent value takes effect).
One more thing to check -- how does the timer's period hold up to our assumption that we're sending servo commands every 20 ms? The answer is in Servo.h, the header file for the Servo library, where you will see the line:

#define REFRESH_INTERVAL 20000 // minumim time to refresh servos in microseconds

REFRESH_INTERVAL is in microseconds (1/1,000,000ths of a second). Doing the math, 20,000 microseconds equals...20 ms. Digging into the ISR code for the timer, it turns out that the ISR (which runs MUCH more often than once every 20 milliseconds) checks to see if 20 ms have elapsed since the last "refresh" of the servos. If it has, then and only then will it send a new control signal to the attached servos.
I don't care, just tell me what I need to do to get my Arduino sketch working
Fine, fine. In the end, you just need to make sure that you only call write() on a Servo object once every 20 ms or so. Maybe a little longer, to give the servo's hardware some time to do its thing -- remember, the servo itself has some control logic that takes a little while to work.

From a code standpoint, this could be as easy as putting delay(20) at the end of loop(). This will work if there isn't really anything else other than servo control going on. On the other hand, if you're trying to interleave several independent tasks, you may need to set a "timer" of your own. The outline of that code looks something like this:
// "timer" variable, in milliseconds. remembers the last time an update happened
// MUST be of type unsigned long!
unsigned long lastUpdate = 0;
// time between updates (20 ms for servo control)
unsigned long updatePeriod = 20; // in milliseconds
// millis() returns the number of milliseconds since the Arduino turned on
unsigned int now = millis();
// check to see if updatePeriod milliseconds have elapsed since lastUpdate
// if so, update the new position and reset lastUpdate
if( (now-lastUpdate) >= updatePeriod)
lastUpdate = millis(); // reset our countdown timer
// can do other stuff here without servo delays getting in the way
In practice, I found that performing Servo::write()s works a little better with a period longer than 20 ms, simply because the servo hardware can't move that fast. Depending on the particulars of your application, anywhere from 50 ms to 200 ms could be more appropriate -- you'll just have to experiment to find out.
The fam and I will be headed to the beach in early June. Now that we are 3, this means we have 10 times the stuff to carry, so I bought a trailer hitch and cargo rack for our CR-V to expand our carrying capacity. Hopefully I'll be able to install the hitch this weekend.
Most of those reading (all three or so of you...) have heard bits and pieces of the last few days. I've finally gotten a chance to sit and collect my thoughts on paper (well, electrons and magnetic dots on a Google-owned hard disk somewhere). I want to remember this when I'm old and gray. Those of you interested in the gory details read on.
Last Friday -- September 21, 2012, the final day of summer -- my wife, Tiffany, and I went to the perinatologist for her last-trimester weekly checkup. We're about 3 weeks shy of the official October 11 due date. She'd been on light bed rest that week since her blood pressure was misbehaving. Well, her blood pressure was still misbehaving and the perinatologist, Dr. Mann, basically said "I don't think you're leaving here without a baby." We were ecstatic, surprised, and not a little anxious, despite having our bags in the car for "just in case". So down to the Labor & Delivery wing we go, where Tiff gets hooked up to all kinds of monitors and IVs and machines that go "PING!" and we waited for a couple of hours for the on-call OB/GYN, Dr. Schapiro, to see her.
It didn't take Dr. Schapiro long. When she found out Tiff had a headache and some blurry vision, her words were "that scares me". After a little further discussion, it was decided to perform a C-section to reduce the risk of stroke. This even got us bumped to the front of the C-section queue, so things started happening fast at that point: usher Tiff out for prep, usher me into scrubs, and then to the operating room a few minutes later. Somewhere in there I managed to call both my parents, my sister, and my in-laws.
I'm all nerves, so from my point-of-view chaos ensues. Everyone was telling me to take pictures! Which I did, of course, even when they said "he's half-out, take a picture!". (That last one, in retrospect, was pretty creepy.) I think my first words to Tiff, after seeing little Jake, were "he's got HAIR!". Mostly I was worried about Tiff, so I gravitated to her to make sure she was okay. Eventually one of the anesthesiologists grabbed my camera and started taking pictures for me (especially ones involving me, Tiff, and/or the baby).
After he was born, they gave me a choice of going with the baby or staying with Tiff. As much as I wanted to spend time with my new son, I elected to stay with Tiffany. I was still very worried about her. Eventually we ended up back in the recovery area, where we finally got to properly meet our new son.
He, of course, is the most gorgeous, precious, beautiful thing on the planet. Scientific, peer-reviewed, bias-free fact.
At birth he was 6 pounds, 15 ounces and 19-3/4 inches long. He was born at 6:29 pm. Gestationally, he was 37 weeks along. I'm nervous about that, but so far all signs are that he is in excellent health.
We stayed in the "monitor Tiffany closely" room that night, keeping an eye on her BP, which, blessedly, started coming down almost immediately. Not much sleep that night, which, it will surprise no one, was simply the first of many little-sleep nights to come. The next day we moved to a normal room (Piedmont 262, for the detail-oriented folks in the audience).
It's at this point I should mention a major plot point, until now un-revealed. Our trip to the hospital on the day Jake was born began with us visiting Piedmont room 264 (next door to what would later be our room)...where Tiffany's sister Sheri was staying, having given birth to a darling boy of her own just one day prior. The OB who would later in the day deliver our son even visited their room while we were there.
Think on that for a moment. Two cousins, born a day apart, in the same hospital, and their families in adjacent rooms. Joy was boundless. Grandparents on both sides are over-the-moon; these are the first grandchildren on either side. My folks hit the road as soon as they knew the C-section was to happen. Tiffany's parents are local, so they were here as quickly as possible.
One day I'm an uncle for the first time, the next, a father.
So we spent most of five days in the hospital, recovering with Jake, learning from him, learning how to nurse, how to change diapers, how to swaddle, how to survive on 2 hours of sleep (max) at a time, how to appreciate the awesomeness of getting 5 minutes to shave. I actually lost a couple of pounds, even though I ate quite well. :) The biggest challenge though, and one that is still ongoing, is nursing. If anyone told us it would be this difficult, I apparently wasn't listening. Things are finally starting to work, but it's been a hard slog and we're still not where we need to be...but we're close. My parents have been staying with us this week and have been amazing: taking care of our cats, cooking and cleaning, and just generally keeping us going.
Needless to say, we're smitten with little Jake.
Fun article on Ars Technica about things going horribly wrong with technology, causing data loss, etc. Anecdote #2 reminded me of a situation of my own making:
At one point I, too, was dual-booting a machine with Windows and Linux, and had the Windows partitions (IIRC I had two -- one for programs and one for data) mounted on the Linux side in read/write mode. Eventually, however, I had to re-install and re-partition Windows. The problem was that the Linux installation still had the partition geometry of the previous installation and was accessing the hard disk using those settings. I think after the first boot into Linux the entire Windows installation was corrupted: Linux was writing to what it *thought* were free chunks of hard disk but which really held necessary Windows data.
Hard lesson learned.
A Facebook friend posted a link to this picture by Nick Ladd (click through for original on Flickr):
This is brilliant, on so many levels. For those who don't know, it is a Minecraft-themed parody of the famous "A Sunday Afternoon on the Island of La Grande Jatte" by Georges Seurat (which I have been fortunate to view in person, having visited the Art Institute of Chicago a few years back; it's also the one uptight sidekick Cameron trips out on in Ferris Bueller's Day Off). Seurat's painting is also a well-known example of a painting technique called "pointillism" where the picture is made up of tiny dots of paint, as opposed to more traditional techniques that use brush strokes.
Now, it turns out, pointillism (so I'm told, but sadly I can't find a reference) helped lead the way in the development of computer graphics, which, as you probably know, are really just a big matrix of colored dots. And like Seurat's painting, our brains blend those discrete dots into a cohesive picture.
Which leads us to Minecraft. The Minecraft world is made up of big cubic "dots" giving a 3D pixellated vibe. Even though the mountains and trees and creepers and such in the game don't really look like their real-world counterparts (yes, I know Creepers aren't real; just go with it) our brain discerns the general shape enough that it works.
Now back around to the Inception-like nature of Ladd's painting above. We have a scene using 3D pixellated actors and objects from Minecraft, represented in a pixel-based computer image, inspired by a painting that in turn played a role in the fundamental development of computer graphics.
A Sam Muirhead left this very good comment on an article about Open Source and 3D Printing:
Another of the wonderful effects of allowing everybody to tinker is that it can create an explosion of highly specific and creative uses in all sorts of different areas. You may have built a gadget, only thinking about its use in robotics. However, in the right hands, with some lateral thinking and plenty of hacking, trial and error, it might be adapted for use in beekeeping, in trainspotting, in tailoring or aeronautics - all opportunities which would have never existed with a closed-source model.
Quite so. Until recently, the tools and infrastructure needed to develop a new product were expensive, requiring financing via corporations, venture capitalists, etc. That means you need a business model capable of recouping that investment and paying back your creditors. Which in turn means you have to have a product with broad enough appeal and/or utility that a lot of people are going to buy it. With that in mind, niche uses for technology are often simply not cost-effective from a business sense.
With Open Source, the inputs become cheap enough that someone can build one-offs for niche uses without the massive start-up cost traditionally associated with technology development. Throw in the Internet, which makes communicating your new gizmo to the masses cheap and easy, and the lateral idea-transfer Mr. Muirhead mentions becomes reality. (Another thought: education is an important and, traditionally, expensive input to product development. The Internet has brought the cost of that down, too. Want to know how to wire up a potentiometer or program a depth-first search algorithm? Get thee to Google! Someone has probably written a web page about it.)
Now this isn't some screed about how Open Source is turning product development on its head. Traditional (expensive) product development still has a place and will for a long time (good, because I'd much prefer an iPad to some hacked-together-in-my-garage "tablet"). As a basement tinkerer and CompSci educator (hats I wear that both operate on a limited budget), though, I revel in the freedom to decide what my tech needs are and how they should be addressed. Contrast this to the traditional top-down model that says some other company gets to decide what my needs are. Government policies, which until now have mainly favored traditional models (primarily through patent laws), need to change so that the playing field is level for both models.
Among the many hats I wear, one is that of Associate Professor of Computer Science at a regional teaching university. Occasionally, I receive an email message similar to the one excerpted below:
Dear Dr. Baumstark,
I am [Joe Bob] from [a foreign country]. I am now an associate professor of [university I've never heard of]. I am interested in applying for a postdoctoral on digital watermarking, information hiding, and information security, etc in your laboratory. I have attached my CV to this email for your review.
The email continues in the form of a typical job-application cover letter for a faculty position in a university: summaries of degrees received, research performed, papers published, etc. In fact, Dr. Joe Bob (not his real name) writes an excellent cover letter. I'm sure he's well-qualified for that which he seeks.
The problems with it, however, are multiple:
- I did not advertise a postdoc position, most likely because I don't have one. We shall assume, for brevity, a causal relationship between these two data points.
- I do not have a research laboratory. Further, I have no web page chronicling the achievements of my laboratory, nor publications generated by my laboratory -- nothing to indicate to the world outside the existence of a research laboratory. All this is largely a consequence of the fact I do not have a research laboratory.
- I do not engage in research in "digital watermarking, information hiding, and information security, etc". And though I have a publication list so enormously long it would be easy to lose track of what I have and have not published [reality: I know undergrads with longer pub lists than me], I don't recall publishing anything in those areas, so it's unlikely Dr. Joe Bob -- someone who has supposedly proved, by virtue of receiving a PhD, that he is able to perform a basic literature search -- found my name in a search of research pubs in those fields.
Most likely verdict? Email carpet-bomb. Dr. Joe Bob sent his CV and cover letter to as many faculty email addresses as he could find, in the hopes one would catch. Hence the title of this post: "Academic Phishing". I probably get one of these every semester or so, always from foreign students wanting a postdoc position or a graduate research position. I understand, really I do, that many are probably desperate to find work in the US -- anything to get a visa and, perhaps later, citizenship. But come on. Do your homework and target actual advertised positions. Find people who actually do research in your field and have actual dollars to spend on your salary/assistantship. Don't clutter up my inbox and force me to write a short-and-polite-but-with-snarky-subtext response to your form letter email.
The double-oops.org domain is down for the rest of the day due to changing my domain registrar -- and I'm stuck at work without the password to update the CNAME records. The site is still available at https://sites.google.com/site/doubleoops/