Back in late December, I planted a Raspberry Pi camera at a cottage on Georgian Bay, in Northern Ontario, set to take a picture once every two minutes. I had been planning the shoot for a couple of months before the deployment: there were two Raspberry Pis involved, in case one failed somewhere during the winter. One of the Pis was set to reboot once a week, just in case the software crashed while the Pi itself was still awake. I had also written some software for the time-lapse to ensure that pictures were only taken during the daytime, and to try to maintain a balance of well-lit, consistent images over the course of each day.
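The daytime-only gate can be sketched roughly like this. This is a minimal illustration, assuming a fixed shooting window; the actual deployment presumably computed sunrise and sunset for the cottage's location, and the `DAWN`/`DUSK` times here are made-up values.

```python
from datetime import datetime, time

# Hypothetical fixed daylight window; the real code presumably derived
# these from sunrise/sunset at the cottage's latitude and longitude.
DAWN = time(7, 30)
DUSK = time(17, 30)

def is_daytime(now=None):
    """Return True if the given (or current) time falls inside the shooting window."""
    now = (now or datetime.now()).time()
    return DAWN <= now <= DUSK
```

A cron job or capture loop can then simply skip the shot whenever `is_daytime()` returns False.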
In spite of all the planning, I had a sense that something would go horribly wrong, and, indeed, when we showed up to the cottage, the windows were completely frosted over. The cameras had to be placed inside, so we figured we would mainly see the back side of an icy window when we retrieved the cameras. Or that the camera boards would literally freeze after about a week of sub-zero temperatures in the unheated cottage. Or that a raccoon would find its way in and gnaw through the shiny Lego cases. Or something else entirely unplanned for.
So it was a bit of a surprise when it turned out that the shoot went perfectly. We retrieved the cameras about a week ago, on May 7th, and found over 42,000 photos waiting for us on one of the cameras and somewhat fewer on the other. Both cameras had survived the winter just fine!
All told, I think the result was really cool! The video at the top is the ‘highlights’ reel, with all of the best days. It comes to 13 minutes at 18 frames per second. It turns out it was a fantastic winter for doing a time-lapse, with lots of snow storms and ice. There’s even the occasional bit of wildlife, if you watch closely. I’ll post the full 40-minute time-lapse on YouTube sometime next week.
Recently I’ve been playing with building a regression model for the brightness of images produced with the Raspberry Pi’s camera board. Essentially, I want to quickly figure out – hopefully from a single image – what shutter speed and ISO to choose to get an image of a given brightness.
This is a pretty standard regression problem: we take some data, extract some information from it, and use that information to make a prediction. To get a better handle on the algorithms involved, I wrote my own code to perform the regression, using NumPy for fast linear algebra operations. You always learn something from re-inventing the wheel, after all.
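The core of a hand-rolled linear regression in NumPy is just ordinary least squares. Here's a minimal sketch under my own assumptions (the features, log shutter speed and log ISO, and all the numbers below are illustrative, not the post's actual data or code):

```python
import numpy as np

def fit_linear(X, y):
    """Ordinary least squares: find w minimizing ||Xw - y||^2,
    with an intercept column prepended to X."""
    X = np.column_stack([np.ones(len(X)), X])
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def predict(w, X):
    """Apply the fitted weights to new feature rows."""
    X = np.column_stack([np.ones(len(X)), X])
    return X @ w

# Toy data: image brightness as a function of log shutter speed and
# log ISO (made-up numbers, just to exercise the fit).
X = np.array([[np.log(s), np.log(i)] for s, i in
              [(1000, 100), (2000, 100), (4000, 200), (8000, 400)]])
y = np.array([40.0, 55.0, 80.0, 110.0])
w = fit_linear(X, y)
```

Inverting the fitted model then gives a guess at the shutter speed and ISO needed to hit a desired brightness from a single test shot.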
I’ve done quite a bit to improve my Raspberry Pi-based timelapser. I have a pretty good system together now, thanks in part to recent firmware updates to the Pi which allow me to directly control the shutter speed and ISO of the camera sensor instead of relying on auto-exposure settings. The auto-exposure is pretty unreliable from one shot to the next: the camera makes a different decision each time it snaps a picture, which leads to quite a lot of flicker. Previously, I was dealing with this flicker by manipulating the images in post-production, but I’ve now written some code to get the camera to try to maintain a constant image brightness across a long shoot.
The code now consists of a ‘timelapser’ class, which keeps track of its current shutter speed and ISO (SS/ISO henceforth), and the brightness of the last few images taken. It then adjusts SS/ISO to try to get the image brightness to 100. By keeping track of the last few images, it is a bit less susceptible to being upset by one strange image (like, say, if I put my hand over the lens for one shot, producing a black image), or more standard movement within the frame. On the other hand, it takes a while longer to settle down to the ‘right’ SS/ISO. So it’s currently set up with an initialization step, where it finds a good SS/ISO pretty quickly, and then transitions to actually taking pictures. The result is very little flicker as the timelapse goes on, and a pretty constant level of image brightness when light levels gradually change: like when we watch dawn or dusk. (If you’re interested in playing around with the code, I’ve set up a GitHub repository here.)
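The feedback idea can be sketched in a few lines. This is an illustrative toy, not the actual timelapser class from the repository: it only adjusts shutter speed (not ISO), and it assumes brightness scales roughly linearly with exposure time, so scaling the shutter by `TARGET / average` pushes subsequent frames toward the target.

```python
from collections import deque

TARGET = 100.0  # target mean image brightness, as in the post

class ExposureController:
    """Toy brightness-feedback loop (not the real timelapser code)."""

    def __init__(self, shutter_us=10000, window=5):
        self.shutter_us = shutter_us        # shutter speed in microseconds
        self.recent = deque(maxlen=window)  # brightness of the last few frames

    def update(self, brightness):
        """Record a frame's brightness and nudge the shutter speed.

        Averaging over the last few frames means one wild frame
        (say, a hand over the lens) only shifts the estimate a little.
        """
        self.recent.append(brightness)
        avg = sum(self.recent) / len(self.recent)
        self.shutter_us = int(self.shutter_us * TARGET / max(avg, 1.0))
        return self.shutter_us
```

For example, a frame that comes back at brightness 50 roughly doubles the shutter time, while frames already at 100 leave it unchanged.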
As an example, this is a video that we shot over about three days on my friends Ketan and Ananya’s balcony. They have a great view over Toronto, from the CN Tower to Honest Ed’s.
Almost immediately after my first foray with the Raspberry Pi timelapse I was contacted by Cookie Roscoe, who coordinates a weekly farmer’s market for The Stop. The Stop is a community food center here in Toronto, providing lots of programming and emergency food access for lower-income people. My partner, Elizabeth, worked with them for about three years, so I was super happy to try doing a time-lapse of their market.
Here are the results! Lots of technical stuff about what went into the video after the break!
My recent DIY electronics project has been putting together a Raspberry Pi-based camera. The Pi foundation sells a camera board which plugs into the Pi; it’s sold as ‘a bit better than the average camera in a mobile phone.’ The Pi’s default Raspbian Linux installation comes with a couple of programs for controlling the camera, and lets you take still pictures and videos easily.
In the interest of using the camera for something you can’t usually do with a store-bought digital camera, I wrote a short Python script which takes a photo and assigns it a file name based on the date and time taken. It then does some sampling of the picture, and only keeps pictures which aren’t ‘too dark.’ And then cron runs the photo script once every five minutes. In other words, the Pi is set up for long-form time-lapse photography. The resulting pictures are then easy to compile into a movie.
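The skeleton of such a script might look like this. It's a sketch under my own assumptions, not the original script: the `DARK_THRESHOLD` value and file paths are made up, and the brightness check here just averages a handful of sampled pixel values.

```python
import subprocess
from datetime import datetime

DARK_THRESHOLD = 20  # assumed cutoff for mean sample brightness (0-255)

def timestamp_name():
    """Build a file name from the date and time the photo is taken."""
    return datetime.now().strftime("photo-%Y-%m-%d_%H-%M-%S.jpg")

def is_too_dark(samples, threshold=DARK_THRESHOLD):
    """Decide from sampled pixel brightnesses whether to discard the shot."""
    return sum(samples) / len(samples) < threshold

def capture(path):
    """Shoot a still with the stock raspistill utility (Pi-only)."""
    subprocess.check_call(["raspistill", "-o", path])
```

The every-five-minutes part is then a one-line crontab entry along the lines of `*/5 * * * * python /home/pi/timelapse.py` (the path is hypothetical).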
I was recently helping out a friend with a headless Raspberry Pi setup, and thought it would be helpful to consolidate a few useful bits here. From here, you can set up all kinds of cool projects using the GPIO pins, set up a headless web server, or anything else you can think of. For my part, when I hurt my ankle a few months ago, I hooked the Pi into a hard-to-get-to stereo system and logged in remotely from the other side of the room to play music… I also used a headless setup to run the really long compile for the Sage computer algebra system a few months ago.
This weekend I took another trip out to Amagoro to see the new Amagoro library, opened by a joint effort of Kiwimbi Global and the Amagoro city council. The library opened on February 15th, while I was on a trip to Nairobi, and by all accounts has seen heavy traffic ever since.
I laid the groundwork to leave a couple of Raspberry Pi computers at the library some time after the elections; right now they’re still working on getting electricity together. In the meantime I left a Pi with Jevin, the tech guy for the Elewana project, so that he can become familiar with the system.
I also met with three groups of primary school students, about to take their final exams before going on to secondary school. With all of the groups, I talked about how computers work, and the importance of math and computers to all of the various future occupations they were dreaming about, ranging from nurses to engineers. (One student wants to be a ‘computer wizard’ when he grows up!) Hopefully planting some Pi’s with interesting resources will help some of the students get where they want to be.
As the term gets underway, I’m working on a number of projects trying to address some of the issues that I discussed in the Looking Backwards post… I was chatting with Thomas Mawora yesterday, and listed off all the ongoing projects I could think of, and came up with five. (Or up to seven, depending on how you count it…) It’s a lot, but luckily there’s a good deal of overlap, so work in one place often helps another project move forward. If you’re going to spread yourself thin, you might as well be maximally efficient about it.
After a great deal of effort, I’ve finally finished compiling Sage 5.5 on the Raspberry Pi. It seems to be basically functional; it starts and adds 2+2 successfully. (So already it’s doing better than Trurl’s Machine.) I’m currently running the full test suite, which could very well take a few days. We’ll see when it gets there. For now, here’s a link to a binary; drop me a line if you have trouble with it. To get started, extract it with:
tar -xvpf sage-5.5-pi.tar.bz2
Then cd into the resulting directory and run “./sage” from the command line, or set up a link which runs “[full-path-to-sage]/sage -notebook”, which will automatically open a Sage notebook in a browser.
If you’re curious about building Sage yourself, there are details after the jump. It requires a bit of blood, something like 3 days of processor time, and somewhat less than that again in extra time lost to swapping.
One of the many projects we’re planning to run involves getting some Raspberry Pi computers into rural libraries and/or community centers and giving youths a chance to learn programming. We’re particularly looking at bringing on just-graduated students, who typically have about eight months of dead time between the end of secondary school and the start of university.
So I’ve gotten my hands on a couple of Pis; the last few days I’ve really started hacking around with one in particular. My short-term goal is to get a working Sage binary together, built for the Pi. (Admittedly, I haven’t tried the ARM binary at sagemath.org, but it claims to not be built for the particular processor in the Pi.) It will generally be a good public service to get a Pi-ready build of Sage out there in the world. One will probably need an 8 GB SD card to run it properly, though, as the binary will probably weigh in at a bit over 2 GB.