Come vibe with me: Timelapse for tightwads
| Shark under construction. |
It's been a while.
Sorry, I feel like I owe all you adoring fans an apology... I've not updated much lately. I've been busy with a few different things. Life, mainly. But also a new app I've been working on (more on that later) and a game (more on that much later) and some books (more on that...maybe never...let's see).
In the midst of these efforts, my experiments in vibe coding are continuing, most recently resulting in this excellent typing game where you compete in dinosaur jousting:
| I know it's not pretty...but whaddya want from a single prompt!? |
But, anyway, I've also been back to my 3D printing over the last month - and doing some of my typical nerdery...So I thought I'd drop an update about that.
So...the whole thing was driven by an upgrade to my machine:
So, why did you upgrade, James?
A few reasons...
- The A1 Combo is multi-colour. That's new since I bought my old machine. I thought it would be cool to give it a go!
- Increased build volume... I'd been frustrated sometimes by the relatively small build volume of the Finder (14cm x 14cm x 14cm), so the larger (24cm x 24cm x 24cm) volume of the A1 is appealing.
- Increased speed. I didn't really realise how much faster 3D printers had gotten in the 5 years since I bought the Finder... but a lot of stuff can be printed in half the time now... So that's good!
- Increased accuracy. I was becoming aware that the "3D printing dude" on our local market was selling things that I wasn't able to make myself... such fine detail. I wanted to see if I could make these things myself.
- Better software. The Finder was never really "mainstream", so I was fairly limited in my choice of slicer software etc. I thought that something more common would give me better options.
- Bigger community. Likewise - not many people use the Finder, so there's a limited community around it, which means fewer parts, designs etc.
- Just for fun. I mean, everyone likes new toys, right?
Getting started
I started building a little Benchy boat... the design was built into the machine, and I just wanted to make sure the whole thing worked.
Yep, it works.
So, next up, I printed some AMS mounts. This would enable me to mount the AMS (the thing with all of the rolls of filament on) on top of the machine, instead of having it by the side. It also comes with a couple of braces to stop the machine moving too much with the additional load.
| Mounts printed. Supports come away much more cleanly on this printer than on the old one, which is really handy. |
It was around now that I tried to use the onboard "timelapse" feature of the printer. From what I can tell, they basically only include this on the printer so that everyone can say "the timelapse is rubbish... buy a more expensive printer instead". You can see that the result isn't ideal... It's not great quality, it's taken from too far to the side, and there's quite a lot of fisheye distortion, too:
So, suitably disappointed with this feature, I moved on to my next print... a poop bucket. This was kinda necessary... without it, the printer just hurls discarded bits of filament (or "poop") all over the place... and for the printer to go where I wanted it, this would mean fishing a lot of poop out from behind cabinets, which wouldn't be fun.
| Funny name, but very practical addition. |
And with that, I had everything I needed to put the printer in its forever home:
| If God wanted every 3D printer to be the same, he'd have given them all braces on their Y axis. |
So, now the printer was in place, what to print first? A polar bear automata, of course.
This thing was one of the reasons I bought a 3D printer in the first place. I wanted to be able to build things like I see at the MAD museum, and thought a 3D printer would help me do it. I only actually got around to trying this print last year, though, and the result was... frustrating. All the tolerances were just... off... so it fit together, but it didn't move very easily; it felt like there was too much force involved. So, would my new, more accurate, printer do a better job? Of course it would!
Next up, a bit of cable management for the machine, to make it look more "office-ready". Tuck that cable in:
| This little clip looks simple, but it slides really nicely and was print-in-place. Very cool. |
| Taking a rogue, floppy cable and hiding it inside this badass chain has to be the most satisfying thing I've done... |
Keep this cable tidy:
| A small milestone, this is the first model I uploaded back to the community...albeit a "remix" of another similar design. |
I then printed a load of sharks for my daughter's friends. I think they became a whole social currency, before the market collapsed only a couple of days later:
We're going to need a bigger...desk?
Finally working our way round to the point...
Next up, I printed a mount for a camera... specifically a Tapo TC60.
A lot of people use these kinda indoor security cameras to monitor their prints. And I had some points for the Tapo shop that covered the cost of this. (I don't think I ever told you guys about my experiments with thermostatic radiator valves! Can't believe you missed out on that!)
So, I set it up and could, indeed, monitor prints remotely...which is, in itself, somewhat useful. But I discovered there's no way for this thing to record a snazzy timelapse of whatever's being printed. And that sucks, because I want to see snazzy timelapses! It could do videos, it could do stills, just no snazzy timelapse.
The science bit...
Key to understanding what happens next is the way the printer's inbuilt "timelapse" works. Every time it finishes printing a layer, it moves the bed to a set position, and moves the print head back to a "home" position, so it will be out of shot. It then takes the photo, and then carries on where it left off...and repeat.
So, knowing this was the case, I was like "surely I could just use the camera to record a video...then find the frames where the bed is in the right place, and the print head is in the right place...and use those frames to build a timelapse?"
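The core of that idea can be sketched in a few lines. To be clear, this isn't the code Claude actually wrote — just a minimal illustration, assuming grayscale frames, a hand-picked reference shot of the "head parked" pose, and a made-up threshold:

```python
import numpy as np

def select_timelapse_frames(frames, reference, threshold=10.0):
    """Pick the frames that look like the 'head parked, bed in place' pose.

    frames: iterable of 2-D grayscale arrays (one per video frame)
    reference: a 2-D grayscale array captured while the head was parked
    threshold: max mean absolute pixel difference to count as a match
    """
    selected = []
    for i, frame in enumerate(frames):
        diff = np.abs(frame.astype(float) - reference.astype(float)).mean()
        if diff < threshold:
            selected.append(i)
    return selected

# Tiny synthetic demo: frame 1 matches the reference, frame 0 does not.
reference = np.zeros((4, 4))
moving = np.full((4, 4), 200.0)   # head mid-print: looks very different
parked = np.zeros((4, 4))         # head parked: matches the reference
print(select_timelapse_frames([moving, parked], reference))  # → [1]
```

In practice the comparison would run over frames pulled from the video file, and the threshold would need tuning per setup.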
So, I asked my now beloved Claude:
| Only slightly paraphrased. |
I did a few iterations of the resulting program, and got to results that weren't bad... but certainly weren't really what I was after:
I tried a few times to improve the quality of my results, but didn't really get anywhere... and it got to the stage where I decided I needed a different approach.
Another science bit...
The issue of the A1's timelapses being a bit rubbish is well known. And lots of people have come up with ways of fixing it. Amongst the most common is a solution for using your DSLR to take photos at the right time.
Basically, you print a cradle for a camera remote. This cradle positions the remote such that when the print head returns to its "home" position in timelapse mode, the shutter button is pressed, which causes the SLR to take a photo... and you can later stitch together these photos into really good looking timelapses.
This is a good solution... and maybe I should have done it. But I don't want to have my DSLR continuously set up pointing at the printer. It's just more clutter. And I've already got two cameras on the printer, do I really need a third? And...most importantly.. I was just a bit bloodyminded that I wanted to be able to do this with the little Tapo camera that I'd just spent my hard earned reward points on.
So... my next idea. If I could use a similar mechanism to the SLR shutter remote, but just make a light flash red or something... couldn't I just tell software to pull out all the frames where the light was red, and stitch those into a timelapse?
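A first pass at that detection might look something like this — again just a sketch rather than the real thing, with a made-up "redness margin" and a simple average over the region around the light:

```python
import numpy as np

def light_is_red(frame, roi, margin=40):
    """Naive check: is the region around the light red, on average?

    frame: H x W x 3 RGB array
    roi: (top, bottom, left, right) bounds of the area around the light
    margin: how much brighter red must be than green/blue, on average
    """
    top, bottom, left, right = roi
    patch = frame[top:bottom, left:right].astype(float)
    r, g, b = patch[..., 0].mean(), patch[..., 1].mean(), patch[..., 2].mean()
    return bool(r - max(g, b) > margin)

# Synthetic demo: a pure red blob in an otherwise black frame.
frame = np.zeros((10, 10, 3))
frame[2:5, 2:5, 0] = 255
print(light_is_red(frame, (2, 5, 2, 5)))  # → True
```

As you'll see shortly, averaging like this turns out to be the weak spot.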
So, I whipped out a raspberry pi (pico) and wired something up:
| I wrote all the Python myself! |
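The script was along these lines — a reconstructed MicroPython-style sketch, not the exact code, with guessed pin numbers (the onboard LED standing in for the little screen), plus a debounce helper to smooth over shaky-soldered switch reads. The helper is plain Python, so it runs anywhere:

```python
# MicroPython-style sketch: light up red while the printer presses the switch.
try:
    from machine import Pin   # only available on the Pico itself
except ImportError:
    Pin = None                # lets the helper below run on a normal PC

def debounced(samples, needed=3):
    """Treat the switch as pressed only after `needed` consecutive closed
    reads - a shaky solder joint makes individual reads unreliable."""
    run = 0
    for closed in samples:
        run = run + 1 if closed else 0
        if run >= needed:
            return True
    return False

if Pin is not None:
    import time
    switch = Pin(16, Pin.IN, Pin.PULL_UP)   # guessed wiring: closed pulls low
    led = Pin(25, Pin.OUT)                  # stand-in for the red screen
    while True:
        reads = [switch.value() == 0 for _ in range(3)]
        led.value(1 if debounced(reads) else 0)
        time.sleep_ms(20)
```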
I then modified one of the SLR remote cradles to house the little button. You can see the little metal legs of the button sticking out, and the black button itself just behind the actuation pin:
A couple of shaky solders later, and we were ready to test.
Success. So, I told Claude to change the software to detect my little red screen and printed myself another little Benchy, to see if it would work.
This was... encouraging. The aesthetic of the timelapse was just what I wanted... but why did it only start halfway through??
A bit of digging and I discovered the issue... Essentially, the red light wasn't red on film. If you look right in the middle of the red light, it's not red... it's just white. You only know that it's red because of the red "aura" around the outside... so I made it so it took the aura into account (and also made the red light less bright!), and...
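In sketch form, the aura-aware version counts red-dominant pixels across the whole region instead of averaging — so the blown-out white centre no longer drowns out the red ring. Again, this is illustrative rather than the actual code, with made-up thresholds:

```python
import numpy as np

def light_is_red(frame, roi, margin=40, min_fraction=0.15):
    """Aura-aware check: the light's centre blows out to white on camera,
    so count the fraction of pixels in the region that are clearly
    red-dominant - the aura ring supplies those."""
    top, bottom, left, right = roi
    patch = frame[top:bottom, left:right].astype(float)
    red_dominant = (patch[..., 0] - np.maximum(patch[..., 1], patch[..., 2])) > margin
    return bool(red_dominant.mean() > min_fraction)

# Synthetic frame: a blown-out white centre surrounded by a red aura ring.
frame = np.zeros((10, 10, 3))
frame[3:7, 3:7, 0] = 255   # red square...
frame[4:6, 4:6] = 255      # ...with a white (blown-out) centre
print(light_is_red(frame, (3, 7, 3, 7)))  # → True
```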
Ohh yeah! Now we're talking.
With that result, I was pretty sure this was the right approach, so I went ahead and made myself a more permanent home for my little red screen:
| The USB is for power. The jack socket connects to the switch on the printer. |
And from a hardware perspective, that was us done. So, let me show you the software Claude wrote for me... where the magic really happens:
"Timelapse Studio"
You start by uploading your video file:
Then you tell it where your little screen is located. Remember to select an area that includes the "aura", as well as the white light in the middle...

Then the algorithm works its magic... complete with handy debug output:
You can choose which frames get included in the timelapse... either by adjusting the sensitivity of the algorithm with the slider, or by deselecting outlier frames etc. There's a nice on-hover preview if you want to see an image in more detail:
And then, finally, some output settings. Output framerate, noise reduction, cropping... All things, let's face it, that I'd probably not have bothered to implement myself. But when it's as easy as just asking Claude to do it, you might as well. He actually did the framerate all by himself, just because he thought it would be useful.
Pushing Timelapse Studio further
I've made a few tweaks since capturing these screenshots.
Firstly, the program is now able to take multiple video files as input instead of just one. The camera actually only records in half hour chunks... so for longer-running prints, having it stitch multiple files together just makes life easier.
Secondly, when you get on to larger prints, the camera rises up more... and that means that my little red light sits lower down in the image. It was creeping out of the area the algorithm was looking at. So now you can set several positions for the algorithm to look at throughout the video, and it interpolates between these accordingly.
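That interpolation is simple enough to sketch — a reconstruction of the idea rather than the actual code, with a made-up keyframe format of (frame index, x, y):

```python
def roi_centre_at(frame_idx, keyframes):
    """Linearly interpolate the light's (x, y) position between user-set
    keyframes, since the light drifts downward in shot as tall prints grow.

    keyframes: sorted list of (frame_idx, x, y) tuples set by the user.
    """
    if frame_idx <= keyframes[0][0]:
        return keyframes[0][1:]
    if frame_idx >= keyframes[-1][0]:
        return keyframes[-1][1:]
    for (f0, x0, y0), (f1, x1, y1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame_idx <= f1:
            t = (frame_idx - f0) / (f1 - f0)
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))

# Halfway between two keyframes, the y position is halfway between them.
print(roi_centre_at(50, [(0, 100, 200), (100, 100, 240)]))  # → (100.0, 220.0)
```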
Thirdly, I've changed things so you effectively process each 30-minute clip by itself, then stitch the output together at the end. In very long videos, if there was an errant frame in the output, it was hard to figure out where it was amongst the 1500 frames that had been selected. It's much easier to find them if you do each little bit at a time. Also, reprocessing short clips is much quicker than having to redo the whole thing if you want to change a setting etc.
I also added a couple of other nice-to-haves. Like having the timelapse linger on the final frame so you can enjoy the finished article for longer.
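The output side can be sketched too: turn the selected frames into a write order, repeating the last frame so the finished print lingers on screen. Illustrative only — the real tool writes actual video, but the scheduling logic would be something like:

```python
def output_schedule(selected, fps=30, linger_seconds=2.0):
    """Turn the list of selected frame indices into the sequence of frames
    actually written to the output video: each selected frame once, then
    the final frame repeated so the finished article lingers on screen."""
    if not selected:
        return []
    linger = [selected[-1]] * int(round(fps * linger_seconds))
    return list(selected) + linger

# At 2 fps with a 1.5 s linger, the final frame is repeated 3 extra times.
print(output_schedule([3, 9, 14], fps=2, linger_seconds=1.5))  # → [3, 9, 14, 14, 14, 14]
```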
Example
So, this is where we're up to with current output:
You'll note that the vase goes a little wonky in the middle. That's got nothing to do with this process...something fell on the printer and knocked it out of alignment halfway through the print...the whole timelapse thing worked as intended!
What's next?
I need to do a little more cable management (possibly including the "Panda Branch" usb hub for my printer) in order to make the whole solution feel permanent, but otherwise I think I'm going to move this one to the "done" column.
What's the point, James?
So, this has been a fun ride, hasn't it? A bit of 3D printing, a bit of Raspberry Pi, a bit of vibe coding. But what have I learnt along the way?
I think this is a great example of how the way I think about problems, and solutions to problems, is being altered by AI.
I'd never have pursued this solution without AI. Writing algorithms to detect changing lighting patterns in set areas of a video? Writing a whole UI to make this process manageable? I'd just not have bothered. It would have taken far too long for me to build something that, let's face it, I don't care about that much. But when you can just ask Claude to go do the hard work for you, making something bespoke to solve your problem suddenly becomes practical.
I have conflicting feelings about this.
On the one hand, the last thing our environment needs is a bunch of people burning AI tokens to create solutions for problems that are already solved elsewhere... or those that, bluntly, don't really need solving. So maybe I feel guilty that this whole thing is a little indulgent.
On the other hand, as a software professional, I find the whole thing reassuring. People keep talking about how engineers will all be out of jobs now AI agents can do it all for you. But what you see is that the barrier for work to be worthwhile is just being lowered.
Most organisations use some sort of mechanism to weigh impact against effort, to identify low-hanging fruit, strategic projects etc... It feels like AI is just moving the effort axis. So a bunch of things that would have fallen into the "time sink" or "won't do" buckets suddenly become "quick wins" and are suddenly worth doing. So, yes, engineers will be able to work quicker... but we'll also find more value-generating work for them to do.
The other big takeaway I have from this project is just how much more comfortable I'm getting with vibe coding. From my first tentative footsteps last year, filled with worries and cynicism, I now feel I'm pretty AI-native in the way I think about problem solving...and this spills over into the way I think and feel about the AI agents themselves.
Above, I playfully refer to Claude as "he". I know "it" is not a "he". Of course. But I do find myself starting to catch feelings for Claude. No, I'm not in love with my chatbot. (Although that would be the kind of content to really drive traffic to this blog, wouldn't it!) But there are other feelings I find myself having. Mainly "gratitude". When I ask Claude to do something for me, and he scurries off and overdelivers... taking an idea that was in my head 5 minutes ago and making it real (or at least "virtually real") in front of me... it's only natural to feel thankful for his effort... right?
Wrap up
So... there we go. I have a better printer. I can make reasonable timelapses without spending any money. AI made me a pretty nice, useful bit of software... and that's probably enough for now. As always, big up to the elite that suffered through the whole thing. I appreciate you!