February 2026 update

Been a while since I updated the blog so thought I’d post some updates on the project.

Peace and quiet, a TMC2209 story

The original project uses DRV8825 stepper motor drivers. These were the most commonly recommended drivers when I was first learning how to drive motors from a Raspberry Pi computer many years ago. They are simple, but they are noisy. If you are young you can hear them whine at higher frequencies! The signals they send to drive the motor coils are quite rough. So the project works, but I was uncomfortable with the noise especially when making large moves. You can reduce the loud mechanical noise by switching to microstepping, but the high pitched whine seemed to be a permanent feature.

A couple of people in the pilomar project group on Discord have recommended (and used) TMC2209 drivers instead. I am grateful for those alternative builds; they gave me some initial pointers on how to convert the project to use TMC2209s.

TMC2209 boards are a similar size to the DRV8825 and almost the same layout, but more sophisticated devices. They have more options and most importantly much smoother control signals!

Original DRV8825 on left, BigTreeTech TMC2209 on the right.

One big advantage of the TMC2209 is that you can perform more functions AND more configuration tasks directly with the chip via a UART channel. The DRV8825 requires potentiometer adjustments and individual pin signals for everything; the TMC2209 does not, which may make it easier for people to get the project up and running initially.
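To give a flavour of what that UART channel involves, here is a minimal Python sketch of how a TMC2209 write datagram is assembled, following the structure in the Trinamic datasheet (sync byte, slave address, register with write flag, 32-bit value, CRC8). This is illustrative only; the register addresses and the serial transport in the real project code differ.

```python
def tmc_crc8(data: bytes) -> int:
    # CRC8 as defined in the TMC2209 datasheet (polynomial x^8 + x^2 + x + 1),
    # processed bit by bit with the LSB of each byte first.
    crc = 0
    for byte in data:
        for _ in range(8):
            if (crc >> 7) ^ (byte & 0x01):
                crc = ((crc << 1) ^ 0x07) & 0xFF
            else:
                crc = (crc << 1) & 0xFF
            byte >>= 1
    return crc

def tmc_write_datagram(slave: int, register: int, value: int) -> bytes:
    # 8-byte write datagram: sync (0x05), slave address, register address
    # with the write flag (bit 7) set, 32-bit big-endian value, then CRC.
    frame = bytes([0x05, slave, register | 0x80]) + value.to_bytes(4, "big")
    return frame + bytes([tmc_crc8(frame)])
```

Because everything goes over that one serial line, a single datagram can replace a potentiometer tweak or a pin change on the DRV8825.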

The challenge however was the UART communication! The original design uses Pimoroni Tiny RP2040/RP2350 microcontrollers and there are not enough pins available to add the extra serial communication lines needed. I needed more GPIO pins!

Tiny2350 vs Pico2: At last I have enough GPIO pins!

Very early in the project I had rejected the Raspberry Pi Pico RP2040 because it had proven too unreliable, with constant random resets that I could not eliminate. Now the Pico2 RP2350 is available, and my testing shows its reliability is much better. So a large rewrite and restructure of the microcontroller code was started. That took all of my spare time throughout 2025 – which explains why there have been no blog posts in 2025 either! Thanks go to Thomas Proffen too for kick-starting the code restructuring for the microcontroller.

But after a lot of digging, learning, programming and testing I think I have a TMC2209 solution ready to roll back into the project. So far the results have been astounding. When I first ran the new solution I thought it was not working at all – the move commands resulted in no response from the telescope. After some panic I realised that it was in fact working, but so incredibly quiet and smooth that you couldn’t tell it was moving.

The TMC2209s have other tricks up their sleeve for future development. It looks like they can perform some actions even more autonomously than the DRV8825s, so there may be more features to introduce still. But I’m really happy so far with the conversion. The telescope can finally sit and work on the desk in front of you and you won’t notice!

I desperately need some clear calm nights now, before the lighter nights arrive, to really test the new board on some real observations. I want to verify that all the engineering changes are stable as much as possible before releasing the new versions.

Some key points in the latest design

  • The motor power monitoring is back. You can now measure the motor power voltage again. This is useful when troubleshooting and also when running from batteries.
  • The board now hosts a Raspberry Pi Pico 2 (RP2350) with more stability and more capacity.
  • The DRV8825 sockets have been reworked to support the TMC2209 instead.
  • There are more GPIO pins available on the Pico 2, so more have been exposed around the board edge for future expansion.
  • 5V, 3V and 12V power points have been added to drive fans or other ‘light load’ accessories.
  • The Pico2 I2C channel is made available to support position sensors and other addons.
  • The ‘reset transistor’ circuit is eliminated for the microcontroller. This was to work around the lack of a reset pin on the Tiny2040. The new design supports a USB cable during operation better than before and the voltage supplied to the microcontroller is much higher now.

Another PCBWay order

By changing the microcontroller and stepper driver I had to revisit the PCB design again. I had already converted the project from EasyEDA to KiCAD in an earlier iteration, so it was time to re-open KiCAD and start adjusting the board design.

I had a new batch of the boards produced by PCBWay.com in China just before Christmas. The boards arrived as always remarkably quickly. I had them on my workbench less than a week after submitting the order. The speed of delivery alone is a great time saver in the project. I couldn’t make such a good quality board at home, and certainly couldn’t make a bunch of them at home as quickly as they can. These manufactured boards are also cheaper than the protoboards I was buying before! They look great, almost as if I know what I’m doing (I confess I’m still nervous!). I ordered the new ‘matt black’ finish on the PCBs and they look fantastic. I’ve used a different PCB color for each version of the design, but I think I’ll stick to the matt black from now on.

The new Matt Black PCB finish looks really neat.

I always assumed I would be building the project with protoboard circuits, but I love the compactness and reliability of a professionally manufactured board so I’m not going back now. My current development route is:

  • 1 – Breadboard : To learn what connections and components are needed.
  • 2 – Protoboard : To test reliability and operation.
  • 3 – Manufactured PCB : For reliable and compact deployment.
Which do you trust most? My protoboard mess or a manufactured PCB?

Having the boards made by PCBWay’s industrial process also increases the reliability of the circuits; I have less to worry about when assembling the final components. I typically order a batch of 5 or 10 boards so I have enough for my working builds plus a few spares for further experiments or repairs.

I usually only publish the Gerber files in the GitHub project, but I see I can also create a kind of template order on the PCBWay website, fully configured and ready for anyone to place their own order too. That looks like a handy feature for people who are not confident in building or ordering manufactured PCBs.

I’m currently ordering unpopulated PCBs for development flexibility but I’m wondering about trying their more complete assembly service eventually too.

Testing went well, although I found one minor routing mistake in my KiCAD design. Luckily I was able to solve it with a couple of jumper wires; I didn’t even have to hack the board.

Application development

I’ve continued restructuring the code, but of course like all projects it’s also becoming larger as more features are added and more edge cases are handled by the code. So splitting into cleanly defined modules is sensible. Isolating the logic from the user interface more cleanly will also open the door to alternative UIs in the future.

Installation is still the same procedure as before, but there are now more files to transfer when you set up the project. The microcontroller code is very specific to the Pico2/TMC2209/PCB combination, so I have split this into a separate circuitpython subdirectory in the GitHub repository.

When using the TMC2209 drivers the software can now handle more of the driver configuration programmatically which makes it easier to get up and running. You don’t need to adjust potentiometers, measure voltages or calculate anything manually to get the motor running properly. Part of me wishes I knew about the TMC2209 at the start of the project, but the other part of me is relieved that I started with the simpler DRV8825s!

AI assistance

I started experimenting with generative AI to help with some programming tasks. Initially out of curiosity, but I soon noticed that it was very useful for a couple of areas.
1) Generating boilerplate code, simple routines that took time but were easy to define and verify.
2) Generating efficient code using algorithms, tools or techniques that I was not confident with.

The first case is relatively simple, as long as you know what code you are expecting and are confident to verify that it is correct and safe to use, it can generate some routines for you rapidly. I always have to go and adjust the code a little to fit with the project better, but it has been a timesaver a few times.

The second case is more interesting. There are two tasks where it helped me significantly.

I wanted to do some complex sky projections of data (see the aurora utility below), I managed to do the projection using my very basic trigonometry skills, but the result was SLOW and I knew for certain that there would be much faster ways to do the job. But I don’t have experience with more advanced transformations and transpositions of arrays of coordinates.
So I defined what data I had, what I wanted to achieve and which tools I wanted to use (e.g. numpy), and started an iterative development cycle with Microsoft’s Copilot. It took some time and I had to restart the discussion three times, but I eventually learned how to discuss the problem and get some useful code from it. The risk here is that I still don’t fully understand some of the calculations it performs – but I can verify the input and output data using my original code. You can of course continue the discussion with the AI to get more understanding of the code it wrote; it can make a patient tutor at times. The new code produced a significant performance improvement: my projection calculation went from about 90 seconds to about 1 second.
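As an illustration of the kind of speedup involved (this is a simplified sketch, not the project’s actual projection code): converting thousands of alt/az coordinates to 3-D unit vectors one at a time in a Python loop is slow, but numpy can process the whole array with a handful of calls.

```python
import numpy as np

def altaz_to_unit_vectors(alt_rad, az_rad):
    # Vectorised alt/az -> (x, y, z) unit vectors. One trig call per axis
    # handles the entire array, instead of per-point maths in a Python loop.
    alt = np.asarray(alt_rad, dtype=float)
    az = np.asarray(az_rad, dtype=float)
    cos_alt = np.cos(alt)
    return np.stack([cos_alt * np.sin(az),   # east
                     cos_alt * np.cos(az),   # north
                     np.sin(alt)], axis=-1)  # up
```

The same pattern – replace the loop body with array operations – is what delivered the 90-second-to-1-second improvement.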

Another task was to do with image cleaning. In the pi-lomar project I have some OpenCV filters, routines that I have written which perform various cleaning or enhancement tasks on images. They are mainly used by the target tracking to clean up an image of the sky, enhance the stars and eliminate any pollution or haze. Up until now I have been doing this by researching online and trial-and-error programming to get the results that I want.

I found that by defining the problem well with Copilot I could get it to write code to do the same cleaning efficiently. What was really useful is that Copilot can analyse your example images. I described the camera sensor and lens I was using, including that the IR CUTOFF filter was removed. Copilot then made some reasonable analysis of the images and what issues they could have. It then produced some efficient OpenCV/Numpy routines to clean the images. This is still an iterative loop, but it opened a whole new way to solve future image handling problems. Once more I feel confident to try this approach because I can test input/output to see if I get the results I expect.
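The real filters use OpenCV and are tuned to my particular sensor and sky, but the core idea of one of these cleaning steps can be sketched in plain numpy: estimate the overall sky background level and subtract it, so only point-like bright features (stars) remain. This is a simplified stand-in for illustration, not the project’s actual routine.

```python
import numpy as np

def subtract_sky_background(img, background_percentile=60.0):
    # Assume most pixels are background (haze / light pollution): take a
    # percentile of the frame as the sky level, subtract it from every
    # pixel, and clip negative results to zero.
    floor = np.percentile(img, background_percentile)
    cleaned = img.astype(np.float32) - floor
    return np.clip(cleaned, 0, 255).astype(np.uint8)
```

Because the input and output are just image arrays, it is easy to verify any AI-suggested replacement against a routine like this one.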

I have witnessed some truly weird bits of code being generated, so it’s critical that you understand what you are asking for, take time to explain it clearly to the AI, and study the suggested solution deeply. Even when the result is wrong, you sometimes still learn new things!

Utilities added

METCHECK.COM weather forecast

When planning observations it’s useful to know if the conditions will be right.
Here on the UK coast of the North Sea we don’t get many perfect nights so you don’t want to miss them when they come. There are lots of websites and phone apps for the weather, I use a few of those to plan the week ahead. When the actual night of observing comes I switch to a live data feed from https://www.metcheck.com/BUSINESS/developers.asp

For my location I have found this weather data feed really useful and generally accurate. METCHECK.COM make a weather forecast available for any latitude/longitude on Earth. You can download this as a json file from their website. The file is large and difficult to read in the raw json format, so I made a terminal interface to show the information in a table. I have now included this utility program in the GitHub repository for the project.
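The utility essentially boils down to “fetch the json, walk the forecast entries, pivot them into a table”. The field names below are invented for illustration only – check metcheck’s developer page for their real schema.

```python
import json

# Hypothetical feed shape, for illustration only.
SAMPLE_FEED = json.loads("""
{"forecast": [
  {"time": "21:00", "cloud": 85, "temp": 4},
  {"time": "22:00", "cloud": 40, "temp": 3},
  {"time": "23:00", "cloud": 30, "temp": 2}
]}
""")

def forecast_table(feed, fields=("cloud", "temp")):
    # Pivot the list of per-time entries into one row per measurement and
    # one column per forecast time - the layout used for the table display.
    times = [entry["time"] for entry in feed["forecast"]]
    rows = {name: [entry[name] for entry in feed["forecast"]] for name in fields}
    return times, rows
```

Each returned row then just needs colour-coding and truncating to the terminal width.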

METCHECK.COM data viewer, demonstrating that it’s pretty cloudy here.

You can open a terminal window, go to the src directory and enter

python pilomarmetcheck.py

This command will display a table of color-coded weather forecast information.
Each row is a different weather measurement. Each column is a time into the future. The wider your terminal window, the more columns can be shown.
The utility refreshes automatically every few minutes to keep the information up-to-date. The same source code src/pilomarmetcheck.py provides the class metcheck_handler() which you can use in other programs if needed.

from pilomarmetcheck import metcheck_handler

Check in src/pilomarmetcheck.py to see how it is used to extract and display the data. The metcheck_handler class is also included in the pending version of src/pilomar.py so that weather conditions can be monitored and also recorded in image metadata.

NOAA Aurora conditions

I hate to miss an aurora display. We’ve had some very spectacular displays during the current solar maximum, but our notoriously cloudy nights mean it’s difficult to actually catch them. I noticed that all the online aurora visibility maps show NOAA.GOV’s ‘ovation data’.
This is a map of the likely intensity in the next 30 minutes for each lat/lon on Earth. If I understand correctly, this is the ‘probability’ of seeing the aurora directly overhead. But the aurora is usually around 100km high in the atmosphere, so even if it is not directly overhead you may still be able to see it over a nearby location.

Can you see the aurora if its lat/lon is over the horizon? It depends how high it is.

So I wrote a ‘projection’ routine which estimates what the aurora may look like when viewed from a distance. It is a simple terminal interface which refreshes automatically every 15 minutes. It shows the aurora oval as you might see it yourself. You can see the size of the oval and whether it is above or below the horizon from your location.
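The geometry behind that estimate can be sketched quite simply. For a spherical Earth (ignoring refraction), a point at height h above the surface, at great-circle distance d from you, appears at an elevation angle you can compute directly. This is my simplified reading of the problem, not the exact routine in pilomarovation.py.

```python
import math

R_EARTH_KM = 6371.0

def apparent_elevation_deg(distance_km, height_km=100.0):
    # Elevation angle of a point height_km above the surface, seen from
    # distance_km away along the great circle. Positive = above horizon.
    gamma = distance_km / R_EARTH_KM          # angular separation at Earth's centre
    r = R_EARTH_KM + height_km
    up = r * math.cos(gamma) - R_EARTH_KM     # component along the local vertical
    horizontal = r * math.sin(gamma)          # component along the local horizon
    return math.degrees(math.atan2(up, horizontal))
```

With h = 100km this puts the aurora above your horizon out to very roughly 1100km away, which is why a display centred well to the north of you can still be visible.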

Simple Aurora ovation viewer. Could the aurora be visible where you are?
In this case, no, I probably cannot see it 🙂

The online alerts are still the best way to START looking for the aurora, but once an alert is ongoing I use this tool to monitor the movement and latitude of the aurora during the event. It relies entirely upon the probability calculated by NOAA, and there are many other issues that can affect your view of the aurora, but it was an interesting challenge to program given my poor mathematical skills!

To run it go to the /src directory and enter

python pilomarovation.py

The same source code src/pilomarovation.py provides the class pilomarovation() which you can use in other programs if needed.

from pilomarovation import pilomarovation

Check in src/pilomarovation.py to see how it is used to extract and display the data.

Checking images on a terminal

Here’s a crazy one. I’ve developed the project as a character based application. But it’s really useful to be able to see images as they are being captured sometimes. Normally I use SFTP to transfer the .jpg files off the RPi and onto a PC for viewing. This is great, but sometimes I just need a very quick view of an image. I added a utility to display .jpg files through the terminal interface. It uses the XTERM 256 color palette to display a downscaled copy of the image.
I’ve found this strangely useful! So I’ve included it in the project repository now too.

A .jpg image of a tree rendered on a character terminal.

To run it go to the /src directory and enter

python pilomarviewer.py '[filepath]'

where ‘[filepath]’ is a .jpg filename or even a wildcard filepath. The ‘quote marks’ are important if you use wildcards.

It displays the image in the 40x160 terminal window. You can pan/zoom around the image as you need. If the image gets updated on disc the display automatically refreshes. If you enter a wildcard in the filepath such as ‘/data/light_*.jpg’ it displays the most recent matching image.
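The trick behind the terminal rendering is the xterm 256-colour palette, where indices 16–231 form a 6x6x6 RGB cube. A minimal sketch of mapping a downscaled pixel to a coloured character cell (not the viewer’s actual code):

```python
def xterm256_index(r, g, b):
    # Quantise each 8-bit channel down to 6 levels and index into the
    # 6x6x6 colour cube that occupies palette entries 16..231.
    to6 = lambda c: int(round(c / 255 * 5))
    return 16 + 36 * to6(r) + 6 * to6(g) + to6(b)

def colour_cell(r, g, b):
    # One character cell with the pixel's colour as its background,
    # using the standard ANSI 256-colour escape sequence.
    return f"\x1b[48;5;{xterm256_index(r, g, b)}m \x1b[0m"
```

Printing one cell per downscaled pixel, row by row, gives a surprisingly recognisable image on any xterm-compatible terminal.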

You will be able to launch this from the next pilomar.py program too and link it to the current observation automatically.

Observations

2025 was a terrible year for observations here: the summer was fantastic but up here it’s too light at night, and the winters are too cloudy. We have had maybe 3 good viewing nights so far this winter, and 2 of them had a bright moon to complicate things. But I managed one night recently where I could test everything nicely for a few hours, and the whole package still works. In fact, when it runs this smoothly you have to find a good book to read.

We’ve missed a few spectacular aurora displays due to weather, but sometimes caught a glimpse when the cloud has broken up. I still want to play with the ‘keogram‘ function in the pilomar software on a good aurora display.

Next steps

When the new release is ‘stable’ I will return to the plate solving problem. I’m determined to get a solution up and running for this, and I think that the AI tools will probably speed up the development for me now.

I’m in the process of updating the GitHub repository for the project with the last 10 months of developments and changes. I hope to have all that published soon, including the latest TMC2209 PCB design.

Spring is approaching so already my mind is turning back to more summer development tasks. I hope that some of the other builds around the world are in better climates and getting some good images back!

July 2024 update

Instructables builds

There are now 9 builds listed on the Instructables website, and I’ve also had contact from other builders – I guess there are around 20 telescope builds out there. So there’s a small – but amazing – community forming around the telescope now. To create a sort of support group for the project I’ve created a Discord group for people who are building or running a copy of the telescope.

PM me via the Instructables project page for a link if you would like to join.

GitHub for pi-lomar updated

A large number of changes to the package are in the latest release just published on GitHub. There is a list of the changes, and some hints about upgrade options here.

The most significant changes in the new release are

Now runs on Raspberry Pi 5.

The GPIO handling is different on the RPi5, so I had to redevelop and retest the GPIO code to work there. This reinforces the advantages of switching to the Bookworm 64bit O/S. The RPi4 and RPi5 both run pilomar happily on Bookworm now. Support for the RPi3B with the old ‘Buster’ build remains, but it cannot support some of the new features in this latest release. If you want to upgrade to the latest version I now strongly recommend an RPi4B or RPi5 as the main computer.

FITS image handling.

With support from a few people in this project, and also from Arnaud and the Astrowl box project there’s now a way to save .fits format image files. FITS file formats are required by some astro image processing software. The raspistill and libcamera-still utilities will save raw images in .DNG format, but that is not accepted by some software. This new FITS format only works on Bookworm builds because it requires the picamera2 package to be available. You may be able to get this installed on earlier O/S versions, but I think it will need some tinkering. The FITS handling has been done by creating a new standalone Python routine (src/pilomarfits.py) which can be called just like the ‘libcamera-still’ command, but which generates .JPG and .FITS files instead. This is likely to improve further in the future, it’s just an initial attempt to add FITS format.
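To show why FITS needs dedicated handling: the format is built from fixed 80-character header ‘cards’ padded into 2880-byte blocks. A toy sketch of a minimal primary header follows – the real pilomarfits.py does far more, and real code should lean on a library rather than hand-rolling this.

```python
def fits_card(keyword, value):
    # One 80-character FITS header card (simplified: fixed-format values only).
    return f"{keyword:<8}= {value:>20}".ljust(80)

def minimal_fits_header(width, height):
    # SIMPLE/BITPIX/NAXIS cards describing a single 16-bit image plane,
    # terminated by END and space-padded to a full 2880-byte block.
    cards = (fits_card("SIMPLE", "T") +
             fits_card("BITPIX", 16) +
             fits_card("NAXIS", 2) +
             fits_card("NAXIS1", width) +
             fits_card("NAXIS2", height) +
             "END".ljust(80))
    return cards.ljust(2880)
```

The image data itself then follows in further 2880-byte blocks, which is exactly the kind of bookkeeping libcamera-still never had to do for .JPG output.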

Smoother motor movement.

A whole bunch of ideas came from other builders, and it became clear that microstepping is easy and safe to activate for most people. Microstepping makes movement slower, but smoother and quieter. It required some rethinking of the code on the microcontroller because microstepping generates a lot more motor pulses; a limitation with the clock in CircuitPython became apparent but is now resolved. There is also a ‘slew’ mode available in the latest package. This lets the telescope perform LARGE moves using full steps on the motor – noisy but fast. Then when it starts capturing the observation images it switches to microstepping.
Better microstepping support also means that you can now build using 200-step stepper motors. These are generally easier and cheaper to buy.
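To illustrate why microstepping multiplies the pulse load on the microcontroller: the pulses needed for a given axis move scale directly with the microstep factor. The gear ratio below is a made-up example value, not the actual pilomar gearing.

```python
def pulses_for_move(angle_deg, motor_steps_per_rev=200,
                    gear_ratio=60, microsteps=16):
    # Total driver pulses needed to turn the telescope axis by angle_deg.
    # Each microstep still needs its own pulse, so x16 microstepping means
    # 16x more pulses (and 16x the pulse rate for the same slew speed).
    steps = angle_deg / 360.0 * motor_steps_per_rev * gear_ratio * microsteps
    return round(steps)
```

Hence the ‘slew’ mode: large moves at microsteps=1 keep the pulse rate manageable, then the finer factor takes over for tracking.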

Easier configuration changes.

Pi-lomar’s configuration is generally held in the parameter file. You can make a lot of changes to the behaviour there. This latest release has moved even more of the configuration into this one file. However some changes can be quite complex to configure correctly. Therefore the latest software has added a few options to perform some common configuration changes directly from the menus. This ensures more consistent and reliable setup for a few camera options and also for configuring several of the microstepping related issues.

Aurora features.

After the spectacular Aurora displays earlier in the spring, I’ve added some experimental Aurora recording features to the software too. Obviously we now have to wait for some good Aurora displays to fully test this feature, but the basic concept seems to work OK. In Aurora mode the camera points to a likely direction for the Aurora and captures images as quickly as possible. It can also generate a simple KEOGRAM of the aurora display which may be interesting to study sometimes.

Observations

Well, summer is here now, and the skies are too light, too short and sadly still too cloudy. So no practical observations of anything new to show this time. All the project work has gone into this latest software development round instead.
So I’m now looking forward to slightly longer and darker nights coming in August and September, and hoping that the clouds go away.

What’s next?

I’m currently exploring some modifications to the telescope design.

Now that the RPi5 is supported – it has TWO camera ports! So I would like to explore the idea of having two cameras mounted in the telescope: ideally a 16mm lens dedicated to tracking, and a 50mm higher quality lens dedicated to observation pictures. There is also some feedback from other builders which is re-opening the design of the camera tower and camera cradle. I’m currently thinking of making a slightly wider camera tower to accommodate 2 cameras, and probably reorienting the sensors into portrait mode to improve access for focusing. It may make sense to improve the weatherproofing around the camera boards – as others have already done.

After a chat in the Discord group I’m also looking at adding a permanent ‘lens cap’ to the camera tower. This would sit below the horizontal position of the camera, so that the lens can be parked up against it when not in use. There are a couple of advantages to this idea. (1) You don’t have to remember to remove or reinstall the lens cap. (2) If the cap is sufficiently dark the camera can take the ‘dark’ control images automatically at the end of each observation.

I have a redesign of the motorcontroller PCB nearly ready, with improved power performance for the microcontroller. There will probably be another couple of improvements made to it, and then I’ll try getting some samples printed up. I considered switching from the Tiny2040 microcontroller to something larger with more GPIO pins, but have decided to stick with the current setup. There seems to be a practical memory limit on the RP2040 chip in the microcontroller, it has around 200K of working memory available to it, and the current functionality consumes it all. I cannot even get the current code to run on CircuitPython 9.x yet, so it’s still limited to 7.2 and 8.2. It may be worth waiting to see if any 2nd generation microcontroller comes from RPi in the near future before finalising the design.

February 2024 update

Instructables builds

There are 5 ‘I made this’ posts on Instructables now for pi-lomar, and a few other builders have been in contact with questions and suggestions over the last two months. I’m really looking forward to seeing what people manage to do with the telescope and get some feedback and improvement ideas.

GitHub for pi-lomar updated

The January issues branch in GitHub became quite a monster, there is a list of all the changes available here. I’ll cover a few of the interesting items here.

UART communication problems

A few people have had problems getting the communication to work between the RPi and the Tiny2040. This has been due to a few different issues, but it became clear that more help was needed to identify where the problems are when communication doesn’t work. If something is wrong in the communication chain you might get an error message, or you might simply get a ‘dead telescope’ which refuses to move. So I’ve added features to detect and complain more clearly if something is wrong, and also a way to monitor and test the communication in realtime.

Software versions

The original build required very specific versions of CircuitPython and the Raspberry Pi O/S. I’ve addressed a few of the limitations now so you can use the most recent copies of both. I’ve now got a telescope running happily with Bookworm 64bit on an RPi4 and CircuitPython 8.2 on the microcontroller. This means you can use whatever the current version is – you don’t have to go looking for archived copies anymore. The released version does not work on the RPi5 yet – I’m going to rework the camera handling for that beast first.

Motor configurations

The original design was heavily dependent upon specific stepper motor designs. This was quite restricting for some because they are not always easy or cheap to source. The new software has moved the motor and gearing configuration into the parameter file instead of being hardcoded. So now it is simpler to set up alternative motors AND you can still take updates to the software without having to repeat your own hardcoding changes.

Removing the infrared filter

In the last blog I mentioned that I had removed the infrared cutoff filter from one of the camera sensors. I had to wait a while for a clear enough night, but eventually grabbed a few shots of the region around the Orion Nebula. It was not a great observing night, there was considerable haze and some random cloud, but I got a few images.

I am happy to confirm that it really made a difference though. New objects appeared and previously faint objects are clearly enhanced by expanding the vision of the sensor.

After stacking and some enhancement in The GIMP, this is the region with the new infrared capability. I was not able to remove ALL the haze, but if you are patient you can reduce it considerably.

For clarity I’ve marked the major items that are now visible below.

There is clearly a colour tint still to these images which I need to play with some more, but there are definitely new details here.

Orion Nebula

There seems to be a larger area of nebula visible in this image. The colour variation is not as good as earlier images but I think that’s something I can still work on.

Flame Nebula

This was faintly visible before the infrared cutoff filter was removed, but it seems to be more clear now. Hopefully when I can gather more images to stack I can pull more clarity from that still.

Horsehead Nebula

With the infrared filter in place you could see a very faint hint of the Horsehead Nebula surroundings, but they were very subtle. You had to know there was something there and then play with image enhancement to get even a slight hint of it. But it is now more clear.

Barnard’s Loop

Orion is surrounded by a large ‘infrared only’ area of gas. I’ve never seen this before in any observations I’ve made, but suddenly it’s there. Barnard’s Loop is to the left of the belt, and although faint, there’s no doubt it’s now detected. The gas cloud extends lower down around Orion too, but in this shot it’s hard to separate urban haze from actual gas cloud.

Urban Haze

This brings me to my current problem: urban haze. There is light pollution and generally poor quality atmosphere around here. I’m not living in a big city, but the conditions are visibly deteriorating as time goes by.

The IR image above was taken with the 16mm lens; I have now removed the IR cutoff filter from the 2nd telescope with the 50mm lens too. That one also has a light pollution filter added. The brief chance I’ve had to test it suggests that it does indeed reduce the haze that’s creeping into all the shots. The question is: does it also reduce the infrared wavelengths? The next clear moonless night may answer that.

There’s another place where haze becomes an issue. That’s in the drift tracking mechanism of the telescope. Pi-lomar checks its position by taking a live image and comparing it with a calculated image of the sky. It uses the difference between star locations to correct the position of the camera. It’s not perfect, but it works well enough for the images to be stackable. But if there is strong haze in the sky the Astroalign package can struggle to recognise and match stars between the images. You can get a cloud of false stars at the bottom of an image which confuses things.

To work around that I use OpenCV to try to enhance the stars in the live tracking image. Basically trying to reduce noise and enhance just the real stars. This requires tuning some OpenCV filter functions to work nicely with MY particular observing conditions. That’s a problem for people in other locations, they may need to tune the filter functions differently.

So I’ve modified the software to make these OpenCV filter functions into ‘scripts’. You no longer have to play with hardcoded function calls in the software; you can simply edit the scripts and test them rapidly against your conditions. I hope this is a good benefit for people. I will probably refine the configuration and testing further in future versions. This is clearly an area where a graphical interface would help. An early test of this new feature looks promising when trying to filter out tree branches from someone’s live tracking images. It looks like we can still pull stars out of quite busy and noisy shots.
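Conceptually such a script is just an ordered list of named filter steps with parameters. A stripped-down sketch of the idea (the step names here are invented; the real scripts drive OpenCV functions):

```python
import numpy as np

# Hypothetical filter registry: each step name maps to a function taking
# an image array and a parameter dict, returning the filtered image.
FILTERS = {
    "subtract_floor": lambda img, p: np.clip(img - p["floor"], 0, 255),
    "keep_bright": lambda img, p: np.where(img >= p["minimum"], img, 0),
}

def run_filter_script(img, script):
    # Apply each (name, params) step in order. Working in int16 avoids
    # uint8 underflow during subtraction; convert back at the end.
    work = img.astype(np.int16)
    for name, params in script:
        work = FILTERS[name](work, params)
    return np.clip(work, 0, 255).astype(np.uint8)
```

Editing the script is then just reordering steps or changing parameter values, with no code changes, which is exactly what makes per-site tuning quick.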

Next steps

I am not intending to develop the software further now until the summer. The latest update needs to be taken into use and tested in more environments, so I want to limit any new changes to bug fixes or tuning related to that. Spring is approaching, it’s better to spend time observing!

January 2024 update

Testing Pi-lomar on the Raspberry Pi 5

Will Pi-lomar run on a Raspberry Pi 5?

Spoiler alert! No.

Not yet.

The camera and GPIO libraries have changed, but how close is it to working?

Interestingly more of the required packages that made the RPi 4 Buster build tricky seem to be pre-installed in Bookworm now. I only added opencv, astroalign, pandas and skyfield, and they all installed cleanly, no conflicts or special tricks needed.

sudo apt install python3-skyfield
sudo apt install python3-opencv
sudo apt install python3-astroalign
sudo apt install python3-pandas

The resulting build script will be much simpler I hope. I’m still installing globally rather than creating containers because the RPi will be dedicated to the telescope.

The pilomar.py program of course errored out fairly quickly, but with relatively little change I got it up and running as far as the first menu. That includes all the data loading and formatting that has to happen when you first run the software.

Right out of the box I have to say “wow!“, I’m impressed.

For comparison: the 2GB RPi 4B with the 32bit operating system takes about an hour to calculate the Hipparcos catalogue of 100,000+ stars. On an 8GB RPi 5B with the 64bit operating system it ran in 25 seconds – so fast that I thought it had failed, and I had to speed up the progress messages to prove it was doing something. From nearly 60 minutes down to 25 seconds! In regular use I’d estimate Pi-lomar runs about twice as fast on the RPi5.

It looks like the basic migration should be straightforward, and there is capacity there for extra features.

Raspberry Pi 5 Active Cooling hint!

The official cooling unit is great – it’s very easy to attach to the RPi5. BUT – you can’t detach it. So if you’re thinking of later putting it into a case, or occasionally reorganising things, be very careful.

For example, I like the Pibow cases, but a couple of design choices clash:

  • If you attach the cooler before assembling the case, you cannot fit all the layers of the case.
  • If you attach the cooler after assembling the case, you cannot remove all the layers of the case, and the camera connectors become more difficult to access.

Next time I’ll change the little spring-loaded feet for nylon bolts so the cooler can be removed – that’s the fundamental design flaw to me.


Back to the RPi 4B version

Motorcontroller PCB

The published PCB as it arrives from a manufacturer. No components, you have to add those yourself.
Unpopulated PCB as received from the manufacturer.

The first PCB design is done and the Gerber files are now in the GitHub project. These files can be used to manufacture the PCB. It still needs components added, but the wiring is all set in the PCB itself. Many thanks to Dale, Mark, and Ton for their help with the designs so far.

The published PCB has a few improvements on it.

  • Full ground plane on the underside of the PCB.
  • 12V supplies have more copper too.
  • The unused ADC0 pin on the Tiny2040 is now available in the expansion section for your own use.
  • A number of GPIO pins from the RPi header are now exposed in the expansion section.
  • Some development features (LED and motor power measurement) are removed.
  • PCB connector blocks have been standardised.
  • Printed warning to take care when connecting USB and GPIO at the same time.
  • NOTE: On the published Gerber files, the ‘R1’ resistor is marked with a lower value than these images show. Any value from 320–680 ohms seems to work fine. The lower the value, the better the transistor switches.
Published motorcontroller PCB with components populated.
Populated at home with the necessary components.
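The R1 note above is just Ohm’s law in action. As a back-of-envelope check, assuming R1 sits between a 3.3V GPIO pin and the base of an NPN transistor with a typical ~0.7V base-emitter drop (treat all of these values as my assumptions, not measurements from the board):

```python
# Rough base-current check for the R1 range (assumed values: 3.3 V GPIO
# drive, ~0.7 V NPN base-emitter drop; the actual circuit may differ).
V_GPIO = 3.3   # volts from the microcontroller GPIO pin
V_BE = 0.7     # typical NPN base-emitter voltage drop

for r1 in (320, 680):
    i_base_ma = (V_GPIO - V_BE) / r1 * 1000
    print(f"R1 = {r1} ohm -> base current ~ {i_base_ma:.2f} mA")
```

More base current drives the transistor harder into saturation, which is why the lower resistor value switches better.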

I have added a new folder to the GitHub project to contain the Gerber files.

https://github.com/Short-bus/pilomar/tree/main/gerber

The files for the PCB are in the /gerber/PCB-2023-12-14 folder on GitHub. To use them for manufacturing, you must first bundle them into a single .zip file:

cd gerber/PCB-2023-12-14
zip PCB-2023-12-14.zip *

The PCB-2023-12-14.zip file is the one that you should submit for manufacturing.

The file gerber/readme.txt explains more about the manufacturing specifications you will need to provide when placing an order.

A second PCB design is still in testing at the moment; this one eliminates the separate Raspberry Pi power supply by adding a buck converter onto the board to act as the RPi’s power source. Everything then runs from the 12V motor supply.

Software development

At the end of December I released a few improvements to the software, fixing a few issues that the early builders found. I think people should be taking their first images soon, so I’ve done a little more development in January to help with tuning the telescope.

The tracking algorithm works for me, but I suspect it needs fine-tuning to individual observing conditions. There are some great-sounding locations where people are building Pi-lomar at the moment. So I’ve started adding some simple tools to help get the tracking parameters right. The idea is to show the results of the tracking calculation alongside the parameters which can affect it. (I must explain how the tracking solution works soon.)

Getting the telescope working south of the Equator! I am at 50+ degrees North here, and out of extreme caution I put a warning on Instructables and in the software that the telescope might not work if you go too far south. But there is interest in making copies in the Southern Hemisphere. So, with help from volunteers, I’m looking at addressing some minor irritations with how the telescope behaves as you move further south. It looks like Pi-lomar will already work, but with a movement reset while tracking objects through due North. So the January release will now accept Southern latitudes for the home location and just warn you that support is still under development.

There’s now a parameter called “OptimiseMoves”. When you turn it on, the telescope will move much more freely through due North, which should eliminate some of those irritations.
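To illustrate the difference, here is my own sketch of the move calculation (not the actual pilomar implementation): without OptimiseMoves the azimuth stays on a 0–360 degree scale and never crosses North, while with it the shorter wrapped path through North is taken.

```python
def azimuth_move(current, target, optimise_moves=False):
    """Return the signed azimuth move in degrees from current to target.

    Illustrative sketch only. Without OptimiseMoves the telescope stays
    within 0-360 and never crosses due North; with it enabled, it takes
    the shortest path, wrapping through North when that is shorter.
    """
    current %= 360
    target %= 360
    if not optimise_moves:
        # Stay on the 0..360 scale: this can mean a long reverse sweep
        # instead of a short hop through North.
        return target - current
    delta = (target - current) % 360
    if delta > 180:            # shorter to go the other way, through North
        delta -= 360
    return delta

# Telescope at 350 degrees, target at 10 degrees (20 degrees apart through North):
print(azimuth_move(350, 10))        # -340: long sweep back past South
print(azimuth_move(350, 10, True))  #   20: short hop through North
```

The reset behaviour mentioned above corresponds to the first case: the mount unwinds almost a full revolution rather than crossing the 360/0 boundary.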

Example of the warning message that pilomar.py generates if the OptimiseMoves parameter is enabled.
Screenshot showing the warning if the new OptimiseMoves parameter is enabled. Should make smoother operation when making observations facing north more of the time.
Diagram to explain the motion differences when the OptimiseMoves parameter is used in Pi-lomar.
The effect of the OptimiseMoves parameter. By default the telescope will not pass North (360/0 degrees), it will reverse back to the other side and resume there. Enabling OptimiseMoves allows the telescope to rotate freely past North in either direction.

I’ve opened discussions on the GitHub site for anyone who wants to join in. When the feature is fully developed and proven to work that will become the normal operation everywhere.

The January improvements will be merged back into the main branch in the next few days.

Actual observations

It has been almost constantly cloudy here for months now. And the back yard is turning to mud, even the dog is reluctant to go out there. Really frustrating! I’m SO desperate to get out and get some more images captured. Nights on the east coast seem to come in three flavours…

  1. Too cloudy, calm, no moon.
  2. Clear, too windy, no moon.
  3. Clear, calm, full moon.

I’m hoping that some of the other builders will start capturing soon, maybe people can share those images too.


Hardware developments

Upgrading mk1

I have two completed Pi-lomar telescopes at the moment. After a break from 3D printing, I’m returning to the earlier build to upgrade it. Its drive mechanism now feels less smooth than the Instructables version. That’s a relief! All the tweaks I put into the Instructables version made a difference. So I’ll be tearing mk1 down and testing some further improvements to the drive and telescope tower. I’ll also take the opportunity to widen the camera cradle – it will allow more room for higher quality lenses, and let me test the idea of an RPi5 with twin cameras later this year.

Prototype 3d printed part. This is a twin camera cradle idea. When connected to a Raspberry Pi 5 the telescope could support 2 cameras, each one optimised for different purposes.
First version of twin camera cradle. This will need a RPi5, but could run a 16mm lens for targeting/tracking and a separate 50mm lens for observations. (I hope!)
3D printer underway. Printing a new version of the tower walls to make more space inside the dome for more and larger lenses.
Modified design for tower walls under way to make space for the twin camera cradle.

Removing the infrared filter

Finally it was time to rip that infrared cut-off filter out of a High Quality sensor. The official instructions work, and it is simple to do. The lens mount comes off the sensor board easily and the IR filter pops out cleanly with a gentle push. I have left the sensor exposed, so it is protected only when a lens is attached. I may re-cover it with OHP film as suggested, because exposed sensors are dust magnets! I put the sensor inside a fresh, clean freezer bag to minimise dust while making the mod.

I placed the 16mm telephoto lens on the sensor and stuck it on a tripod just to see what things looked like. Everything has now gone ‘pink’ so SOMETHING has changed anyway!

A very quickly captured image of the sword of Orion. Just moments after removing the infrared filter from a Raspberry Pi Hi Quality sensor. The sky has turned pink, so there is definitely a change in the wavelengths being captured. Just waiting for less moon and less wind to test it properly.
Infra red filter removed from Raspberry Pi Hi Quality sensor. Tripod mounted photo of Orion’s sword with 16mm telephoto lens and moonlit sky. It’s all gone PINK, that must be good right?!?

It’s not clear how wide the HiQ sensor’s infrared sensitivity is, but I think any expansion of wavelengths will be interesting to play with.

Fiddly focusing

I had to refocus the lens when I reattached it, and realised there is a better way to get it in focus. The focus ring of the 16mm lens is not very precise compared with larger lenses, and I’ve always struggled a bit with it. So I tried a different approach this time.

I set the lens focus ring fully to ‘FAR’ and locked it off. Then I released the screw clamping the sensor-mounted rear focus ring. That ring has a much finer screw thread, a more positive movement, and allows really fine focus adjustment. It is mentioned in the official instructions, but I think it’s the BEST way to focus if you’re being fussy.

With this, the ‘raspistill --focus’ command trick and some patience, you can get quite fine control over the focus. You DO need a monitor connected via the HDMI port though; the preview image does not appear through VNC or PuTTY sessions.

As always, it’s best to close the aperture ring a little to increase depth of field. I usually stop down to f/2.8, which is still bright; you can stop down further if you are having problems.

Light pollution

We sit just outside an expanding town which is switching to LED street lighting, so light pollution is an increasing problem. I have purchased a simple ‘light pollution’ filter to add to the 50mm Nikkor lens. I will be testing it as conditions allow; I wonder if it helps, and I hope it doesn’t block infrared!


Other builds

As mentioned earlier, quite a few builds of the Instructables project are now underway. The first ‘I made this’ post has appeared (well done Jeff!), and from the messages I have seen there are a few more nearing completion.

It looks like the most common pain points have been the PCB (see above) and sourcing the motors and worm gear components. Hopefully PCB sourcing is easier now with the published Gerber files. I saw a couple of cases where people shared a PCB order to reduce costs.

For the worm gear, I wonder if a future design could switch to planetary gearboxes on the Nema17 motors instead. They seem to be more widely available, and can even be purchased as complete units. They would require a rethink of the drive mechanism, but I have ideas already.
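For a sense of what the gearing buys you, here is a quick pointing-resolution calculation. All of the numbers – the microstepping level and the gearbox ratios – are illustrative assumptions for common off-the-shelf parts, not the project’s actual figures:

```python
# Pointing resolution for an assumed Nema17 + planetary gearbox setup.
FULL_STEPS = 200   # typical Nema17: 1.8 degrees per full step
MICROSTEPS = 32    # assumed driver microstepping level

for ratio in (5, 19, 50):   # common off-the-shelf planetary ratios
    deg_per_microstep = 360 / (FULL_STEPS * MICROSTEPS * ratio)
    arcsec = deg_per_microstep * 3600
    print(f"{ratio}:1 gearbox -> {arcsec:.2f} arc-seconds per microstep")
```

A higher ratio gives finer pointing but a slower maximum slew rate, which is exactly the kind of trade-off a drive-mechanism rethink would need to weigh.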

At least one builder is improving the weatherproofing of the design, that will be exciting to see when it is ready. I think there is a lot to learn from that development if it happens.

There are a couple of really interesting alternative motor controller designs out there too, including alternative ways to power/reset the Tiny2040 as well.


Off the shelf motorcontrollers

I mentioned in December that I hadn’t found a suitable off-the-shelf motor controller board yet. Well, in the tech world a month is a long time. I recently came across an announcement from Pimoroni about their new Yukon board, and the specifications sound interesting. It supports high-current stepper motor drivers, has a modular design, and includes an onboard RP2040. There’s a fully featured builder’s kit available, but you can also buy the bare board and individual driver components. Pimoroni’s website has a couple of overview videos, and there’s an example robot project on YouTube by Kevin McAleer. I’d like to try one of these at some point, IF I find the time. Maybe someone else will give it a try?

Pimoroni’s image of their new Yukon robotics board. The specification sounds really useful for telescope control too.
Pimoroni’s website image of their Yukon board. Haven’t tested it yet, but the specification sounds really useful for controlling the telescope motors without having to build your own PCB.

So, quite a bit to test now. Here’s hoping for some clear, calm, dark skies before the winter is over!