The Raspberry Pi Foundation have posted news of their new 8-megapixel camera, which went on sale today. Members of the Pi in the Sky project have since confirmed compatibility, so anyone flying a PITS board can look forward to even better SSDV images from near space.
The 5-megapixel visible-light camera board was our first official accessory back in 2013, and it remains one of your favourite add-ons. It has found its way into a bunch of fun projects, including telescopes, kites, science lessons and of course the Naturebytes camera trap. It was soon joined by the Pi NoIR infrared-sensitive version, which not only let you see in the dark, but also opened the door to hyperspectral imaging hacks.
As many of you know, the OmniVision OV5647 sensor used in both boards was end-of-lifed at the end of 2014. Our partners both bought up large stockpiles, but these are now almost completely depleted, so we needed to do something new. Fortunately, we’d already struck up a conversation with Sony’s image sensor division, and so in the nick of time we’re able to announce the immediate availability of both visible-light and infrared cameras based on the Sony IMX219 8-megapixel sensor, at the same low price of $25. They’re available today from our partners RS Components and element14, and should make their way to your favourite reseller soon.
In our testing, IMX219 has proven to be a fantastic choice. You can read all the gory details about IMX219 and the Exmor R back-illuminated sensor architecture on Sony’s website, but suffice to say this is more than just a resolution upgrade: it’s a leap forward in image quality, colour fidelity and low-light performance.
VideoCore IV includes a sophisticated image sensor pipeline (ISP). This converts “raw” Bayer-format RGB input images from the sensor into YUV-format output images, while correcting for sensor and module artefacts such as thermal and shot noise, defective pixels, lens shading and image distortion. Tuning the ISP to work with a particular sensor is a time-consuming, specialist activity: there are only a handful of people with the necessary skills, and we’re very lucky that Naush Patuck, formerly of Broadcom’s imaging team, volunteered to take this on for IMX219.
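The VideoCore IV ISP itself is proprietary, but the core Bayer-to-YUV idea it describes can be sketched in a few lines. This is a deliberately naive illustration, not the Pi's actual pipeline: it averages each 2×2 RGGB cell into one RGB pixel (real ISPs use edge-aware demosaicing at full resolution) and then applies the standard BT.601 RGB-to-YUV matrix.

```python
import numpy as np

def demosaic_rggb(raw):
    """Toy demosaic of an RGGB Bayer mosaic: collapse each 2x2 cell
    into one RGB pixel (half-resolution output). Real ISPs interpolate
    missing colour samples at every pixel instead."""
    r  = raw[0::2, 0::2].astype(np.float32)
    g1 = raw[0::2, 1::2].astype(np.float32)
    g2 = raw[1::2, 0::2].astype(np.float32)
    b  = raw[1::2, 1::2].astype(np.float32)
    return np.stack([r, (g1 + g2) / 2.0, b], axis=-1)

def rgb_to_yuv(rgb):
    """Full-range BT.601 RGB -> YUV conversion matrix."""
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.169, -0.331,  0.500],
                  [ 0.500, -0.419, -0.081]])
    return rgb @ m.T

# A uniform mid-grey mosaic gives equal R, G and B after demosaic,
# and therefore zero chroma (U = V = 0) after conversion.
raw = np.full((4, 4), 128, dtype=np.uint16)
yuv = rgb_to_yuv(demosaic_rggb(raw))
```

The noise, defective-pixel, shading and distortion corrections mentioned above would all sit between these two steps in a real pipeline.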
Regarding the tuning process, I guess you could say the bulk of the effort went into the lens shading and AWB tuning. Apart from the fixed shading correction, our auto lens shading algorithm takes care of module-to-module manufacturing variations. AWB is tricky because we must ensure correct results over a large section of the colour temperature curve; in the case of the IMX219, we used images illuminated by light sources from 1800K [very “warm” reddish light] all the way up to 16000K [very “cool” bluish light].
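The shading correction mentioned above can be pictured as a per-pixel gain map that rises towards the image corners to undo lens vignetting. The radial model below is purely illustrative (the `falloff` parameter and quadratic profile are assumptions, not the tuned IMX219 tables), but it shows the principle:

```python
import numpy as np

def shading_gain_map(h, w, falloff=0.5):
    """Hypothetical radial vignetting model: brightness falls off
    towards the corners, so correction gain grows with normalised
    distance from the image centre (1.0 at centre, 1 + falloff at
    the corners)."""
    y, x = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.hypot((y - cy) / cy, (x - cx) / cx) / np.sqrt(2)
    return 1.0 + falloff * r ** 2

def correct_shading(img, gains):
    """Apply the position-dependent gain to each pixel."""
    return img * gains

# A flat grey scene darkened by matching vignetting comes back uniform.
gains = shading_gain_map(8, 8)
vignetted = 100.0 / gains          # simulated corner fall-off
corrected = correct_shading(vignetted, gains)
```

In practice the fixed table is measured per sensor design, and the auto lens shading algorithm then adapts it to each individual module.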
The goal of auto white balance (AWB) is to recover the “true” colours in a scene regardless of the colour temperature of the light illuminating it: filming a white object should result in white pixels in sunlight, or under LED, fluorescent or incandescent lights. You can see from these pairs of before and after images that Naush’s tune does a great job under very challenging conditions.
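Production AWB tunes are far more sophisticated than this, but the classic "grey-world" heuristic gives a feel for what an AWB algorithm does: assume the scene averages to grey, then scale the red and blue channels so their means match the green mean. This sketch is an assumption for illustration only, not the algorithm Naush tuned:

```python
import numpy as np

def grey_world_awb(rgb):
    """Grey-world auto white balance: compute per-channel means and
    apply gains that equalise R and B with G (green gain is 1.0)."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    gains = means[1] / means
    return rgb * gains

# A flat grey card under warm (reddish) light: red is lifted and blue
# suppressed; after balancing, all three channels should match.
scene = np.stack([np.full((2, 2), 180.0),   # R boosted by warm light
                  np.full((2, 2), 120.0),   # G
                  np.full((2, 2), 60.0)],   # B suppressed
                 axis=-1)
balanced = grey_world_awb(scene)
```

Grey-world fails on scenes dominated by one colour, which is exactly why a real tune is validated against many light sources across the colour temperature curve, as described above.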
As always, we’re indebted to a host of people for their help getting these products out of the door. Dave Stevenson and James Hughes (hope you and Elaine are having a great honeymoon, James!) wrote most of our camera platform code. Mike Stimson designed the board (his second Raspberry Pi product after Zero). Phil Holden, Shinichi Goseki, Qiang Li and many others at Sony went out of their way to help us get access to the information Naush needed to tune the ISP.