Night Mode made filming the Northern Lights no problem (and that’s crazy)

Pictured: Image of the Northern Lights in Iceland taken using a Google Pixel 9 Pro XL
Google doesn’t get enough credit for making it this easy to film the Northern Lights.
Fergus Halliday
Nov 11, 2024
6 min read

The stakes for low-light camera performance rarely get higher than they do with the Northern Lights.

Visible in a handful of locations during a specific time of year, the aurora is one of the most challenging things on Earth to capture with a camera. Taking pictures of the Northern Lights used to require a winning combination of luck, skill, equipment and expertise.

Nowadays, all you need to do is swipe over to Night Mode and hit the button. Seeing the aurora in person during a visit to Iceland earlier this year, I was genuinely blown away not just by how well the Google Pixel 9 Pro XL managed to capture the phenomenon but by how painless the experience was.

That outcome is no accident. It’s the by-product of years of refinement. While a dedicated astrophotography mode was originally introduced with the Google Pixel 4, the team behind the feature has continued to build on that foundation in the years since. 

Speaking to Reviews.org, Google product manager Michael Specht cited his background as a professional photographer and expressed a deep familiarity with the pains of astro and low-light photography. 

“I’ve seen the Northern Lights twice in my life and not getting the right image when you’re there is frustrating and disappointing. We’ve worked really hard to make sure that regardless of your skill level and experience with a camera you can take really amazing images on Pixel,” he said.

Asked to describe the challenges that he and the rest of the Pixel team are trying to solve when it comes to edge cases like this one, Specht started by pointing out the differences between dedicated and smartphone cameras. The biggest of these is size.

“Dedicated cameras come in all shapes and sizes and forms. Mobile cameras still have to fit in your pocket so we are essentially stuck with a physics issue where we can only fit a certain size of sensor if we want to keep the device able to fit into your pocket as a customer.”
Image of the Northern Lights in Iceland taken using a Google Pixel 9 Pro XL

“The smaller the sensor the less light sensitive it is so as you dig into low-light or astrophotography, the problem becomes harder and harder to overcome,” he said.

On Google’s Pixel hardware, Night Mode is much more than the long exposure setting it’s often mistaken for. 

“It’s a blending of Google’s computational imaging pipeline and traditional long exposure photography,” Specht explained. 

Google has a database of thousands of raw images that it uses to refine its computational photography efforts. It also incorporates more personal feedback from the in-house photographers who cultivate that resource.

“This allows us to learn how to tune better, build better algorithms that are ML-based and get feedback from them [by] asking ‘How did this feel to actually go take this image’?”

Every time you take an image with the camera on a Pixel smartphone, the finished product isn’t what you’d get from a traditional camera. Instead of just capturing the light as it hits the sensor and letting the chips fall where they may, Google’s smartphones capture a string of nearly identical images at different exposure times and then use an algorithm to merge those fragments into one cohesive whole.
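For the technically curious, the core idea can be sketched in a few lines of Python. This is a classic exposure-weighted merge rather than anything resembling Google’s actual HDR+ pipeline, and every name in it is illustrative:

```python
# Toy exposure merge: not Google's HDR+ pipeline, just the underlying idea.
import numpy as np

def merge_burst(frames, exposure_times):
    """Merge near-identical frames shot at different shutter speeds.

    frames: list of float arrays scaled to [0, 1]
    exposure_times: shutter time in seconds for each frame
    """
    acc = np.zeros_like(frames[0])
    weight = np.zeros_like(frames[0])
    for img, t in zip(frames, exposure_times):
        # Trust mid-tones most; clipped highlights and crushed shadows least
        w = 1.0 - np.abs(img - 0.5) * 2.0
        # Normalise each frame by its exposure onto a common radiance scale
        acc += w * (img / t)
        weight += w
    radiance = acc / np.maximum(weight, 1e-6)
    # Rescale for display; a real pipeline would tone-map instead
    return np.clip(radiance / radiance.max(), 0.0, 1.0)
```

The real pipeline also has to align the frames and reject anything that moved between captures, but the principle of averaging many noisy exposures into one cleaner image is the same.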

The rise of generative AI has prompted many in the tech world to question what a photo even is, but there’s a case to be made that smartphones capturing HDR images in this way crossed that threshold years ago.

The Night Mode and astro settings found on the Google Pixel 9 and Pixel 9 Pro apply this same process over a longer timeframe. For each frame involved, your phone is generating a synthetic image that it can then use to produce a long exposure one. 

“With astro mode, we’re taking 15 frames – up to 16 seconds – per exposure and that’s a pretty hard thing to go and match these stars together that don’t line up as hot pixels and merge those together, denoise in a way and keep the actual stars [and] keep stuff that really should be there,” Specht explained.
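The hot pixel problem Specht describes has a well-worn answer in traditional astrophotography: stack the frames and reject per-pixel outliers, since a hot pixel fires in one frame while a real star shows up in all of them. Here’s a toy sigma-clipping sketch, which assumes the frames have already been aligned; as he notes, the alignment is the genuinely hard part on a phone:

```python
# Toy sigma-clipped stacking: reject outliers (e.g. hot pixels), then average.
import numpy as np

def stack_astro_frames(frames, sigma=3.0):
    """frames: list of pre-aligned float arrays of identical shape."""
    stack = np.stack(frames)              # (n_frames, height, width)
    median = np.median(stack, axis=0)
    std = np.std(stack, axis=0) + 1e-6
    # A sample far from the per-pixel median across the stack is an outlier;
    # real stars persist frame to frame, hot pixels and noise spikes don't.
    mask = np.abs(stack - median) < sigma * std
    kept = np.sum(stack * mask, axis=0)
    return kept / np.maximum(mask.sum(axis=0), 1)
```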

Asked how features like the Night Boost mode introduced with last year’s Pixel 8 and Pixel 8 Pro answer those challenges, Specht framed the feature as an attempt to leverage Google’s unique cloud-based resources to offer customers something that its rivals cannot.

In practice, applying the Pixel camera’s entire imaging pipeline to low-light video footage would take a long time. It would also be extremely taxing on the mobile silicon inside a smartphone like the Pixel 9 Pro XL.

“It’s much quicker for customers to deliver them to the cloud, process them and deliver them back as quickly as possible,” Specht said.

“If we look down the road and as technology on devices gets faster and faster and more efficient [then] ideally we would bring these features that are in the cloud today back onto devices but ultimately that would leave even more headroom for more computationally intensive features as that technology changes too,” he added.

When it comes to niche corners of the smartphone photography landscape like this one, AI has proved to be something of a shortcut. However, for the Northern Lights at least, Specht is keen to stick with more traditional technical wizardry.

According to him, AI-powered measures like Samsung’s moon setting aren’t really required because the Pixel’s imaging pipeline is already robust enough to handle the use case without the benefit of more specific tuning. 

“Broadly, as a camera product, we do recognition of different scene types and adjust and tune differently and so we definitely have the AI smarts built into our pipeline to do specifically that but aurora is not one of those,” Specht said. 

He said that the goal for the Pixel camera team is to “stay true to life and what our eyes see”.

Of course, one of the most fascinating things about the Northern Lights is that they don’t actually look in person the way they do in pictures.

While probably every image you’ve ever seen of the phenomenon is a dazzling display of greens and yellows, casual aurora-chasers are likely to find that their eyes simply don’t absorb enough light to pick up the colors a camera sensor can. In that way, the Northern Lights represent something of a reversal of the usual challenge for people like Specht.

“Usually we’re chasing to capture the thing that our eyes can see so well and with the Northern Lights we’re kinda doing the opposite where our eyes are trying to catch up to seeing what the camera can see so well,” he said.
Image of the Northern Lights in Iceland taken using a Google Pixel 9 Pro XL

Even with advancements like Night Mode, though, one thing most smartphones still struggle with is taking video footage of the Northern Lights. All the computation that goes into rendering a single frame is multiplied manifold, and for modern smartphones like the iPhone 16, that amount of computation is no easy feat.

Fortunately, for those in the Pixel ecosystem, things became a lot easier with the recent introduction of the Astro Lapse camera mode. As the name suggests, this hybrid of astrophotography and timelapse automatically turns selected segments of a long exposure capture into animated clips. The Pixel camera can’t actually capture video footage of the aurora, but it can use the stills it does capture to generate something that’s just as good.
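Stripped of the clever processing, the mechanics are surprisingly simple: the camera already has a series of long-exposure stills, so a timelapse is just those stills written out at a video frame rate. A bare-bones illustration using OpenCV, with hypothetical file paths:

```python
# Assemble a folder of processed astro stills into a short clip.
# The directory and file names here are made up for illustration.
import cv2
import glob

stills = sorted(glob.glob("astro_capture/*.png"))
first = cv2.imread(stills[0])
height, width = first.shape[:2]

out = cv2.VideoWriter("astro_lapse.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), 24, (width, height))
for path in stills:
    out.write(cv2.imread(path))
out.release()
```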

“What’s amazing is that you had it instantly available. You could share it and see it and watch it with your own eyes,” Google product manager Maayan Rossman explained. 

As for what’s coming in the future, Specht couldn’t comment on any upcoming features but emphasized the importance of balancing fidelity and functionality. 

“As someone coming from an imaging background, image quality is very important to me and I always want to walk away with the best image possible. I also have to think about [how] my mom is going to use this phone too and maybe she wants the best image possible but isn’t going to wait four minutes for an astro image.”

To him, the convenience of using features that let you get the most out of the Pixel camera is just as important a challenge as the quality of those images.

“How can we improve the experiences exponentially without degrading the quality? It’s not always about getting the best quality possible, it could be how do we get really good quality but much quicker,” he explained.

That balancing act is a key concern when it comes to determining what’s next for the Pixel camera. Specht said that the roadmap for building on or adding new features to Night Mode is more or less the same as its daytime counterpart.

“We [ask] where are the areas and new technologies that we can develop to squeeze out the next quality gain for these really tough situations. Whether we’re developing a new ML-based denoiser to improve noise quality in minimal light or a better merge and alignment of the multi-frames so we get better and sharper results.” 

“Maybe the denoiser needs to work less because we have more frames we can average together or maybe it’s [a case] where our computation imaging pipeline is getting better where we need less frames or exposure time and our user experience is getting better,” he said.

The most impressive feats the camera on the Pixel 9 Pro XL is capable of might be found on the fringes, but the foundation is where the magic happens. You’ll probably never use the most extreme things the Google Pixel’s camera can do, yet what’s happening under the hood is almost as incredible as the aurora it can be used to capture.

Written by Fergus Halliday
Fergus Halliday is a journalist and editor for Reviews.org. He’s written about technology, telecommunications, gaming and more for over a decade. He got his start writing in high school and began his full-time career as the Editor of PC World Australia. Fergus has made the MCV 30 Under 30 list, been a finalist for seven categories at the IT Journalism Awards and won Most Controversial Writer at the 2022 Consensus Awards. He has been published in Gizmodo, Kotaku, GamesHub, Press Start, Screen Rant, Superjump, Nestegg and more.
