“This work comes from personal experience,” said Florian Koenigsberger, who leads Google’s image equity team. Koenigsberger’s mother is Jamaican and Black, his father German and white. His skin tone is relatively pale, but his brother is quite a bit darker. So those Thanksgiving family photos have always been an issue.
Shooting good images of Black or brown faces is famously challenging, especially when they share the frame with white faces, or when the background is brightly lit. If you expose the rest of the image correctly, a Black face can become an indistinct blur. But if you increase the exposure to bring out a Black person’s features, other things get overexposed. In addition, the extra light can overwhelm the natural warmth of Black skin and produce an ashen effect.
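The exposure bind described above can be sketched in a toy model. Everything here is hypothetical for illustration: the luminance values, the gain figures, and the 255 sensor ceiling stand in for a real camera's far more complex response.

```python
def expose(scene_luminance, gain, max_val=255):
    """Apply one global exposure gain, clipping at the sensor ceiling."""
    return [min(round(p * gain), max_val) for p in scene_luminance]

# Hypothetical scene: dark skin tones (~40-45) beside a bright window (~200-220).
scene = [40, 45, 200, 220]

# Exposed for the background: the dark regions stay murky and low-detail.
print(expose(scene, 1.0))   # [40, 45, 200, 220]

# Exposed for the faces: the background clips to pure white.
print(expose(scene, 4.0))   # [160, 180, 255, 255]
```

A single global gain can favor one end of the tonal range or the other, never both, which is the tradeoff the paragraph above describes.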
You can blame the technical limitations of the cameras, and you’d be partly right. But Lorna Roth, professor emerita in communication studies at Canada’s Concordia University, said the problem goes back to the era of film cameras.
For many years, the chemical formulas for film emulsions were designed specifically to make white skin look good. Industry leader Eastman Kodak even created a skin color reference card to help professional photographers calibrate their equipment. It was called a “Shirley card,” in honor of the first model who posed for it, a white woman.
Kodak could have improved its film’s rendering of dark skin. As a former lighting engineer at Black Entertainment Television told Roth: “If Black people had designed films, we wouldn’t be having this problem.” But for decades the company didn’t bother. In 1978, the French film director Jean-Luc Godard, hired to shoot a movie in Mozambique, refused to use Kodak film because it made Black people look bad on screen. Godard denounced the company as racist for failing to do better.
But it wasn’t malicious racism, according to Roth. Instead, it was sheer cluelessness. Kodak was run almost entirely by white people who never gave much thought to the needs of their non-white customers. And even as the civil rights movement roiled the nation during the 1950s and 1960s, Black citizens were focused on voting rights and desegregated schools. Demanding better camera film wasn’t a very high priority.
Kodak made efforts to improve starting in the 1960s, but at first it wasn’t because of race. Instead, advertising agencies that represented chocolate makers and lumber companies griped that their clients’ dark-colored products looked awful when shot with Kodak film. So Kodak began making films that did better with darker shades. In the 1990s, it launched a new Shirley card for camera calibration, featuring three women — Asian, Black and white.
But by then, film was being supplanted by digital cameras afflicted with the same dark-color problem. Again, technology is partly to blame. Ramesh Raskar, who runs the Camera Culture group at the Media Lab of the Massachusetts Institute of Technology, said that until recently, smartphone cameras weren’t capable of doing justice to every skin tone.
“Only for the last three or four years, maybe three years, have photo sensors been good enough to capture a wide enough dynamic range,” Raskar said.
But today’s digital camera makers have run out of excuses. Not only have the cameras gotten much better — so have the computer chips inside the phones. Every time you shoot a smartphone image, the phone’s processor instantly modifies it to produce the best possible image. It’s called computational photography, and it makes possible the extraordinarily rich, sharp images from today’s best phones.
The Pixel 6 uses a number of smart tricks. For instance, most high-end phone cameras don’t take a single photo when you hit the shutter. Instead, they shoot five or six or more, each with different light and color balance settings. Then the phone’s computer stitches these images together, using bits and pieces of each image to create the finished photo.
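The burst-and-merge idea can be sketched in miniature. This is a crude stand-in, not Google's actual algorithm: the gains, the mid-tone target of 128, and the "skip clipped pixels" heuristic are all assumptions made for the example.

```python
def capture_bracket(scene, gains=(0.5, 1.0, 2.0), max_val=255):
    """Simulate a burst: the same scene shot at several exposure settings."""
    return [[min(round(p * g), max_val) for p in scene] for g in gains]

def merge(frames, max_val=255, target=128):
    """For each pixel, keep the exposure whose value is closest to mid-tone
    and not clipped -- a crude stand-in for real exposure fusion."""
    merged = []
    for values in zip(*frames):
        usable = [v for v in values if v < max_val] or list(values)
        merged.append(min(usable, key=lambda v: abs(v - target)))
    return merged

scene = [30, 90, 240]               # shadow, mid-tone, near-clipping highlight
frames = capture_bracket(scene)
print(merge(frames))                # [60, 90, 120]
```

Each pixel in the merged result comes from whichever frame rendered it best, which is the "bits and pieces of each image" described above.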
The Pixel 6 software has been optimized to pick out the best-looking facial images from these multiple shots. If one shot gets the face right while putting too much light on the background, it will combine the good-looking facial shot with a better background image. At the same time, it uses artificial intelligence algorithms to adjust the color balance and lighting for each face, making sure each face is well lit and its skin tone accurate.
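The face-aware compositing step can be illustrated with a simplified sketch. The frame indices, the precomputed face mask, and the pixel values are all hypothetical; a real pipeline would detect faces and blend regions far more carefully.

```python
def fuse(frames, face_mask, pick_face, pick_bg):
    """Composite a single image from a burst: take face pixels from the
    frame that exposed the face well, background pixels from another.
    (Hypothetical, simplified: face_mask marks which pixels are face.)"""
    return [frames[pick_face][i] if face_mask[i] else frames[pick_bg][i]
            for i in range(len(face_mask))]

bg_frame = [40, 45, 200, 220]       # exposed for the bright background
face_frame = [120, 130, 255, 255]   # exposed for the face; background clipped
mask = [True, True, False, False]   # first two pixels belong to a face

print(fuse([bg_frame, face_frame], mask, pick_face=1, pick_bg=0))
# [120, 130, 200, 220]
```

The composite keeps the well-lit face from one exposure and the intact background from the other, rather than forcing one exposure on the whole frame.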
The overall effect is subtle, but quite noticeable. For instance, even in low light, the camera delivers clear images of brown faces, with lots of detail and rich coloring.
To teach the camera, Google analyzed huge image databases featuring human skin of every shade. And the company brought in a team of Black, Hispanic, and Asian photographers and videographers to help train its AI software.
“The tool that we’re saying is designed to work for you was also built with people who look like you,” Koenigsberger said.
It’s all stuff that camera makers could have done a few years ago, but recent racial-equity discussions may have put pressure on tech companies to act. And not just Google. The social network Snapchat has said that it will introduce software that will modify the performance of existing smartphone cameras, so they produce better photos of dark-skinned people. Apple says it has upgraded the AI in its latest iPhones for improved rendering of dark skin, and a spokeswoman for smartphone giant Samsung said the company’s latest model, the Galaxy S21, has also been tailored to do better with brown skin.
“I think that’s fantastic,” said Koenigsberger. “We should get to a place where this doesn’t have to be a competitive thing, right? Where everyone knows that no matter what tool they pick up, they will be seen, fairly, as they are.”