Google is trying to make the cameras on its phones more inclusive. We tested it against its competitors.
And so the back-and-forth began: Young and Joanna Luckey posed for photos taken with different smartphones, trying to determine which one gave the best results.
Today, the phones in our pockets can produce images that rival, and sometimes even outperform, dedicated cameras. Still, Young told me that people of color often feel marginalized by the photos they take, in part because smartphones don't always understand how to handle Black and brown faces.
“I suspect [clarity] is what a lot of people are looking for; they see a picture and say, ‘Hey, that looks clear,’” she said. “But does it look like me? Does it feel right? Does it capture the way my hair naturally looks?”
Google is one of many companies trying to make money selling smartphones, though according to research firm Canalys its devices make up only a small percentage of phones shipped in America. Even so, it has been the most vocal about making its cameras more inclusive. Starting in 2021, Google’s Pixel smartphones have come with “Real Tone,” a set of under-the-hood camera tweaks the company says it developed to take more flattering photos of subjects of color.
To test these claims, I took pictures of people visiting a San Francisco tourist hotspot with the $899 Google Pixel 7 Pro and compared the results with images from the $1,199 Samsung Galaxy S22 Ultra and the $1,099 Apple iPhone 14 Pro Max.
It didn't take long to see that Google and its rivals haven't fully solved the problem (not yet, anyway). The proof is in the pictures, and in how the subjects felt about them.
First, though, it helps to understand how smartphone cameras work.
Your phone’s camera makes decisions for you
Back when film photography was purely manual, getting a decent shot took real work. Beyond making sure you had the right film, you had to adjust the aperture and decide how long the shutter should stay open before pressing the button. Then the film had to be developed.
Your phone does all of this in the blink of an eye. Its computing power is exponentially greater than that of old point-and-shoot cameras, so it can adjust and process images faster than you can notice. More advanced smartphone cameras pull off even smarter tricks, like taking multiple shots of the same scene and combining the best parts of each.
In other words, when you tap the shutter button on the screen, you're not working alone; software is making decisions right alongside you.
This approach to producing images is called computational photography, and it's why your phone's photos occasionally look brighter and more colorful than the real world. The problem, according to Google, is that some of the key technologies that determine how a subject of color looks in a photo, such as camera sensors and processing algorithms, were mostly trained and tuned using photos of light-skinned people.
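To make the multi-shot trick concrete, here is a minimal sketch of one of the simplest computational-photography techniques: merging several frames of the same scene to reduce sensor noise. This is an illustration only, not Google's actual pipeline; the scene, noise levels, and frame count are invented for the example.

```python
import numpy as np

def merge_frames(frames):
    """Average several aligned exposures of the same scene.

    Averaging N noisy frames reduces random sensor noise by roughly
    a factor of sqrt(N), one of the simplest tricks behind
    computational photography's multi-shot modes.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate three noisy captures of the same "scene" (a flat gray patch).
rng = np.random.default_rng(0)
scene = np.full((4, 4), 128.0)          # the true, noise-free scene
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(3)]

# The merged frame is closer to the true scene than any single capture.
merged = merge_frames(frames)
```

Real phone pipelines do far more than averaging (they align frames, reject motion blur, and weight exposures), but the underlying idea of combining multiple captures is the same.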
To fix this, Google says it has diversified the sets of images used to train and tune these technologies.
“Over the past year, we have added more than 10,000 photos to the datasets used to fine-tune the Pixel camera,” Shenaz Zack, director of product management at Google, said when the Pixel 7 was unveiled earlier this year. “Through this work we adjusted exposure to better represent darker skin tones when working in low-light environments.”
Google also claims its Pixel phones can more accurately detect the faces of people with darker skin and adjust a photo's white balance to render their skin tones more faithfully.
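White-balance adjustment can be illustrated with the classic "gray-world" method, a simplified stand-in for what phone cameras do automatically; it is not the Pixel's actual algorithm, and the sample pixel values below are invented. The idea: scale each color channel so that, on average, the image comes out neutral rather than tinted.

```python
import numpy as np

def gray_world_white_balance(image):
    """Gray-world white balance: scale R, G, B so their means match.

    Assumes the average color of a scene should be neutral gray;
    any channel that averages too high (a color cast) gets scaled down.
    """
    img = image.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # mean R, G, B
    gains = channel_means.mean() / channel_means      # per-channel scale
    return np.clip(img * gains, 0, 255)

# A tiny 1x2 "image" with a warm orange cast: red channel too strong.
warm = np.array([[[200.0, 150.0, 100.0],
                  [180.0, 140.0, 90.0]]])
balanced = gray_world_white_balance(warm)
# After balancing, the R, G, and B channels have equal averages.
```

Gray-world famously fails on scenes that aren't gray on average, which is one reason naive white balance can distort darker skin tones; modern cameras use much more sophisticated, learned estimates.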
The company won't share every detail of how this works, but we found people willing to have their pictures taken in the kinds of circumstances Zack described, and we tried to see whether Google's claims held up.
To get a feel for the cameras' capabilities, I took one photo of people bathed in warm sunlight and another where the subjects were lit from the side. That's when I met Anthony Sturgess, a Bay Area native who works at a 3D-printing company, and Michelle Neal, his long-distance girlfriend visiting San Francisco from South Africa.
Both preferred the Pixel's results in the close-up, which Sturgess found particularly surprising; he's a Samsung guy, after all. For the photo with the well-lit tree behind them, though, Sturgess and Neal agreed that they preferred the iPhone's results.
Our take: In the first set, the Samsung phone let the yellow light from the tree overpower Sturgess's and Neal's natural skin tones, while the Pixel came very close to the iPhone. In the second photo, the Samsung phone lightened the scene significantly. The Pixel's image was striking, but the iPhone kept more of the warmth of their skin while still brightening their faces.
Cloudy days, difficult colors
Gloomy days aren't just frustrating; they can also wreak havoc on photos. How did the phones handle one?
Every year before Christmas, Dennis Santoyo and Lance Hobson drive to San Francisco for a bit of shopping and people-watching — I caught the two of them posing for selfies.
“The Samsung photo and, to a lesser extent, the Google Pixel photo appear to be overexposed, resulting in excessive brightness,” Hobson said. “The iPhone image seems to be the most accurate representation. It shows skin tones that are closer to reality than the others; the colors are crisp and warm, and they look very lifelike.”
Our take: Samsung's photo is a little more purple than the others, and the phone automatically smoothed away some of the detail in Santoyo's and Hobson's faces. iPhone versus Pixel is a matter of preference, though Santoyo's warm skin tone definitely shines more on the latter.
And Young and her mother? I photographed the pair at night, with their backs to Macy's bright, multistory display.
Young said that while she liked how crisp the iPhone's images were, she preferred the more natural colors the Samsung phone produced. “I would definitely say that our skin tone came out better using the Samsung camera,” she said.
The Pixel was a mixed bag: both women thought its result looked a bit “dim,” although Luckey said that gave the image a “graceful” quality.
Our take: The iPhone did a great job of highlighting Luckey's facial features, but the Galaxy made both subjects more striking. The Pixel's image, unfortunately, came out dark enough that it was hard to make out Luckey's entire face.
Google did not immediately respond to a request for comment about how these images are processed; neither did Apple or Samsung.
It's not surprising that everyone has different photo preferences. How you want to look on camera is deeply personal, rooted in your relationship with your own appearance, and those tastes may not align with what Samsung, Apple, or Google thinks is the best way to make you look like you.
In the end, it comes down to taste, which means no phone's processing will please everyone. Google's work has not gone unnoticed, however.
“Just for someone to come out and be like, ‘Hey, we want to make sure you’re represented in a good light and we want to make sure we capture everything’ — it’s definitely important,” Young said.