AI-driven tools on real photos
Mar. 5th, 2024 09:35 am
Not the article, it’s fine. It’s the photos. Mostly the top one.
It looks fake.
There’s a credited photographer: Ann Hermes. She’s 100% real – a working professional who’s done a lot of work for a lot of clients, and she’s easy to find.
And I don’t know what happened – what tool caused it or what – but the lead photo shouts AI PHOTOGRAPHY at me.
There are a few reasons for it – the central figure’s facial and head hair is a big one, something about the antique touchtone telephone on the left really bugs me (proportions? size? I can’t tell?), the lighting just feels odd, and some letters are kinda fucked up – but the details aren’t important. That’s not what bothers me.
What bothers me is that this is a most-likely-real photo that’s making me think it’s an AI-generated fake. And it bothers me because…
…see…
if people are going to start using tools on real photos that make them look a little fake, particularly if the ways they look a little fake are the same ways that AI-generated actual fakes look fake, then that’s incredibly bad.
So far we’ve been lucky enough in ’24 not to have to deal with really good fakes. We’ve mostly had trash fakes. But if real photographers start using tools that make their real photos look fake in the same ways as AI renderings, then it stops mattering very much – if at all – that the AI renderings aren’t really good fakes. We lose that little bit of edge we had against AI-driven disinformation.
I mean, maybe that’s not what happened here. Maybe his hair really is that kinda weirdly defined and she did some HDR tricks to bump up the definition and contrast not even using AI tools. Maybe she corrected some lens distortion and that’s why the phone looks funny. Maybe she boosted the black level a little to de-emphasise some dark areas – it’s tempting, I’ve done it, but you have to be careful with it or it looks weird. Maybe the light just was a little odd, and/or she set up some reflectors to make it that way because the scene was too high contrast otherwise. Or maybe she did it afterwards using ordinary levelling tools.
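(A black-level boost of the sort described above amounts to a simple pixel remap. The sketch below is a toy illustration of the general technique, not any particular editor’s implementation, assuming 8-bit grayscale values – push the black point too far and shadow detail vanishes, which is exactly the “looks weird” failure mode.)

```python
# Toy black-level boost on 8-bit pixel values: everything at or below
# the chosen black point is crushed to 0, and the remaining range is
# rescaled back up to 0-255. The black_point value here is illustrative.

def boost_black_level(pixels, black_point):
    """Remap 0-255 pixel values so black_point becomes the new 0."""
    out = []
    for v in pixels:
        if v <= black_point:
            out.append(0)  # shadow detail below the black point is lost
        else:
            # linearly rescale what's left back to the full 0-255 range
            out.append(round((v - black_point) * 255 / (255 - black_point)))
    return out

row = [5, 20, 40, 128, 250]
print(boost_black_level(row, 16))  # the 5 is crushed to 0; the rest stretches
```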
But however it happened – and whoever did it, be it the photographer, the layout artist, the web designer – it still came out setting off my AI alarms.
And that’s incredibly bad.
Fuck.
Posted via Solarbird{y|z|yz}, Collected.
no subject
Date: 2024-03-06 02:16 am (UTC)
Photo editing software is going increasingly AI because that's what users want. Well, more specifically what users want is "auto" mode. Seriously powerful photo editing tools have been available for years. The problem is, a lot of what they're capable of doing is expert-mode stuff. To really get the value of it, a person editing a photo has to:
(1) Understand, at some technical level, what is wrong with the photo.
(2) Have enough expertise with the tool to understand the transformations it offers and how to use them. And
(3) Have time to perform the edits.
Amateurs lack the know-how of #1 & #2. Pros may have those but are frequently challenged for time (#3). The answer to both is "auto" mode, where the computer makes expert decisions for you. That means AI.
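(The "auto" mode idea can be sketched concretely. The following is a hypothetical, minimal auto-contrast pass – not any real editor's code – in which the tool chooses the black and white points itself and performs the stretch, standing in for the expert decisions of #1 and #2 and the time of #3:)

```python
# Toy "auto levels" pass, assuming 8-bit grayscale pixels: automatically
# pick black/white points from the darkest and brightest ~2% of pixels,
# then stretch everything between them to full range. Real editors do
# far more, but this is the shape of "auto" mode: the tool picks the
# parameters so the user doesn't have to.

def auto_levels(pixels, clip_fraction=0.02):
    ordered = sorted(pixels)
    k = int(len(ordered) * clip_fraction)
    black = ordered[k]        # auto-chosen black point
    white = ordered[-1 - k]   # auto-chosen white point
    if white <= black:        # flat image: nothing to stretch
        return list(pixels)
    scale = 255 / (white - black)
    return [min(255, max(0, round((v - black) * scale))) for v in pixels]

low_contrast = [60, 70, 80, 90, 100, 110, 120]
print(auto_levels(low_contrast))  # stretched out to span 0-255
```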
Source: I'm an enthusiast photographer who's used digital imaging tools for 30 years.
no subject
Date: 2024-03-06 06:38 am (UTC)

no subject
Date: 2024-03-06 06:33 am (UTC)

no subject
Date: 2024-03-06 06:39 am (UTC)
This way lies the return of honour culture.
no subject
Date: 2024-03-06 01:49 pm (UTC)
I think a large part of what is going on is the lens used. The photographer seems to have been trying to get a large part of a small room from inside the room, with all of it in focus. So, a wide-angle lens (or a zoom all the way wide) and small aperture (for greater depth of field) were used. Probably a long exposure, too.
Wide-angles are used a lot in landscape photography, too.
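(The lens reasoning above can be put in rough numbers. This is a back-of-the-envelope sketch using the standard thin-lens depth-of-field approximations; the 24 mm / f/11 / 2 m scenario and the 0.03 mm full-frame circle of confusion are illustrative assumptions, not details taken from the actual photo.)

```python
# Why a wide lens + small aperture gets a whole small room in focus:
# standard depth-of-field approximations via the hyperfocal distance.
import math

def depth_of_field(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Return (near_limit_mm, far_limit_mm); far may be infinite."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm)  # H ~ f^2 / (N*c)
    near = hyperfocal * subject_mm / (hyperfocal + subject_mm)
    if subject_mm >= hyperfocal:
        far = math.inf  # acceptably sharp all the way out to infinity
    else:
        far = hyperfocal * subject_mm / (hyperfocal - subject_mm)
    return near, far

# 24 mm wide-angle at f/11, focused ~2 m into the room:
print(depth_of_field(24, 11, 2000))  # roughly 0.9 m to infinity: whole room sharp

# 85 mm portrait lens at f/2, same distance: only a thin slice is sharp
print(depth_of_field(85, 2, 2000))
```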