How to Use AI to Edit and Generate Stunning Photos

Photo credit: Charles Desmarais

By Brian X. Chen

Much of the hype and fear around generative AI has centered on text. But there have also been rapid and dramatic developments in systems that can generate images. In many cases, these share a similar structure with text-based generative AI, but they can also be much weirder — and lend themselves to some very fun creative pursuits.

Image generators are trained on billions of images, which enables them to produce new creations that were once the sole dominion of painters and other artists. Sometimes experts can’t tell the difference between AI-created images and actual photographs (a circumstance that has fueled dangerous misinformation campaigns in addition to fun creations).

Compared with products like ChatGPT, image-generating AI tools are not as well developed. They require jumping through a few more hoops and may cost a bit of money. But if you’re interested in learning the ropes, there’s no better time to start.

AI Photoshop

Last week, Adobe added a generative AI feature into a beta version of Photoshop, its iconic graphics software, and creators on social networks like TikTok and Instagram have been buzzing about it ever since.

When I tested the new feature, called “generative fill,” I was impressed with how quickly and competently the AI carried out tasks that would have taken me at least an hour to do on my own. In less than five minutes and with only a few clicks, I used the feature to remove objects, add objects and swap backgrounds.

(To experiment with these tools yourself, start by signing up for a free trial of Adobe Creative Suite. Then, install the new Adobe Photoshop beta, which includes generative fill.)

Once you have Photoshop beta installed, import a photo and try these tricks:

To change a background, click the “object selection” icon (it has an arrow pointed at a box), then under the Select menu, click “inverse” to select the background. Next, click the “generative fill” box and type in a prompt — or leave it blank to let Photoshop come up with a new background concept for you.

I used these steps to edit a photo of my corgi, Max. I typed “kennel” for the prompt and clicked “generate” to replace the background.

To remove objects, use the lasso tool. In this photo of my motorcycle, I wanted to erase a tractor behind a fence in the background. I traced around the tractor, and then I clicked the “generative fill” box and hit “generate” without entering a prompt. The software correctly removed the tractor and filled in the background while leaving the fence intact.

Photo editors at The New York Times do not enhance or alter photos or generate images using artificial intelligence. But my first thought after testing generative fill was that photo editors working in other contexts, like marketing, could soon be out of work. When I shared this theory with Adobe’s chief technology officer, Ely Greenfield, he said that it might make photo editing more accessible, but he was optimistic that humans would still be needed.

“I can make really pretty images with it, but frankly, I still make boring images,” he said. “When I look at the content that artists create when you put this in their hands versus what I create, their stuff is so much more interesting because they know how to tell a story.”

I confess that what I’ve done with generative fill is far less exciting than what others have been posting on social media. Lorenzo Green, who tweets about AI, posted a collage of famous album covers, including Michael Jackson’s “Thriller” and Adele’s “21,” that were expanded with generative fill. The results were quite entertaining.

(One note: If installing Photoshop feels daunting, a quicker way to test Adobe’s AI is to visit the Adobe Firefly website. There, you can open the generative fill tool, upload an image and click the “add” tool to trace around a subject, such as a dog. Then click “background” and type in a prompt like “beach.”)

More image generators

Tools like DALL-E and Midjourney can create entirely new images in seconds. They work similarly to chatbots: You type in a text prompt — the more specific, the better.

To write a quality prompt, start with the medium you’d like to emulate, followed by the subject and any extra details. For example, typing “a photograph of a cat wearing a sweater in a brightly lit room” into the DALL-E prompt box will generate exactly that.
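That medium-subject-details formula can be sketched as a small helper function. This is purely a hypothetical illustration of how the pieces fit together, not part of any official DALL-E tooling:

```python
def build_prompt(medium, subject, details=""):
    """Compose an image-generation prompt: medium first, then subject, then extras."""
    prompt = f"{medium} of {subject}"
    if details:
        prompt += f" {details}"
    return prompt

# The example from the text:
print(build_prompt("a photograph", "a cat wearing a sweater", "in a brightly lit room"))
# → a photograph of a cat wearing a sweater in a brightly lit room
```

The point is simply that the most specific prompts tend to pin down all three slots; leaving out the details still yields a valid, if vaguer, prompt.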

DALL-E, which is owned by OpenAI, the maker of ChatGPT, was one of the first widely available AI image generators that was simple for people to use. For $15, you get 115 credits; one credit can be used to generate a set of four images.
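Some back-of-the-envelope arithmetic on that pricing, assuming every credit is spent on a standard four-image generation:

```python
price_dollars = 15.00   # cost of a credit pack, per the pricing above
credits = 115           # credits in the pack
images_per_credit = 4   # one credit yields a set of four images

per_set = price_dollars / credits            # cost of one four-image generation
per_image = per_set / images_per_credit      # effective cost of a single image

print(f"${per_set:.3f} per set, ${per_image:.3f} per image")
# → $0.130 per set, $0.033 per image
```

In other words, each generation costs about 13 cents, or roughly 3 cents per image.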

Midjourney, another popular image generator, is a work in progress, so the user experience is not as polished. The service costs $10 a month, and entering prompts can be a little more complicated, because it requires joining a separate messaging app, Discord. Nonetheless, the project can create high-quality, realistic images.

To use it, join Discord and then request an invitation to the Midjourney server. After joining the server, inside the chat box, type “/imagine” followed by a prompt. I typed “/imagine a manga cover of a corgi in a ninja turtle costume” and generated a set of convincing images.

Though it’s fine to type in a basic request, some have found obscure prompts that generated exceptional results. At Columbia University, Lance Weiler is teaching students how to leverage AI, including Midjourney, to produce artwork.

Whichever tool you use, bear in mind that the onus is on you to use this tech responsibly.

Technologists warn that image generators can increase the spread of deepfakes and misinformation. But the tools can also be used in positive and constructive ways, like making family photos look better and brainstorming artistic concepts.

The New York Times



Video Game Actors Are Voting on a New Contract. Here’s What It Means for AI in Gaming

A picketer holds a sign for the SAG-AFTRA video game strike at Warner Bros. Games headquarters on Aug. 1, 2024, in Burbank, Calif. (AP)

An 11-month strike by video game performers could formally end this week if members ratify a deal that delivers pay raises, control over their likenesses and artificial intelligence protections.

The agreement feels "like diamond amounts of pressure suddenly lifted," said Sarah Elmaleh, a voice actor and chair of the Screen Actors Guild-American Federation of Television and Radio Artists' interactive branch negotiating committee.

Union members have until Wednesday at 5 p.m. Pacific to vote on ratifying the tentative agreement.

Voice and body performers for video games raised concerns that unregulated use of AI could displace them and threaten their artistic autonomy.

"It’s obviously far from resolved," Elmaleh said. "But the idea that we’re in a zone where we might have concluded this feels like a lightening and a relief."

AI concerns are especially dire in the video game industry, where human performers infuse characters with distinctive movements, shrieks, falls and plot-twisting dialogue.

"I hope and I believe that our members, when they look back on this, will say all of the sacrifices and difficulty we put ourselves through to achieve this agreement will ultimately be worth it because we do have the key elements that we need to feel confident moving forward in this business," said Duncan Crabtree-Ireland, the SAG-AFTRA national executive director and chief negotiator.

Here’s a look at the contract currently up for vote, and what it means for the future of the video game industry.

How did the current strike play out?

Video game performers went on strike last July after nearly two years of failed negotiations with major game studios, as both sides remained split over generative AI regulations.

More than 160 games signed interim agreements accepting AI provisions SAG-AFTRA was seeking, the union said, which allowed some work to continue.

Video games are a massive global industry, generating an estimated $187 billion in 2024, according to the game market forecaster Newzoo.

"OD" and "Physint" were two games delayed by the strike during the filming and casting stages, the video game developer Hideo Kojima wrote in December. Riot Games, a video game developer, announced that same month that some new skins in "League of Legends" would have to use existing voice-overs, since new content couldn't be recorded by striking actors. Skins are cosmetic items that change a player's visual appearance and are sometimes equipped with new voice-overs and unique recorded lines.

The proposed contract "builds on three decades of successful partnership between the interactive entertainment industry and the union" to deliver "historic wage increases" and "industry-leading AI provisions," wrote Audrey Cooling, a spokesperson for the video game producers involved in the deal.

"We look forward to continuing to work with performers to create new and engaging entertainment experiences for billions of players throughout the world," Cooling wrote.

Video game performers had previously gone on strike in October 2016, with a tentative deal reached 11 months later. That strike helped secure a bonus compensation structure for voice actors and performance capture artists. The agreement was ratified with 90% support, with 10% of members voting.

The proposed contract secures an increase in performer compensation of just over 15% upon ratification and an additional 3% increase each year of the three-year contract.
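Assuming the three annual 3% raises compound on top of the initial 15% increase (the contract's exact compounding rules aren't spelled out here, so this is a rough sketch), the cumulative raise over the three-year term works out as follows:

```python
factor = 1.15          # just over 15% upon ratification
for _ in range(3):     # an additional 3% in each year of the three-year contract
    factor *= 1.03

print(f"cumulative raise ≈ {(factor - 1) * 100:.1f}%")
# → cumulative raise ≈ 25.7%
```

So under this compounding assumption, a performer's pay rate by the end of the contract would be roughly a quarter higher than before ratification.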

How would AI use change in video games?

AI concerns have taken center stage as industries across various sectors attempt to keep up with the fast-evolving technology. It’s a fight that Hollywood writers and actors undertook during the historic film and TV strikes that brought the industry to a halt in 2023.

"In the last few years, it’s become obvious that we are at an inflection point where rules of the road have to be set for AI, and if they aren’t, the consequences are potentially very serious," Crabtree-Ireland said. "I think that really made this negotiation extra important for all of us."

SAG-AFTRA leaders have billed the issues behind the labor dispute — and AI in particular — as an existential crisis for performers. Game voice actors and motion capture artists’ likenesses, they say, could be replicated by AI and used without their consent and without fair compensation.

The proposed contract delineates clear restrictions on when and how video game companies can create digital replicas, which use AI to generate new performances that weren't recorded by an actor.

Employers must obtain written permission from a performer to create a digital replica — consent that must be granted during the performer’s lifetime and remains valid after death unless otherwise limited, the contract states. Time spent creating a digital replica will be compensated as though it were the work time a new performance would have required.

The agreement also requires the employer to provide the performer with a usage report that details how the replica was used and calculates the expected compensation.

Elmaleh, who has been voice acting since 2010 and had to turn down projects throughout the strike, said securing these gains required voice actors to bring vulnerability and openness to the bargaining table.

"We talked a lot about the personal, the way it affects our displacement as workers and just the sustainability of our careers," Elmaleh said. "Our work involves your inner child. It’s being very vulnerable, it’s being playful."

What’s next for the video game industry?

The tentative agreement centers on consent, compensation and transparency, which union leaders say are key elements needed for the industry to keep progressing.

As the contract is considered by union members, Elmaleh and Crabtree-Ireland said further work needs to be done to ensure the provisions are as broad as necessary.

"Even though there’s a deal that’s been made now, and we’ve locked in a lot of really crucial protections and guardrails, the things that we haven’t been able to achieve yet, we’re going to be continuing to fight for them," Crabtree-Ireland said. "Every time these contracts expire is our chance to improve upon them."

Elmaleh said she hopes both the video game companies and performers can soon work collaboratively to develop guidelines on AI as the technology evolves — a process she said should start well before the proposed contract expires in October 2028.

Leading negotiations has felt like a full-time job for Elmaleh, who took on the role in a volunteer capacity. As those efforts wind down, she said she eagerly anticipates returning to video game acting in a landscape that is safer for performers.

Voice acting "is core to who I am. It’s why I fought so hard for this. I wouldn’t do this if I didn’t love what I do so much. I think it’s so special and worthy of protection," she said.