EU 'Enforcer' Visits Twitter, Meta as New Rules Loom

EU commissioner Thierry Breton will meet with Meta's Mark Zuckerberg and Twitter owner Elon Musk, who took over the highly influential platform in 2022. Ludovic MARIN / POOL/AFP/File

The EU commissioner in charge of enforcing Europe's new landmark rules on online content is heading to San Francisco on Thursday to ensure that the big platforms are ready.

The two-day visit by Thierry Breton comes just weeks before the European Union's Digital Services Act (DSA) comes into full force for the world's biggest platforms, including Facebook and Instagram, both owned by Meta, as well as TikTok and Twitter.

Breton will meet with Meta's Mark Zuckerberg and Twitter owner Elon Musk, who took over the highly influential platform late last year.

All eyes are on Musk, who since taking over Twitter has, sometimes abruptly, loosened many of the platform's rules on what speech is allowed, even when it is offensive or spreads hate and misinformation, in direct opposition to the EU's new rules.

Breton also plans to meet in California with Sam Altman, the chief executive of OpenAI, the tech company behind ChatGPT, as well as with the boss of AI chipmaker Nvidia.

EU lawmakers are in final negotiations to complete the AI Act, another proposed European law with the potential to exert huge influence over US big tech companies.

"I am the enforcer. I represent the law, which is the will of the state and the people," Breton said to Politico last month when announcing the trip.

In an effort to reassure the Europeans, Musk has agreed that Twitter will undergo a DSA "stress test" to see whether his platform meets the EU's standards, though the results will not be made public.

On a visit to Paris last week, Musk said he had every intention of meeting the demands of the DSA.

But with Twitter's payroll cut to the bone and content moderation teams decimated, observers doubt whether Musk is in a position to stand by his commitment.

'Easy target'

The DSA is one of the most ambitious pieces of legislation to regulate online content since the advent of social media, imposing major obligations on how the world's biggest platforms handle the free flow of speech.

Like the EU's General Data Protection Regulation, the DSA is expected to become a global benchmark as governments worldwide struggle to find ways to rein in the excesses of social media.

To meet the new rules, Twitter, Meta, TikTok and other platforms will have to invest heavily in building compliance teams just as big tech companies have been laying off staff, including their content moderation workforces.

Under the DSA, 19 platforms have been designated as "Very Large Online Platforms," which will be subject to specially designated rules beginning on August 25, when the regulation takes full effect.

"It's going to come down to what the first enforcement action looks like. Who will be made an example of?" said Yoel Roth, the former head of Trust and Safety at Twitter, who is now a Technology Policy Fellow at UC Berkeley.

"I think my former employer is an easy target, but what does that look like?" he said, in an interview with AFP.

Roth said that the DSA's biggest challenge for big platforms will be the transparency requirements.

Under the DSA, Meta, Twitter and others will have to provide officials and researchers unprecedented access to their algorithms and content decisions.

This will be a particular challenge for Meta, which has severely limited third-party access to its data since the 2018 Cambridge Analytica scandal, Roth said.

And in a push to make money, Twitter and Reddit have also restricted data access by charging high fees for outsiders, including researchers, to use their APIs, which until recently were free.

The wide-ranging DSA has many other provisions, including an obligation that platforms designate a representative in the EU who would be responsible for content matters.

Users will also be handed unprecedented rights to lodge an appeal when their content is taken down by a platform.

Major violations of DSA rules could see tech giants hit with fines of as much as six percent of annual turnover and, if violations persist, banned outright from the EU as a last resort.



Can AI Make Video Games More Immersive? Some Studios Turn to AI-Fueled NPCs for More Interaction

The AI (Artificial Intelligence) letters and robot hand are placed on computer motherboard in this illustration taken, June 23, 2023. (Reuters)

For decades, video games have relied on scripted, stilted interactions with non-player characters to help shepherd gamers in their journeys. But as artificial intelligence technology improves, game studios are experimenting with generative AI to help build environments, assist game writers in crafting NPC dialogue and lend video games the improvisational spontaneity once reserved for table-top role-playing games.

In the multiplayer game “Retail Mage,” players help run a magical furniture store and assist customers in hopes of earning a five-star review. As a salesperson — and wizard — they can pick up and examine items or tell the system what they'd like to do with a product, such as deconstruct chairs for parts or tear a page from a book to write a note to a shopper.

A player’s interactions with the shop and NPCs around them — from gameplay mechanics to content and dialogue creation — are fueled by AI rather than a predetermined script to create more options for chatting and using objects in the shop.

“We believe generative AI can unlock a new kind of gameplay where the world is more responsive and more able to meet players at their creativity and the things that they come up with and the stories they want to tell inside a fantasy setting that we create for them,” said Michael Yichao, cofounder of Jam & Tea Studios, which created “Retail Mage.”

The typical NPC experience often leaves something to be desired. Pre-scripted interactions with someone meant to pass along a quest typically come with a handful of chatting options that lead to the same conclusion: players get the information they need and continue on. Game developers and AI companies say that by using generative AI tech, they aim to create a richer experience that allows for more nuanced relationships with the people and worlds that designers build.

Generative AI could also provide more opportunities for players to go off-script and create their own stories if designers can craft environments that feel more alive and can react to players' choices in real-time.

Tech companies continue to develop AI for games, even as developers debate how, and whether, they’ll use AI in their products. Nvidia created its ACE technologies to bring so-called “digital humans” to life with generative AI. Inworld AI provides developers with a platform for generative NPC behavior and dialogue. Gaming company Ubisoft said last year that it uses Ghostwriter, an in-house AI tool, to help write some NPC dialogue without replacing the video game writer.

A report released by the Game Developers Conference in January found that nearly half of developers surveyed said generative AI tools are currently being used in their workplace, with 31% saying they personally use those tools. Developers at indie studios were most likely to use generative AI, with 37% reporting using the tech.

Still, roughly four out of five developers said they worry about the ethical use of AI. Carl Kwoh, Jam & Tea's CEO, said AI should be used responsibly alongside creators to elevate stories — not to replace them.

“That’s always been the goal: How can we use this tool to create an experience that makes players more connected to each other?” said Kwoh, who is also one of the company’s founders. “They can tell stories that they couldn’t tell before.”

Using AI to provide NPCs with endless things to say is “definitely a perk,” Yichao said, but “content without meaning is just endless noise.” That's why Jam & Tea uses AI — through Google's Gemma 2 and its own servers hosted on Amazon — to give NPCs the ability to do more than respond, he said. They can look for objects as they’re shopping or respond to other NPCs to add “more life and reactivity than a typically scripted encounter.”

“I’ve watched players turn our shopping experience into a bit of a dating sim as they flirt with customers and then NPCs come up with very realistic responses,” he said. “It’s been really fun to see the game react dynamically to what players bring to the table.”

Demonstrating a conversation with an NPC in the game “Mecha BREAK,” in which players battle war machines, Ike Nnole said that Nvidia has made its AI “humans” respond faster than they previously could by using small language models. Using Nvidia's AI, players can interact with the mechanic, Martel, by asking her to do things like customize the color of a mech machine.

“Typically, a gamer would go through menus to do all this,” said Nnole, a senior product marketing manager at Nvidia. “Now it could be a much more interactive, much quicker experience.”

Artificial Agency, a Canadian AI company, built an engine that allows developers to bring AI into any part of their game — not only NPCs, but also companions and “overseer agents” that can steer a player toward content they’re missing. The AI can also create tutorials to teach players skills they lack so they can have more fun in-game, the company said.

“One way we like to put it is putting a game designer on the shoulder of everyone as they’re playing the game,” said Alex Kearney, cofounder of Artificial Agency. The company’s AI engine can be integrated at any stage of the game development cycle, she said.

Brian Tanner, Artificial Agency's CEO, said scripting every possible outcome of a game can be tedious and difficult to test. Their system allows designers to act more like directors, he said, by telling characters more about their motivation and background.

"These characters can improvise on the spot depending on what’s actually happening in the game,” Tanner said.

It's easy to run into a game's guardrails, Tanner said, where NPCs keep repeating the same phrase regardless of how players interact with them. But as AI continues to evolve, that will change, he added.

“It is truly going to feel like the world’s alive and like everything really reacts to exactly what’s happening," he said. “That’s going to add tremendous realism.”