OpenAI Reveals Sora, a Tool to Make Instant Videos from Written Prompts 

The OpenAI logo is seen on a mobile phone in front of a computer screen displaying output from ChatGPT, March 21, 2023, in Boston. (AP)

The maker of ChatGPT on Thursday unveiled its next leap into generative artificial intelligence with a tool that instantly makes short videos in response to written commands.

San Francisco-based OpenAI’s new text-to-video generator, called Sora, isn’t the first of its kind. Google, Meta and the startup Runway ML are among the other companies to have demonstrated similar technology.

But the high quality of videos displayed by OpenAI — some after CEO Sam Altman asked social media users to send in ideas for written prompts — astounded observers while also raising fears about the ethical and societal implications.

“An instructional cooking session for homemade gnocchi hosted by a grandmother social media influencer set in a rustic Tuscan country kitchen with cinematic lighting,” was a prompt suggested on X by a freelance photographer from New Hampshire. Altman responded a short time later with a realistic video that depicted what the prompt described.

The tool isn’t yet publicly available, and OpenAI has revealed limited information about how it was built. The company, which has been sued by some authors and The New York Times over its use of copyrighted written works to train ChatGPT, also hasn’t disclosed what imagery and video sources were used to train Sora. (OpenAI pays an undisclosed fee to The Associated Press to license its text news archive.)

OpenAI said in a blog post that it’s engaging with artists, policymakers and others before releasing the new tool to the public.

“We are working with red teamers — domain experts in areas like misinformation, hateful content, and bias — who will be adversarially testing the model,” the company said. “We’re also building tools to help detect misleading content such as a detection classifier that can tell when a video was generated by Sora.”



US Judge Finds Israel's NSO Group Liable for Hacking in WhatsApp Lawsuit

Israeli cyber firm NSO Group's exhibition stand is seen at "ISDEF 2019", an international defense and homeland security expo, in Tel Aviv, Israel June 4, 2019. REUTERS/Keren Manor/File Photo

A US judge ruled on Friday in favor of Meta Platforms' WhatsApp in a lawsuit accusing Israel's NSO Group of exploiting a bug in the messaging app to install spy software allowing unauthorized surveillance.

US District Judge Phyllis Hamilton in Oakland, California, granted a motion by WhatsApp and found NSO liable for hacking and breach of contract.

The case will now proceed to a trial only on the issue of damages, Hamilton said. NSO Group did not immediately respond to an emailed request for comment, according to Reuters.

Will Cathcart, the head of WhatsApp, said the ruling is a win for privacy.

"We spent five years presenting our case because we firmly believe that spyware companies could not hide behind immunity or avoid accountability for their unlawful actions," Cathcart said in a social media post.

"Surveillance companies should be on notice that illegal spying will not be tolerated."

Cybersecurity experts welcomed the judgment.

John Scott-Railton, a senior researcher with Canadian internet watchdog Citizen Lab — which first brought to light NSO’s Pegasus spyware in 2016 — called the judgment a landmark ruling with “huge implications for the spyware industry.”

“The entire industry has hidden behind the claim that whatever their customers do with their hacking tools, it's not their responsibility,” he said in an instant message. “Today's ruling makes it clear that NSO Group is in fact responsible for breaking numerous laws.”

WhatsApp in 2019 sued NSO seeking an injunction and damages, accusing it of accessing WhatsApp servers without permission six months earlier to install the Pegasus software on victims' mobile devices. The lawsuit alleged the intrusion allowed the surveillance of 1,400 people, including journalists, human rights activists and dissidents.

NSO had argued that Pegasus helps law enforcement and intelligence agencies fight crime and protect national security and that its technology is intended to help catch terrorists, pedophiles and hardened criminals.

NSO appealed a trial judge's 2020 refusal to award it "conduct-based immunity," a common law doctrine protecting foreign officials acting in their official capacity.

Upholding that ruling in 2021, the San Francisco-based 9th US Circuit Court of Appeals called it an "easy case," finding that NSO's mere licensing of Pegasus and provision of technical support did not entitle it to immunity under the Foreign Sovereign Immunities Act, a federal law that took precedence over common-law immunity doctrines.

The US Supreme Court last year turned away NSO's appeal of the lower court's decision, allowing the lawsuit to proceed.