I attended the swap meet held by the Neon Temple, Tampa Bay’s security guild, where attendees were selling, swapping, or simply giving away old tech gear and books they no longer needed.
That’s where I found and took a photo of the relic above: a PCMCIA card (a name that got shortened to “PC Card”), which used to be a way of adding peripherals to laptops. The card above was for a 56K modem, which means that it was likely used to download Backstreet Boys songs using Napster.
“What did they call those things before they shortened the name to ‘PC Card’?” someone behind me asked.
“PCMCIA,” someone else replied. “Can’t remember what that was short for.”
I have a great memory for trivia, and even I couldn’t remember. I confessed: “I only remember the joke that it was short for ‘People Can’t Memorize Computer Industry Acronyms’.”
If you’re a regular reader of this blog, you’ve probably seen (or at least heard about) the demo of GPT-4o’s voice assistant mode featuring a voice named “Sky” providing vivacious — even flirty — assistance:
When I saw it, my first thought was “Wow, that’s a lot like Scarlett Johansson’s portrayal of the AI in Her,” and that seemed to be a lot of other people’s first thoughts.
There’s also Altman’s single-word post on X/Twitter, which he posted on the day of GPT-4o’s premiere on Monday, May 13th:
And now, we find out that Scarlett Johansson — the “Her” herself — issued a statement on Monday, May 20th saying that she was approached by Sam Altman to be the voice of this version of GPT, and that she turned down the offer.
Here’s the text of the statement:
Last September, I received an offer from Sam Altman, who wanted to hire me to voice the current ChatGPT 4.0 system. He told me that he felt that by my voicing the system, I could bridge the gap between tech companies and creatives and help consumers to feel comfortable with the seismic shift concerning humans and A.I. He said he felt that my voice would be comforting to people. After much consideration and for personal reasons, I declined the offer. Nine months later, my friends, family and the general public all noted how much the newest system named ‘Sky’ sounded like me.
When I heard the released demo, I was shocked, angered and in disbelief that Mr. Altman would pursue a voice that sounded so eerily similar to mine that my closest friends and news outlets could not tell the difference. Mr. Altman even insinuated that the similarity was intentional, tweeting a single word, ‘her’ — a reference to the film in which I voiced a chat system, Samantha, who forms an intimate relationship with a human.
Two days before the ChatGPT 4.0 demo was released, Mr. Altman contacted my agent, asking me to reconsider. Before we could connect, the system was out there. As a result of their actions, I was forced to hire legal counsel, who wrote two letters to Mr. Altman and OpenAI, setting out what they had done and asking them to detail the exact process by which they created the ‘Sky’ voice. Consequently, OpenAI reluctantly agreed to take down the ‘Sky’ voice.
In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity. I look forward to resolution in the form of transparency and the passage of appropriate legislation to help ensure that individual rights are protected.
“OpenAI’s gonna OpenAI,” as the soon-to-be-common phrase goes, and they’ve been making their trademark obfuscating statements. As Ed Zitron summarizes in an article titled Sam Altman is Full of Shit:
Just so we are abundantly, painfully, earnestly clear here, OpenAI lied to the media multiple times.
Mira Murati, OpenAI’s CTO, lied to Kylie Robison of The Verge when she said that “Sky” wasn’t meant to sound like Scarlett Johansson.
OpenAI lied repeatedly about the reasons and terms under which “Sky” was retired, both by stating that it “believed that AI voices should not deliberately mimic a celebrity’s distinct voice” and — by omission — stating that it had been “in conversations” with representatives to bring Johansson’s voice to ChatGPT, knowing full well that she had declined twice previously and that OpenAI’s legal counsel were actively engaging with Johansson’s.
If you haven’t seen the movie Her, you’re probably wondering where you can find it on a streaming service. Here’s where you can watch it right now if you’re based in the U.S. (where I’m based):
Here it is — the video of my presentation, xz made EZ, which covers the security incident involving the xz Utils compression utility on Unix-y systems, which I gave at BSides Tampa 2024 on April 6th:
The details of the xz vulnerability were made public mere days before the BSides Tampa 2024 cybersecurity conference, and on a whim, I emailed the organizers and asked if I could do a lightning talk on the topic.
They quickly got back to me and let me know that they’d had a last-minute speaker cancellation and gave me a full slot in which to do my presentation.
The moral of the story? It never hurts to ask, and it can lead to opportunities!
What’s this xz thing, anyway?
Let me answer with this slide from my presentation:
xz is short for xz Utils, a compression utility that you’ll find in Unix-y operating systems, including:
Linux distributions
macOS
It’s a favorite of Unix greybeards, who generally use it in combination with tar.
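If you haven’t used xz before, here’s a quick sketch of the typical workflow — both on its own and paired with tar (the filenames here are just examples):

```shell
# Compress a single file; xz replaces it with demo.txt.xz
# (use -k to keep the original alongside the compressed copy)
xz --keep demo.txt

# Decompress it, restoring demo.txt
xz --decompress demo.txt.xz

# The greybeard combo: tar bundles a directory, and the -J flag
# tells tar to run the archive through xz compression
tar -cJf backup.tar.xz some_directory/

# ...and extract it later
tar -xJf backup.tar.xz
```

That `.tar.xz` pairing is exactly why a back door in xz was so dangerous: it sits in the default toolchain of nearly every Linux distribution.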
What happened with xz?
xz was one of those open source projects that had a vulnerability best illustrated by this xkcd comic:
xz was like that project pointed out in the comic, except that the “random person” doing the maintaining was Lasse Collin, a developer based in Finland, who was experiencing burnout. As a result, xz was languishing.
In what appeared to be a stroke of good fortune, a developer who went by the handle of “Jia Tan” on GitHub came to the rescue and started submitting patches to xz.
At about the same time, there were a number of complaints about xz’s lack of apparent maintenance. In hindsight, it looks like a clever two-pronged campaign:
A group of people loudly clamoring for someone else to take the reins of the xz project, and
A friendly developer who swoops in at the right time, making patches to the xz project…
…all while a burned-out Lasse Collin was facing a lot of stress.
On November 30, 2022, Lasse changed the email address for xz bug reports to an alias that redirected to both his email address and Jia Tan’s. At that point, Jia Tan, the apparently helpful developer who appeared at just the right time, was now an official co-maintainer.
Not long after, Lasse released his last version of xz, and soon afterward, Jia Tan, now the sole maintainer of the project, released their own version.
With full control of the project, Jia Tan started making changes — all the while carefully disguising them — that created a “back door” within the xz application.
On any system that had Jia Tan’s tainted version of xz installed, an unauthorized user with the right private key could SSH into that system with root-level access. By becoming the maintainer of a trusted application used by many Linux distributions, Jia Tan managed to plant a vulnerability in what could have been one of the most devastating supply-chain attacks ever.
I originally posted a series of articles on date/time programming in Swift here on Global Nerdy, updated it, and published it on the Auth0 Developer Blog when I worked there.
I just checked to see how it ranked, and at least for me — remember, everyone sees different Google results — the series is still the number one result for “swift dates times” and similar search terms.