Human hand shaking a robot hand

A company wanted creators to promote its service offering AI clones to do job interviews. Now, it’s gone dark online.

When career advice guru Eve Peña was offered a five-figure brand deal to promote a company she had never heard of to her TikTok following, she was intrigued. Then she learned what it claimed to be offering: a service that would create AI clones of people, taking on their full human likenesses, to attend virtual job interviews and generate answers based on the clients’ résumés.

“The first thing I said to that was: ‘This is really unethical. People are going to tell me, too,’” Peña said in a phone interview. “And [the representative] said: ‘Well, I do think it’s unethical, as well … but we’re going to create some talking points that you can deflect hate comments with.’”

Peña is one of three creators who told NBC News that they were approached with the offer — and that they quickly grew wary that it was a scam. Less than two weeks later, the company, StartupHelper, went dark online, with its website’s contents taken down entirely and its TikTok page set to private.

It was a bizarre episode featuring the collision of two distinct dynamics: the murky world of partnerships, in which creators are approached to sponsor relatively unknown companies, and the rise of generative AI technology, which has tremendous potential but little oversight.


“It was such a crazy amount that they were offering that I was like, ‘This has to be some kind of scam or some kind of fraud,’” Peña said of the brand deal, which promised $48,000 over six months, as well as a 10% commission per client recruited to the platform. The company would charge clients $500 down payments along with 10% of their first year’s salaries if they got jobs using the service.

Most brands don’t pitch numbers in emails without wanting to discuss creators’ rates first, Peña said, especially not in such an aggressive manner. So she asked for more clarity in her Zoom call with the representative.

“They said, ‘Oh, it’s a net 30 payment,’ so I would have had to create videos for them for 30 days before I saw any payment,” she said. “And I was like, this seems like you guys are looking for free marketing and then you’re just not going to pay anybody.”

The explosion of generative AI capabilities has opened the door to a variety of uses, from drafting emails to producing whole deepfakes, sparking a rush to figure out ways to capitalize on the technologies.

In the world of human resources, that has meant questions about just how much employers should use AI to comb through applications and whether applicants risk crossing ethical lines with AI-generated résumés and cover letters.

But an AI-generated clone — what StartupHelper describes as “a digital body double that attends job interviews on your behalf” — was over the line to some creators who specialize in career content.

Career content has become its own popular niche on TikTok and other tech platforms, where influencers regularly amass hundreds of thousands of followers. And it’s common for popular TikTok creators to be approached by brands looking to promote their services through sponsored deals. But it’s a largely opaque — and sometimes fraught — market, meaning such deals can fall apart if negotiations reach dead ends or products don’t align with the creators’ values. 


The creators who spoke with NBC News said that after StartupHelper approached them, they were concerned by the services it was offering, as well as the terms of the suggested brand deal. Before the content on the StartupHelper website was removed, it advertised the digital clone and other services, such as auto-applying for jobs and optimizing LinkedIn profiles. 

Peña, whose videos focus on teaching people how to join the corporate world, quickly called out the company on TikTok for being unethical. In her video, she issued a “warning to career tik tok,” sharing details of what she learned about the service after she held a Zoom call with a representative. StartupHelper’s clients, Peña said, are asked to send the company a few videos of themselves speaking so its AI developers and engineers could study their mannerisms and program clones to attend virtual interviews on their behalf.  

“If you see any influencers peddling the ‘What if you never had to be in a job interview again? What if you had a clone?’ just judge them,” she said in the video. “Know that they sold their soul for that one. And please, don’t fall for it.”

After Peña’s video started gaining traction, StartupHelper’s TikTok profile left comments claiming to be the company in question. She said she blocked the account to avoid giving it free promotion. Then, bot accounts began swarming her comments claiming StartupHelper helped them secure jobs. After she used a filter to block out any mention of the company name, its own TikTok page began posting now-deleted videos disparaging her.

StartupHelper didn’t respond to a request for comment. Its startuphelper.com email addresses were no longer active Tuesday. Its website is now just a landing page with a message that states: “We would be back with a better and ethical product that champions the rights of the working class in a fast changing AI driven world. Thanks for all your feedback.” 

The saga is the latest manifestation of the ethical and security concerns surrounding the unregulated use of AI technology, and it demonstrates how creators often struggle to balance personal morals with the need to make money from content creation.

Ever since OpenAI introduced ChatGPT late last year, technologists have anticipated an explosion in the number of startups looking to use generative AI — artificial intelligence systems capable of creating humanlike content, including print, photos and video — for all manner of business and consumer services.

Some companies have already pushed ahead with technologies that offer ways for people to make AI versions of themselves. Aphid, a fintech company that creates AI workers to handle multiple online tasks at once, envisions a future in which people make digital clones that can work in their place.

Such companies remain largely unregulated, though there has been plenty of discussion about how to write rules concerning the development of AI.

Peña and the two other creators who spoke with NBC News said that when they first got StartupHelper’s pitch, they were eager to learn more. But the initial email sent to them, a copy of which NBC News reviewed, was vague, and it made no mention of an AI-driven cloning service, describing itself only as “a job placement company looking to change how people get jobs and earn more in the corporate space.”


“We envision a collaborative partnership where you can be our brand mascot and bring to life our company by association,” a representative had written in the outreach emails.

Farah Sharghi, a creator who gives career advice on TikTok, said the terms of the deal caused her to ghost the company after she initially asked for more information. After she read through its content plan later on, she said, she discovered what she said were a host of ethical concerns that reaffirmed her decision not to accept.

The 10-page content plan the company sent provides scripts for creators to use, including specific responses to critical comments, such as: “Isn’t this illegal? Like what if they find out that you did this to get the job?” It’s a concern that highlights the lack of federal regulation around the use of AI technologies.

To that, the document urges creators to say: “I think it is funny how it is always deemed illegal when it is an individual not a company that uses a shortcut to bypass traditional systems and not the other way round. … We are about to step into a post labor economics, your only priority should be getting the job that pays the most without any of the hassle.”

The document also encourages creators to follow scripts that openly tout “cheat[ing] your way to getting that job” and urges followers “to block all the career advice people,” reasoning that “they don’t care about you, they simply care about turning you to sheep people who would bend over backwards for companies that don’t care about you.”

It doesn’t address questions about data privacy and whether the company plans to use clients’ likenesses for other purposes. Sharghi said that she believes AI tools can help job seekers in more ethical ways but that such a cloning service immediately set off alarm bells.

“I’ve spent three years on my social platform, and I would never want to risk my own reputation or my brand reputation just for some money. Like, it doesn’t sit well with me,” she said. “What they look to be doing is just blatant fraud, and it does not pass the smell test for me at all.”

The creators also said the money StartupHelper offered seemed too good to be true.

But Gabrielle Judge, the TikTok creator credited with coining the term “lazy girl job,” said the payment actually amounted to very little once you looked more closely at the request. It asked for one post a day over six months, in addition to placing StartupHelper’s link and a short promotional sentence in creators’ bios.

“At first you’re like, ‘Oh, my God, that’s excellent money,’ but then when you look at the actual work that you had to do, it’s not worth it,” she said. 

In addition, Peña said she was told the company’s CEO is based in Dubai. But no business license is registered under the company’s name in the United Arab Emirates.


Angela Yang is a culture and trends reporter for NBC News.