
ChatGPT, will you be my Valentine?

Spoiler alert: it rejected me. But that’s far from the only thing I learned while playing around with the bot the world fell so deeply in love with.

When it was released to an astonished world on November 30th, 2022, ChatGPT became the fastest-growing web app ever, reaching one million users in its first five days and 100 million monthly active users by January of this year. In doing so, the chatbot has captured the public imagination like no other technology and will be remembered as the first big trend of 2023. Needless to say, it’s here to stay far beyond this year, having ushered in a new era for more than “just” chatbots.

Given the capabilities of bots, it’s no surprise that people from all walks of life are wondering if, one day, technologies like ChatGPT (and, indeed, artificial intelligence to the fore) will take over their jobs. But those aren’t the only questions that occupy our minds – we also wonder how the ability of tools to engage in human-like conversations might impact our personal lives.

So are we about to be replaced by low-maintenance, highly eloquent tools? Who better to ask than the chatbot itself? So I did, and its virtual lips said something like “no”. So far, so convincing.

But since today is Valentine’s Day, I couldn’t resist wading into somewhat more precarious territory and asking the bot about its “sentiments” on the future prospects for human-tech relationships. Here’s my slightly irreverent take on this interaction.

Is dating ChatGPT wrong?

Why beat around the bush, right? Let’s “chat with the chatbot”. As it turns out, the bot may indeed be reasonably open to “functional relationships” – but be careful about setting your expectations too high.

Now, what better way to celebrate Valentine’s Day than by asking ChatGPT to be my Valentine, right? What could possibly go wrong? I really thought I was irresistible.

But boy was I wrong! The model’s “ethical standards” seem so high that it doesn’t even pretend to go on a date so I can brag about it!

ChatGPT’s resounding (and somewhat thoughtful) rejection does come with a bit of a caveat. While the bot won’t “fake” a date just so I could brag to my friends, it doesn’t mind “simulating” a date with me. I didn’t blink an eye and jumped right in!

At this point, it’s pretty clear that ChatGPT insists on not having the ability to “like someone”. Instead, it focuses on what it does best – answer questions and provide information in a conversational way.

However, ChatGPT was not only willing to cross its own ethical boundaries as long as I asked it merely to “simulate” a date, but also seemed keenly aware of what it imagined the perfect first date to be.

Should we be scared?

As ChatGPT says, “It’s still uncertain what the future of the AI-human relationship will look like.” And potentially, as AI models are integrated into human-like robots that are taught to display certain behaviors, the consequences may be beyond our imagination. For now.

So, while it is clear that not only our professional but also our personal lives may be affected by AI models, let’s turn the question around: can a technology like AI improve human relationships? ChatGPT seems pretty optimistic about the possibilities:

To test the robustness of ChatGPT’s suggestions, I described two common scenarios and asked the model for advice on navigating them:

But the risks are real

Now, on a more serious note. When ChatGPT took the internet by storm in late 2022, it didn’t take long for the first alarm bells to start ringing, too.

For example, dating scammers (and, it turns out, real online daters, too) wasted no time leveraging such tools – for example, to create convincing fake personas and deliver their “love interest” messages. In essence, such bots are programmed to react quickly to messages from dating app users, impersonating real people and asking for simple information such as usernames, addresses, phone numbers, and email addresses.

These well-trained bots can easily be co-opted to collect enough personal data to launch another successful scam, or even to make an outright request for money in an attempt to turn an easy profit for their creators. At the same time, fake profiles like these can be used to spread malicious links and files.

Here are 10 red flags to help you spot not only fake social media and dating app profiles, but scammers in general:

  • Be wary of profiles that seem too good to be true, especially if the person’s profile photo looks as though it was taken by a professional photographer.
  • Be wary of profiles that don’t have important information, such as a bio or list of interests.
  • Don’t give out personal information, such as your full name or financial details, until you’ve met someone in person.
  • Be wary of people who claim to have a lot of money or a well-paid job, but still ask you to help them financially.
  • Look for people who ask for money, whatever reason they give for needing it.
  • Be wary of anyone claiming to have an emergency and needing you to make a money transfer immediately, or anyone claiming to be a member of the military asking for help with a package they are trying to deliver to you. Both are common tactics used by fraudsters.
  • Be wary of people who seem to be moving too fast in relationships.
  • Be skeptical of people who use language that is too formal or too romantic in their messages.
  • Remember that technology is constantly evolving, and fraudsters are constantly adapting their methods. That’s why it’s important to be alert at all times, even when someone’s profile seems like a perfect match for you.
  • If you realize that you are dealing with a romance scammer, report them to the dating app to stop the scam from harming others.

Oh! I almost forgot – the prize!

Since one of the hardest parts for anyone celebrating Valentine’s Day is figuring out the best way to show your affection to the special someone in your life, I reached out to ChatGPT for advice:

I’ll take that as a yes.

Happy (and scam-free) Valentine’s Day!
