At the end of February, Meta surprised everyone with the launch of LLaMA, a family of large language models intended primarily as a tool for AI research (for example, to study the problems these systems sometimes exhibit), which was made available free of charge, upon request, to universities, non-profit organizations and research laboratories.
But in less than a week the model was leaked onto the Internet by a 4chan user who shared it via torrent; a link to the download was later even added as a pull request on Meta’s official project repository on GitHub. With the model publicly available, and without the safeguards and limitations built into popular AI systems such as ChatGPT or Bing’s chatbot, it was only a matter of time before less “scientific” uses emerged. And so it has.
According to Vice, LLaMA is already being used to improve people’s chances on the dating app Tinder. Among other things, the model automatically generates profile bios and replies in chat conversations, all with the aim of helping the user “flirt” with someone (or pretend to).
“The plan is to develop a series of prompts for the AI to ‘help’ you establish a conversation. I think this may encourage a lot of people to start chatting,” Alfredo Ortega, a computer engineer specializing in information security, told Vice. So that others could try out this application of Meta’s AI, Ortega made a bot available for users of his Discord server to interact with.
On the Discord server, the engineer described his goal and how users can help: “The goal of this channel is to conduct ongoing research to discover the smallest LLM model capable of arranging a date with a human, male or female. Use AI as a proxy: Give the AI the conversations with your partner, and type back the response to the dating app chat. Please anonymize any information before posting it.”

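For readers curious what “using the AI as a proxy” might look like in practice, here is a minimal, purely illustrative sketch that assumes a locally downloaded LLaMA-style checkpoint loaded through the Hugging Face transformers library. The model path, prompt wording and generation settings are assumptions made for the example; this is not the actual bot running on Ortega’s Discord server.

```python
# Illustrative sketch of the "AI as proxy" idea: paste in a (anonymized)
# conversation, get back a suggested reply to type into the dating app chat.
# Assumes a LLaMA-compatible checkpoint is available locally; the path,
# prompt and sampling parameters below are hypothetical.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "path/to/local-llama-7b"  # hypothetical local checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(MODEL_PATH)

def suggest_reply(conversation: str) -> str:
    """Ask the model for a short, friendly reply to the ongoing conversation."""
    prompt = (
        "The following is a dating app conversation. "
        "Write a short, friendly reply from 'Me'.\n\n"
        f"{conversation}\nMe:"
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(
        **inputs, max_new_tokens=60, do_sample=True, temperature=0.8
    )
    # Return only the newly generated text, dropping the prompt tokens.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True).strip()

# Example: feed in an anonymized exchange and print the suggested response.
print(suggest_reply(
    "Them: I like your dog!\nMe: Thanks, he's a rescue.\nThem: What breed is he?"
))
```

The channel’s whole premise, in other words, is little more than prompting a local model with the conversation so far and copying its completion back into the chat.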
So far, however, the bot has proven to be about as basic as the users relying on it. While it can generate Tinder profile bios and draft chat replies, the texts it produces are so simplistic that any adult might suspect a child posing as one. “Hey, I like your dog, do you want to grab a drink sometime?” and “There’s a new bar that just opened on Main Street, have you been yet?” are two examples of what the AI is capable of when it comes to flirting on Tinder.
The question now is: is this use of AI ethical? Beyond helping users flirt, it is worth remembering that behind most Tinder conversations there is a person with feelings (unless it is a bot or a fake account). If the first step someone takes to flirt is to have an AI speak for them, perhaps that person should reconsider dating altogether and take up activities that do not involve deceiving another person.