TikTok is leading the race right now, and Meta is still trying to catch up and dethrone it. The company has been making constant changes to its platforms, most visibly by rolling out TikTok-like features on Facebook and Instagram Reels. Its latest move? Testing a new feature that lets you own up to five profiles.
The test is rolling out to a limited number of users, and Meta says it will give them the ability to link four additional profiles to their original one. These extra profiles can then be dedicated to different purposes or to interacting with specific groups of Facebook friends, such as personal friends, coworkers, or family. Meta adds that users will be able to switch from one profile to another with a single tap or click.
“To help people tailor their experience based on interests and relationships, we’re testing a way for people to have more than one profile tied to a single Facebook account. Anyone who uses Facebook must continue to follow our rules,” Meta Spokesperson Leonard Lam told The Verge.
Interestingly, the additional profiles reportedly won't have to carry the user's real name. Is that simply to avoid the same name appearing repeatedly across the platform? We don't know. But one thing is clear: all profiles remain subject to Facebook's policies, and a violation on any one profile will affect all the others linked to the original account.
This is where problems could surface down the road. While Twitter is wrestling with its bot issues (which, according to Elon Musk, were one of his main reasons for backing out of the acquisition deal), here is Facebook letting you run up to five profiles. Well, that's not a problem, is it? The profiles will be used by real people, right? After all, Facebook says the sub-profiles will be held to the same policies as any other, so what could go wrong?
Facebook is already known as a breeding ground for trolls running accounts for specific purposes. Troll accounts, for instance, have been used to influence voters during elections, and we've seen boatloads of reports about this issue around the globe over the last couple of years. In response, Facebook would take down pages and flag posts as "misinformation" after some time or after enough reports came in, but how many people had that fake news already reached before Facebook took action? (And it's not as if Facebook is capable of scrubbing all misleading content from its entire platform.)
That said, are Facebook's policies and enforcement adequate and responsive enough to handle the problems that could arise from multi-profile ownership? Probably not. But who cares? User growth matters more to Meta, so, yeah, hand the public another weapon to mess with.