
This is consistent with how bot-driven influence campaigns typically operate: they are not concerned with preaching to the converted.
HIGHLY SYNCHRONISED POSTING OF COMMENTS
The inauthentic accounts posted most heavily between 10am and 3pm on April 26, and their activity was found to be highly synchronised.
Numerous fake accounts frequently commented on the same post shortly after it was published.
For instance, on April 26, a single 10-minute window from roughly 2.35pm to 2.45pm saw 90 different fake accounts post comments.
On April 25, the day the Government's statement was released, a burst of 157 fake comments was posted between 10pm and 11pm.
This suggests the bot activity was timed to coincide with the local audience's active hours and to shape public opinion quickly, seeding narratives in the morning and reinforcing them later in the day.
Analysis of the timing of the comments revealed coordinated burst patterns: the bot network could mobilise at least 10 different accounts at once within a 10-minute window.
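To illustrate the kind of timing analysis described above, here is a minimal sketch that buckets comments into 10-minute windows and flags bursts. It assumes a hypothetical CSV of comment records with account_id, post_id, and timestamp columns; the actual dataset and tooling used in the analysis are not specified in this article.

```python
import pandas as pd

# Hypothetical input: one row per comment, with columns
# account_id, post_id, timestamp (the real dataset is not public).
comments = pd.read_csv("comments.csv", parse_dates=["timestamp"])

# Bucket comments into 10-minute windows.
comments["window"] = comments["timestamp"].dt.floor("10min")

# Count how many distinct accounts commented in each window.
bursts = (
    comments.groupby("window")["account_id"]
    .nunique()
    .reset_index(name="distinct_accounts")
)

# Flag windows where at least 10 different accounts posted at once,
# the threshold mentioned in the analysis above.
suspicious = bursts[bursts["distinct_accounts"] >= 10]
print(suspicious.sort_values("distinct_accounts", ascending=False).head(10))
```

A burst such as the 90 accounts posting between 2.35pm and 2.45pm on April 26 would stand out clearly in output like this.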
SUPERSPREADERS AND TOP BOT
The top bot identified in this campaign was Facebook user "Angeline Tan", who posted 71 times over roughly four and a half hours, between 8.19pm on April 26 and 12.54am on April 27.
The account's 71 comments were posted on two CNA posts from April 26: one a photo post of Mr. Wong's remarks at a press conference on the foreign interference, and the other a video of Mr. Singh's comments to the media on the same subject.
Several other inauthentic accounts were also identified as superspreaders: they commented on both of these posts, and their Facebook profiles showed connections to a large number of suspicious accounts that posted similar messages, a pattern known as "co-commenting".
This could indicate that they are key contributors to these bot networks, or that the automated accounts share a similar operating template.
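To illustrate the "co-commenting" pattern, here is a rough sketch using the same hypothetical comments table as above. It links accounts that commented on the same posts and ranks them by how many distinct co-commenters they share, one simple proxy for surfacing superspreader candidates; this is not necessarily the method used in the study.

```python
from collections import defaultdict
from itertools import combinations

import pandas as pd

# Same hypothetical comments table as before.
comments = pd.read_csv("comments.csv")

# Map each post to the set of accounts that commented on it.
post_to_accounts = comments.groupby("post_id")["account_id"].apply(set)

# Two accounts are "co-commenters" if they commented on the same post.
co_commenters = defaultdict(set)
for accounts in post_to_accounts:
    for a, b in combinations(accounts, 2):
        co_commenters[a].add(b)
        co_commenters[b].add(a)

# Rank accounts by the number of distinct accounts they co-comment with;
# unusually well-connected accounts are superspreader candidates.
ranking = sorted(co_commenters.items(), key=lambda kv: len(kv[1]), reverse=True)
for account, partners in ranking[:10]:
    print(account, len(partners))
```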