The company announced a policy change that allows the service to delete accounts that violate too many of its rules within a certain period of time. Until now, Instagram only deleted accounts containing "a certain percentage of violating content".
This percentage rule – the company does not disclose the exact percentage – remains in effect, but the new rule also allows Instagram to delete accounts that commit a certain number of violations within a short window of time.
The change comes as the company faces criticism for its inability to block graphic images of a teenager's corpse on its platform. Although Instagram says it used image-blocking technology to prevent the photos from spreading, the platform was not able to catch everything, as my Mashable colleague Morgan Sung reported earlier. Instagram said it has disabled many anonymous accounts responsible for further sharing the images.
In addition, Instagram now says it will send warnings to people whose accounts may be deleted for too many rule violations. The app will alert users whose posts have been removed for breaking the rules and will tell them if deletion of their account is imminent.
Previously, users could have their accounts deleted without warning and without necessarily understanding what they had done wrong. In some cases, users whose accounts were deleted had actually been hacked, and their accounts simply disappeared without notice.
However, there is one common scenario in which these warnings do not apply: accounts disabled for trademark or copyright infringement. That process, a sore point for many accounts that post viral videos, will remain separate, according to an Instagram spokesperson.