Five adverts for the Chinese shopping app Temu have been banned in the UK for their sexualised nature, with one found to have been irresponsible in how it depicted a young girl to sell bikinis.
The company has been warned against presenting under-18s in a sexual way in future or portraying adults as “stereotypical sexual objects” after several ads featured “disembodied images of the women wearing tight and revealing clothing”.
The fast-growing tech business was reprimanded by the UK’s advertising watchdog after one of its ads used a model who appeared to be between eight and 11 years old, posed with her hand on her hip in a way that was found to be “quite adult for a girl of her age”.
The picture appeared alongside ads for household products including a facial roller, balloon ties and a jockstrap which, because they lacked labels, “appeared to be items sexual in nature”, according to the Advertising Standards Authority.
…
The ads were shown alongside those for household objects that “could have been interpreted as sexual in nature”, according to the ASA. The facial roller and balloon ties were “phallic”, and the foot massager “could also have been understood in the same way”.
A jockstrap was “augmented in the crotch, emphasising the outline of genitalia” while some cycling underwear had pink padding at the back and “appeared as underwear with the bottom cut out”. A further ad, shown in a puzzle app, featured images of leopard-print underwear with the back removed and a woman wearing a short black skirt and tights.
The ASA found that the ads were “likely to cause widespread offence” as they appeared in media where adult-themed or sexual products were “unlikely to be anticipated”.
All this, and potentially sinister practices on the app itself.
Emperor@feddit.uk 1 year ago
It’s a relief to hear this. I’d genuinely started to wonder if it was just me over-interpreting the ad images, like the Rorschach test joke: “you think I’m sexually frustrated? You’re the one sending me the mucky pictures!”
Another article in the Guardian suggests they are doing this to go viral or to catch your eye (I have myself shared a screenshot of an ad on WhatsApp, asking if other people saw the same things).
The fact that they’d deliberately use sexualised images of women and children as part of this strategy is pretty grim. 😕