What is image-based sexual abuse and what's being done to combat its rise? | CNA Explains
Deepfake porn, upskirt videos and sextortion are some examples of non-consensual harmful online content known as image-based sexual abuse, or IBSA, and cases are rising as new technologies like generative AI improve. To better support victims, it was announced in October that a new government agency in Singapore will help those affected get the damaging content removed quickly. How worrying is the situation? And how effective are current laws? We asked experts for their take.

00:00 Introduction
02:26 What is image-based sexual abuse or IBSA?
02:55 How worrying is the situation?
08:33 How can we better help victims?
13:49 How effective are current laws?
17:49 Where to get help if you're a victim of image-based sexual abuse

#SGCreatorsForImpact #YTCreatorsForImpact

This video was independently created and is owned by Mediacorp as part of the Singapore YouTube Creators for Impact program. The views, ideas and opinions expressed therein are Mediacorp's own, and are not endorsed by, or representative of, YouTube or Google in any way.

More CNA Explains episodes: https://www.youtube.com/playlist?list=PLbnMTcZEga8RZkFnvrVbOlyleVMcz1-vR

Follow us:
CNA: https://cna.asia
CNA Lifestyle: http://www.cnalifestyle.com
Facebook: https://www.facebook.com/channelnewsasia
Instagram: https://www.instagram.com/channelnewsasia
Twitter: https://www.twitter.com/channelnewsasia
TikTok: https://www.tiktok.com/@channelnewsasia