Hot Topics

Hot Topics suggested by our health practitioner participants. 

This page is a work-in-progress and was last updated in March 2025.

What role does commercial/private technology play in sexual and reproductive health?

Commercial and private companies – including tech start-ups – increasingly play a role in sexual and reproductive health.  

Commercial digital sexual health technologies include private online STI testing services, dating apps, menstruation apps and sex toys. Commercial software platforms (e.g. Microsoft) are also used to collect and store patient data.

Almost all of these technologies collect data every time they are used.  

Data can be actively entered by users – for example, menstrual app users upload the length of a cycle and other information (including symptoms experienced) for self-tracking purposes. But most apps also collect data ‘passively’ – such as user location or the duration of app use.
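As a rough illustration of the difference (the field names and values below are invented for illustration, not taken from any particular app), a single app record might combine both kinds of data:

```python
# Hypothetical illustration only: field names and values are invented,
# not taken from any real app.
from datetime import date

# Data the user actively enters for self-tracking
active_entry = {
    "cycle_start": date(2025, 3, 1),
    "cycle_length_days": 29,
    "symptoms": ["cramps", "headache"],
}

# Data an app could collect 'passively', without explicit user input
passive_entry = {
    "approx_location": (-37.81, 144.96),  # latitude/longitude
    "session_duration_seconds": 142,      # how long the app was open
    "device_model": "Pixel 8",
}

# Both kinds of data can end up in the same record held by the provider
record = {**active_entry, **passive_entry}
print(record)
```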

This data can be used for a range of purposes – including advertising, research, or policing/surveillance. These uses may be stated in the initial ‘Terms of Use and Terms of Service’, or they may be altered according to the commercial interests of the companies that own the technologies. Health data may also be aggregated with data collected from other sources – for example social media and retail platforms. 

Both individual sexual and reproductive health practitioners and organisations should consider the ethical, privacy and security implications of engaging with commercial actors who provide SRH care and information.

Currently such platforms and devices are unregulated in Australia, and the Privacy Act is under review. Additionally, while both tech companies and health researchers have suggested that digital health platforms will increase health care access for populations who are excluded from existing health care services, recent research has shown that this is often not the case. 

This does not mean that tech/health partnerships are a no-go. However, our research suggests that they should be approached with caution, and with due diligence around potential user privacy and security risks.

Why do young adults seek sexual and reproductive health content online?

Our participants actively sought information about the accessibility of local healthcare providers. They also sought accounts of their peers’ ‘lived experience’ that helped them better understand specific health conditions (such as endometriosis) or procedures (such as IUD insertion). 

Participants also described how their sexual, gender and reproductive health and wellbeing was supported opportunistically through participation in collective spaces and affinity-based communities on social media.  

Health was not always the organising focus of these spaces – for example, sites mentioned included punk Facebook groups and gamer Discord chats. This meant that participants were not always actively and purposefully seeking out health information. Instead, they encountered (and shared) support and advice through everyday participation in trusted (but often anonymous) communities with shared interests, desires and experiences. 

Do young people want health providers to ‘give advice’ or ‘bust myths’ in online communities?

Our participants suggested that it is not always necessary or ethical for healthcare professionals to be present on these digital platforms or in these communities themselves.

However, it IS desirable for healthcare professionals to have a contextual awareness of the different ways young adults might access health information and support – and suspend their judgement about the ‘quality’ of social media content.  

Participants were aware that the information available online was not always accurate and/or could catastrophise sexual health concerns. However, they suggested that healthcare providers could bridge this gap by taking health consumers’ concerns and information-seeking experiences seriously during clinical consultations:

I would be hesitant to say that a lot of young people are coming into clinics having seen one TikTok video… But, I think to reduce that down to a young person is identifying that source of information was social media initially, and assuming that one thirty second clip of somebody catastrophising a bunch of symptoms was all they looked at, I think this is where some of the miscommunication happens. 
Jaqueline (24, cis woman, bisexual) 

 

I think maybe [health practitioners] just being aware of the different ways people are getting the information…I guess they need to be aware so they might be able to say ‘okay is this something you may have seen talked about on social media?’ and just use that as a gateway to investigating rather than staying up to date with the content themselves. 
Alexandra (23, cis woman, straight) 

 

You can read more about participants’ insights (including the platforms and technologies used) in our Stage Two research report.

Digital health outreach on social media – censorship and safety

Sexual and reproductive health organisations use social platforms – including Google’s YouTube, Meta’s Instagram and ByteDance’s TikTok – to share health information, and connect with health consumers and service users.   

This may involve:   

  • Paid content – such as advertising or information campaigns. This is ‘pushed’ to specific audiences, according to contractual agreements with the platform. 
  • ‘Native’ content – including text-based posts, images and short videos. This is shared directly with existing followers or subscribers via organisational accounts; and ‘recommended’ to other users via algorithmic systems, based on their existing preferences. 


The rules that govern these processes are opaque and subject to frequent change. Globally, many organisations and individuals have had paid campaigns censored or refused. Native content is also taken down or ‘shadow-banned’ (hidden from recommendations), due to perceived breaches of platform ‘community standards’.
   

In some cases, health content is misidentified by human moderators as pornography or ‘sexual solicitation’. In others, key words or images are misidentified by AI moderation systems.    

Health organisations are also targeted by politically motivated attacks (such as those orchestrated by anti-woman and anti-LGBTQ+ groups) in organised campaigns.    

These can involve mass reporting or malicious ‘flagging’ of health content as hate speech or pornography – resulting in automated content takedowns and difficult appeal processes.   

While these issues have been reported for several years, rapid political shifts towards ‘anti-woke’ sentiment among major platform owners have increased safety concerns for sexual and reproductive health organisations and health consumers.

Online hostility can exclude marginalised populations (including Indigenous people, migrant and refugee people and people with disabilities) from ‘mainstream’ digital spaces where health content can be easily accessed.  

This is especially harmful and isolating for people whose sexual and reproductive health needs (such as trans-affirming care, abortion or HIV treatments and prevention) are already stigmatised. 

What works? Learning from success in digital sexual & reproductive health promotion

As part of her PhD research at Swinburne, Joanna Williams talked with a number of local and international sexual health organisations about the ways they built their audience (or increased their follower counts) on Instagram.

Organisations described a process of ‘trial and error’, where they collected and analysed a range of publicly available data to help inform their own approach to creating social media content. 

It was not just a matter of crafting the ‘right’ sexual health message to deliver via social media, but also about spending the time to better understand social media platforms and digital cultures.  

This included talking with other sexual health organisations (a practice known as ‘algorithmic gossip’) about the ways they had successfully navigated Instagram’s automated content moderation policies. Successful strategies included using euphemistic language, or deliberately misspelling words (such as ‘seggs’).

Williams’ participants also monitored current social media trends, to better adapt to trends in language use and visual design (or the platform vernacular and vocabulary).  

This might mean embracing trending catchphrases in social media content (such as ‘mindful and demure’) or using popular memes to convey health messaging. For example, Brook UK used photos of Moo Deng to talk about sex positivity.  

Organisations also collected engagement metrics or platform analytics such as likes, comments and/or shares of a post, to understand what content was performing well with their followers.   
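This kind of analysis can be quite simple in practice. The sketch below is a generic illustration of how a post’s engagement might be summarised relative to audience size; the formula is a common industry convention, not one described by Williams’ participants or defined by any platform:

```python
# Hypothetical sketch: summarises a post's engagement as a share of followers.
# The formula (interactions / followers) is a common convention, not a platform metric.

def engagement_rate(likes: int, comments: int, shares: int, followers: int) -> float:
    """Return engagement as a percentage of the account's followers."""
    interactions = likes + comments + shares
    return 100 * interactions / followers

# Example: a post with 240 likes, 35 comments and 18 shares,
# on an account with 12,000 followers
print(f"{engagement_rate(240, 35, 18, 12_000):.1f}%")  # -> 2.4%
```

Comparing a figure like this across posts is one way organisations judge which content is performing well with their followers.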

This work was time-consuming and required specific skills and knowledge. Participants emphasised it was not possible without broader organisational support.  

This meant allocating appropriate resources, and recognition from organisational leadership of the time and skills needed for: 

  • crafting shareable health content 
  • adopting the trial and error required to figure out what works and why 
  • actively moderating comments on social media. 

What’s the next big social platform for digital sexual & reproductive health promotion?

Social media platforms are highly valued as sites for community-building, entertainment and communication. Academic and industry research shows that marginalised people – including LGBTQ+ people and women – actively seek out and share sexual and reproductive health content on social platforms. 

 

As content moderation guidelines change, popular platforms like Facebook and X (formerly Twitter) are increasingly NSFSRH (not safe for sexual and reproductive health). This plays out in unevenly racialised and gendered ways. For example, a recent survey led by the CensHERship group and Intimacy Justice Coalition found that paid content using words like ‘vagina’ was more likely to be suppressed than erectile dysfunction advertising.

As yet, there is no single ‘next big thing’ on the horizon, in terms of English-language social media spaces. This is not necessarily a bad thing. Indeed, it’s probably fair to say that the near-monopoly status of big US-based platforms (and Chinese platform TikTok) has led to a lack of responsiveness to concerns related to the suppression of SRH content. 

 

So, what happens next? Because this is all unfolding in real time, there is not yet a body of literature that academics can point to and say ‘do this, post here’ – we’re in a space of experimentation.

HOWEVER – we can already point to examples of where things might be moving. For example, in April 2025, Brisbane-based Greens MP Stephen Bates posted his party’s health policy announcements on OnlyFans. 

According to a Guardian report, he’s the first Australian politician with an OnlyFans account. 

Obviously this is ‘newsworthy’. But why might it be a timely strategy for communicating policies specifically related to HIV pre- and post-exposure prophylaxis medications?

Bates is quoted as saying: 

“Ending HIV is too important to fly under the radar,” he said. “I campaign on OnlyFans and Grindr because it gets attention. Sometimes you have to make a splash to make people pay attention to the things that matter.” 

And yes, this is good PR in an election campaign.  

But we believe Bates’ campaign strategy is illustrative of the experimental approaches to sexual and reproductive health outreach we will increasingly see in the aftermath of ‘anti-woke’ backlash against health content on X (formerly Twitter) and Meta platforms.

There is obviously a need to document and push back against platform suppression of health content (particularly where content related to abortion rights, trans healthcare and LGBTQ+ health is disproportionately targeted).

But we also need workarounds. Now is the time to develop emergent strategy for digital connection. 

What do we need to know about AI? Should we build a sex-ed chatbot?

A recent technical brief from the WHO examines Artificial Intelligence (AI) specifically in the sexual and reproductive health context.   

The WHO guidance defines AI as ‘the capability of algorithms integrated into systems and tools to learn from data so they can perform automated tasks without explicit programming of every step by a human’.  

Generative AI (or GenAI) is particularly relevant to the health context. GenAI refers to systems or models that generate text, images or videos based on existing datasets. This includes large language models (LLMs) – the ‘chatbots’ and programs like ChatGPT that can respond to prompts entered by users.

In the sexual and reproductive health context, AI has been used to support the delivery of health education via chatbots; to promote screening and diagnosis through analysis of large amounts of health data (such as electronic medical records, medical imaging test results, and clinical notes); and to enable self-monitoring through wearable and mobile devices (e.g. fertility tracking apps).
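To make the chatbot example concrete, here is a minimal sketch of one deliberately conservative, non-generative design: a bot that only answers from a small set of clinician-vetted responses and otherwise refers people to a service. The questions, answers and keyword matching below are invented for illustration; this is not the approach described by the WHO guidance or used by any particular organisation.

```python
# Hypothetical sketch of a keyword-matching education chatbot that only answers
# from a small set of clinician-vetted responses and otherwise refers the user on.
# The questions, answers and referral text are invented for illustration.

VETTED_ANSWERS = {
    ("iud", "insertion"): "IUD insertion is a short clinical procedure; your provider can talk you through pain relief options.",
    ("prep", "hiv"): "PrEP is a medication taken to prevent HIV; a clinician can advise whether it suits you.",
}

REFERRAL = "I don't have vetted information on that. Please contact your local sexual health service."

def answer(question: str) -> str:
    """Return a vetted answer if the question matches known keywords, else a referral."""
    words = question.lower()
    for keywords, response in VETTED_ANSWERS.items():
        if all(k in words for k in keywords):
            return response
    return REFERRAL

print(answer("What happens during IUD insertion?"))
print(answer("Can you diagnose my symptoms?"))  # no match, so the bot refers onwards
```

A design like this trades flexibility for safety: it cannot generate novel answers, which also means it cannot generate novel misinformation – a trade-off worth weighing against the risks outlined below.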

However, the WHO guidance also reminds us of the possible risks attached to the use of AI for SRH. These include the risk of data breaches, the proliferation of misinformation due to the training of AI models on inaccurate or biased data sets, and a lack of context and cultural awareness.   

The WHO guidance highlights that:  

‘the responsible and ethical use of AI in SRHR requires concerted efforts among stakeholders, including policy-makers, commercial actors, funding agencies, developers, health workers and civil society, to mitigate the rising risks’.

To mitigate the risks of AI use within SRH, the WHO guidance recommends:  

  • The implementation of ‘community-led and open-source fact-checking programmes’ to clarify AI-generated recommendations and fight misinformation and targeted disinformation  
  • The establishment of collaborative oversight mechanisms, such as local and international regulatory bodies and community representatives  

 

What does this mean for sexual and reproductive health practice? 

Current GenAI systems (such as ChatGPT) are primarily trained on US-centric data. Significant concerns have been raised by researchers and activists about racist, sexist and transphobic content ‘baked’ into initial training datasets.   

Even where organisations have created their own chatbot data, concerns have been raised that the bots are not sensitive enough to recognise the cultural contexts surrounding questions relating to sexual and reproductive health issues.   

In some cases they have provided incorrect or actively harmful answers to questions. This blog post from Hera Hussain, CEO of trauma-informed gender and tech organisation Chayn, provides an excellent overview of key questions organisations should ask when contemplating the creation of a chatbot.

What is the current Australian digital transformation policy environment? (as of December 2024)

The 2021 National Digital Health Capability Action Plan (CAP) outlines a national policy agenda for digital transformation across both health services and health workforce training and education. The CAP describes high-level actions that aim to ‘equip Australia’s health workforce for a connected, digitally enabled future’. Actions include: the development of frameworks and guidelines; development of education and training opportunities; and regulation and collaboration.

One outcome of the CAP is the Australian Digital Health Capability Framework (ADHCF) released by the Australasian Institute of Digital Health and the Australian Digital Health Agency in late 2023.   

The ADHCF focuses on supporting the education and training of the current health workforce. It is an applied, self-assessment instrument, “intended to act as a practical guide for organisations and individuals on the skills and knowledge required to effectively deliver health care in an increasingly digital world”.   

Both the National Digital Health Capability Action Plan and the Australian Digital Health Capability Framework adopt a “standardised, profession-agnostic approach” for use across diverse health services, and within education and training organisations.

This is one of the key differences between the Digital Health Workforce resources and the Digital and Data Capabilities for sexual and reproductive health models, which explicitly target sexual and reproductive health contexts.  

The Digital Health Hub is a platform hosting all the relevant health workforce content – including an opportunity to assess your digital health capability via an online self-assessment tool.   

The digital transformation policy space is rapidly evolving. You can find more information about the current Australian policy environment here.