
Algorithms don’t care: how AI worsens the double burden for Indonesia’s female gig workers

Artificial intelligence is often celebrated as the future of work. It is efficient, innovative and neutral. Yet, for many women in Indonesia’s gig economy, AI feels like a source of mounting pressure.

In my recent research on female gig workers in Indonesia, I examine what I call AI colonialism. This term describes how colonial influence persists today through technology and digital systems that maintain control.

This concept captures how powerful actors use AI – often based in the Global North – to exploit workers in the Global South. Much like historical colonialism, this digital iteration relies on the extraction of data, labour and resources to cement unequal power relations.

In Indonesia, AI-driven platforms such as ride-hailing and e-commerce services draw on informal labour while pushing risks and responsibilities onto workers. Women pay the highest price because algorithms fail to recognise the realities of care work, safety concerns and social norms.

AI and the gendered restructuring of work

Indonesia’s labour market has long been defined by informality. Millions work without formal contracts or social protections. Tech companies like Gojek, Grab, Maxim and Shopee didn’t formalise this workforce – they only digitised it.

Drivers are classified as partners rather than employees. This means no minimum wage, no sick pay and no maternity leave. Income is dictated entirely by completed tasks and algorithmic ratings.

For women, this structure collides with the so-called “double burden”: responsibility for both paid work and unpaid care.

Lia, a 33-year-old food delivery rider, wakes before sunrise to cook and get her children ready for school. It is only after she has cleared her domestic duties that she finally logs into the app.

“The system doesn’t know I have children,” she told me. “It only knows whether I am online.”

Platform algorithms reward constant, uninterrupted availability. Incentive schemes demand a specific number of trips within narrow time windows – a high bar for those with domestic ties.

If Lia logs off to pick up her children, she risks losing potential bonuses. If she reduces her hours due to menstrual pain or fatigue, her performance metrics drop.

Neoliberal capitalism relies on a massive amount of unpaid “invisible labour”, such as childcare and housework, but refuses to pay for it or provide a safety net for those who do it. Far from correcting this imbalance, AI systems make things worse.

When Cinthia, a female food delivery rider and a single mother of a one-year-old, fell ill and turned off her app for several days, she noticed fewer job offers upon returning. “It felt like the system punished me,” she said. “Now I’m afraid to stop working.”

The algorithm does not explicitly discriminate. However, it operates on the assumption of a worker without caregiving constraints – a norm that systematically disadvantages women.

Discrimination behind a ‘neutral’ interface

The digital economy often claims neutrality. But gender bias persists.

Yanti, a 43-year-old ride-hailing driver in Yogyakarta, regularly messages male passengers before pickup: “I am a woman driver. Is that okay?”

Many cancel immediately.

The app records cancellations. It does not record gender bias.

Because Yanti avoids working late at night for safety reasons, she misses out on rush-hour incentives. The system, however, doesn’t account for safety – it simply interprets her absence as lower productivity.

Scholars such as Virginia Eubanks have pointed out that automated systems often mirror and amplify social inequalities rather than eliminate them.

In Indonesia’s platform economy, discrimination isn’t necessarily hard-coded. It is a byproduct of a design logic that favours efficiency over equity.

In India, women drivers also report earning less on average than their male counterparts, partly due to safety-driven choices regarding timing and route selection. The algorithm does not account for risk in its calculations. It only measures raw output.

Safety, surveillance and algorithmic discipline

For women drivers, safety is a constant negotiation.

Around 90% of the women in our focus group discussions chose food delivery because it felt safer than ride-hailing. Even so, harassment persists in delivery work.

Lia shared how a male colleague targeted her with inappropriate comments as they waited for orders. “It’s not only customers,” she said. “Sometimes it’s other drivers.”

During the COVID-19 pandemic, gig workers were labelled “essential”. Yet their incomes dropped by as much as 67% in early 2020. To cover the loss, many worked 13 or more hours per day.

[Image: Indonesian female ride-hailing drivers break their fast together during the month of Ramadan in March 2026. Discrimination from platforms’ algorithms often forces these women to work extended hours. foto pix/Shutterstock]

Platforms maintained their rigid performance metrics throughout the crisis. Drivers forced to stop working due to illness often saw their ratings decline. Health vulnerability was translated directly into an algorithmic penalty.

This reflects labour discipline through digital infrastructure: control shifting from foreman to code.

AI colonialism is more than just foreign ownership. It is about the way extractive logics are woven into everyday digital systems. Workers bear the burden of labour, data, time and risk – yet the platforms hold all the power over algorithmic governance.

Coping, solidarity and everyday resistance

Female gig workers have built dense networks of solidarity through WhatsApp and Telegram groups. They share information about policy changes, warn each other about unsafe customers and exchange strategies for navigating algorithmic shifts.

If an account becomes “gagu/silent” (receiving few orders), experienced drivers “warm it up” by temporarily boosting its activity. They lend money for fuel. They pool resources for vehicle repairs.

When someone faces harassment, others circulate the information quickly to protect fellow drivers. When a member is suspended, they visit the platform office together.

Rather than waiting to be formally acknowledged as employees, these women build protection among themselves. This “solidarity over recognition” emerges from shared vulnerability as mothers, caregivers and workers in male-dominated spaces.

Their mutual aid turns care into a strategy and a form of “everyday resistance” – subtle acts that challenge dominant systems, while reflecting a distinctly feminist ethic of survival through relational solidarity.

Beyond innovation narratives

AI is not colonial by design. But when embedded in platform capitalism within unequal societies, it can reproduce colonial patterns of exploitation and loss of ownership.

If we are serious about building just digital futures, we must move beyond innovation narratives and listen to workers, especially women and vulnerable groups in the Global South.

Their stories are a vital reminder that behind every “efficient” algorithm is a human being navigating the delicate balance of survival, dignity and hope.
