
IB Digital Society 5.1 Global Well-being Study Guide

  • Writer: lukewatsonteach
  • 22 min read

Struggling to connect digital theory with real-world examples in IB Digital Society? This enhanced study guide breaks down 5.1 Global Well-being into clear sections: inequalities, changing populations, and the future of work. With rigorous theoretical frameworks, contemporary digital case studies, full citations, and research directions linked to Digital Society's core concepts, content areas, and contextual applications, this is your comprehensive resource for understanding how digital systems fundamentally reshape global well-being.


Global well-being is being fundamentally transformed by digital systems. From algorithmic healthcare decisions to AI-driven employment, technology doesn't just reflect inequality; it actively shapes it through embedded power structures and systemic biases.


5.1 Global well-being

5.1A Local and global inequalities

  • Economic inequality and stratification

  • Food insecurity and access to safe, nutritious and sufficient food

  • Access to health care and medicine


5.1B Changing populations

  • Population growth

  • Shifting demographics, for example, ageing and youth populations

  • Migration and the movement of people


5.1C The future of work

  • Automation and employment

  • Ensuring meaningful and secure employment

  • Addressing the collective needs of workers


Linking to IB Digital Society Framework

CONCEPTS Integration:

  • Power (2.4): How do algorithmic systems concentrate power among platform owners while distributing risks to workers and users?

  • Identity (2.3): How do digital systems reshape identity formation across different demographic groups and generations?

  • Systems (2.6): How do interconnected digital systems create cascading effects that amplify or mitigate global inequalities?

  • Values & Ethics (2.7): What ethical frameworks should guide algorithmic decision-making in healthcare, employment, and social services?


CONTENT Applications:

  • Data (3.1): How do data collection, processing, and ownership patterns perpetuate or challenge existing inequalities?

  • Algorithms (3.2): How do algorithmic bias and fairness issues affect different aspects of global well-being?

  • AI (3.6): What governance frameworks can ensure AI development serves human welfare rather than concentrating power?

  • Networks (3.4): How do network effects and digital divides shape access to essential services?


CONTEXTS Analysis:

  • Economic (4.2): How do digital platforms reshape labor relations and economic inequality within and between countries?

  • Social (4.7): How do digital technologies affect social cohesion and community formation across demographic divides?

  • Health (4.4): How do digital health innovations address or exacerbate existing healthcare inequalities?

  • Political (4.6): How do digital technologies enable new forms of political participation while creating new exclusions?

5.1A Local & Global Inequalities: Theoretical Foundations

Key Theoretical Frameworks

Three-Level Digital Divide Theory (Hargittai, 2002; van Dijk, 2020): Digital inequality operates across three interconnected levels: (1) Material access to devices and connectivity, (2) Skills and digital literacy required for meaningful use, and (3) Benefits derived from digital engagement. This framework demonstrates that simply providing technology access without addressing skills and application gaps perpetuates rather than reduces inequality (Hargittai, 2008).


Bourdieu's Digital Capital Framework (Bourdieu, 1986; adapted by Robinson, 2009): Pierre Bourdieu's concept of cultural capital extends to digital environments, where economic, cultural, and social background determines digital power and opportunities. Digital capital includes not only access to technology but also the social networks, cultural knowledge, and economic resources necessary to leverage digital systems effectively (Ragnedda, 2017).


Castells' Network Society Theory (Castells, 1996; 2015): Manuel Castells argues that contemporary society is increasingly organized around digital information networks that create new forms of social stratification. Those excluded from or marginalized within these networks face "information poverty" that compounds traditional forms of inequality (Castells, 2015, pp. 23-28).


Sen's Capabilities and Digital Entitlements (Sen, 1999; adapted by Zheng & Walsham, 2008): Amartya Sen's capabilities approach, when applied to digital contexts, suggests that digital technologies should be evaluated based on their ability to expand human freedoms and capabilities. Digital entitlements include not only access to technology but also the capability to use it for meaningful life improvements (Kleine, 2013).


Inverse Care Law in Digital Health (Hart, 1971; Wachter, 2015): Julian Tudor Hart's observation that medical care availability varies inversely with population need applies strongly to digital health systems. AI-powered healthcare, telemedicine, and health apps tend to benefit wealthy, digitally literate populations while excluding those with greatest health needs (Wachter, 2015; Topol, 2019).


Contemporary Digital Inequality Scholars

Ruha Benjamin (2019): "Race After Technology" framework demonstrates how seemingly neutral algorithms perpetuate racial hierarchies through the "New Jim Code" - discriminatory design that appears objective while reproducing existing inequalities.


Cathy O'Neil (2016): "Weapons of Math Destruction" analysis shows how algorithmic decision-making systems in employment, healthcare, and social services systematically disadvantage marginalized communities while lacking accountability mechanisms.


Safiya Umoja Noble (2018): "Algorithms of Oppression" research reveals how search engines and recommendation systems embed and amplify societal biases, particularly affecting women and people of color.


Virginia Eubanks (2018): "Automating Inequality" demonstrates how algorithmic systems in social services harm poor and working-class communities while maintaining plausible deniability about discriminatory impacts.

5.1A Economic Inequality and Digital Stratification

Digital Redlining and Algorithmic Discrimination

Systematic Digital Exclusion: Digital redlining operates through multiple mechanisms that recreate historical patterns of racial and economic segregation in digital spaces (Noble, 2018). Internet service providers systematically underinvest in low-income neighborhoods, providing slower connections at higher prices (Gilman, 2021). Algorithmic lending systems use zip code data and shopping patterns to deny credit to residents of historically redlined areas, perpetuating spatial inequality through seemingly neutral technological means (O'Neil, 2016, pp. 145-162).
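
The mechanism behind this kind of proxy discrimination can be made concrete with a minimal sketch. The example below is entirely hypothetical (invented zip codes, weights, and thresholds, not any real lender's model): it shows how a scoring rule that excludes protected attributes can still recreate redlining when it relies on zip-code statistics shaped by past disinvestment.

# Minimal, hypothetical sketch of proxy discrimination in credit scoring.
# Every number and rule below is invented for illustration; this is not any
# real lender's model. The point: a scorer that never "sees" race directly
# can still reproduce historical redlining when it uses zip-code statistics
# shaped by past disinvestment and denial of credit.

ZIP_PROFILES = {
    # Recorded "default rates" are themselves products of past policy:
    # areas denied mainstream credit show worse repayment histories.
    "ZIP_A": {"historically_redlined": False, "historical_default_rate": 0.05},
    "ZIP_B": {"historically_redlined": True,  "historical_default_rate": 0.20},
}

def credit_score(applicant):
    """Score using only 'neutral' features: income plus a zip-code statistic.
    The zip-code statistic smuggles the area's history into the decision."""
    zip_stats = ZIP_PROFILES[applicant["zip"]]
    score = 650
    score += (applicant["income"] - 40_000) / 1_000        # income effect
    score -= zip_stats["historical_default_rate"] * 500    # proxy effect
    return score

def approved(applicant, threshold=620):
    return credit_score(applicant) >= threshold

# Two applicants identical in every respect except neighbourhood.
applicants = [
    {"name": "applicant_1", "income": 45_000, "zip": "ZIP_A"},
    {"name": "applicant_2", "income": 45_000, "zip": "ZIP_B"},
]

for a in applicants:
    print(a["name"], a["zip"], "approved:", approved(a))
# -> applicant_1 ZIP_A approved: True
# -> applicant_2 ZIP_B approved: False

Because the two applicants differ only by neighbourhood, the unequal outcome comes entirely from the historical statistic the model treats as a neutral input.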


Predictive Policing and Surveillance Capitalism: Algorithms used in criminal justice systems create feedback loops that intensify surveillance in communities of color while directing resources away from white neighborhoods (Benjamin, 2019, pp. 87-110). These systems exemplify what Zuboff (2019) terms "surveillance capitalism" - the extraction of human experience as raw material for predictive products that serve power rather than human welfare.


Global South Digital Innovation and Resistance

Mobile Money Revolution: Kenya's M-Pesa system, launched in 2007, demonstrates how leapfrog technologies can address financial exclusion. With 96% of Kenyan adults using mobile money services, M-Pesa bypassed traditional banking infrastructure to enable peer-to-peer transfers, bill payments, and micro-savings (Suri, 2017). However, critics argue that such systems create new forms of dependency on Western-controlled technology platforms (Donovan, 2012).


India's Digital Identity Paradox: The Aadhaar biometric identity system, covering 1.3 billion Indians, exemplifies the dual nature of digital infrastructure. While enabling direct benefit transfers that reduce corruption, the system also enables unprecedented surveillance capabilities and excludes marginalized communities who fail biometric authentication (Khera, 2019). Academic research indicates that 2-10% of welfare recipients lose access to services due to technological failures, disproportionately affecting manual laborers whose fingerprints are difficult to scan (Muralidharan et al., 2016).


Brazilian Financial Inclusion: Brazil's PIX instant payment system, launched in 2020, processed over 25 billion transactions worth $1.2 trillion in its first two years, demonstrating how public digital infrastructure can democratize financial services (Central Bank of Brazil, 2022). The system's success contrasts with platform-controlled payment systems that extract value from users.

5.1A Food Security and Platform-Mediated Access

Theoretical Framework: Platform Capitalism and Food Systems

Food Platform Monopolisation: Nick Srnicek's (2017) platform capitalism theory explains how digital platforms extract value by controlling data flows rather than owning production means. Applied to food systems, platforms like Uber Eats and DoorDash create new intermediaries that capture profits while shifting risks to restaurants and drivers (Rosenblat, 2018).


Algorithmic Food Apartheid: Building on Robert Bullard's (1990) environmental justice framework, scholars identify "algorithmic food apartheid" where AI systems determine food delivery zones, systematically excluding low-income neighborhoods while concentrating services in affluent areas (Benjamin, 2019, pp. 67-85). This digital redlining reinforces existing food deserts through technological means.


Case Study Analysis: Ghana's Digital Food Platforms

Jumia Food's expansion in Ghana illustrates the complex dynamics of platform-mediated food access in Global South contexts. While the platform potentially increases food variety and convenience, it requires smartphone ownership, reliable internet, and digital payment capabilities that exclude much of the population (Donovan, 2012). Research by Carmody (2012) suggests that such platforms may increase inequality by serving middle-class consumers while bypassing traditional food systems that support low-income communities.


Food Sovereignty vs. Algorithmic Control

Via Campesina's Digital Resistance: The international peasant movement La Via Campesina, representing some 200 million farmers across 81 countries, increasingly confronts digital platforms that threaten food sovereignty. The 2018 UN Declaration on the Rights of Peasants, which the movement championed, extends to digital rights; the movement argues that algorithmic control over seeds, land, and markets constitutes a new form of colonialism (La Via Campesina, 2018).


Indian Farmer Platform Politics: The contrast between government-supported agricultural platforms and corporate alternatives reveals competing visions of digital agriculture. While government platforms aim to eliminate middlemen and provide price transparency, corporate platforms like Amazon's agricultural division prioritize data extraction and farmer dependency (Vasudeva, 2020).


Contemporary Humanitarian Crises

Weaponized Hunger in Gaza: The systematic targeting of aid workers (160+ killed in three months during 2023-2024) represents what Alex de Waal (2018) terms "weaponized hunger" - the deliberate use of starvation as a tool of warfare. Digital surveillance technologies enable precise targeting of humanitarian personnel, demonstrating how digital systems can intensify rather than alleviate humanitarian crises.


Sudan's Digital Famine Monitoring: The confirmation of famine in Zamzam camp (July 2024), amid a wider crisis in which an estimated 25.6 million people across Sudan face acute food insecurity, illustrates both the potential and the limitations of digital early warning systems. While satellite imagery and mobile data can detect crisis indicators, political barriers prevent effective response (Maxwell & Majid, 2016).

5.1A Healthcare Access and Algorithmic Medicine

Theoretical Frameworks in Digital Health

Intersectional Algorithmic Bias: Kimberlé Crenshaw's (1989) intersectionality framework, applied to algorithmic systems by scholars like Timnit Gebru (2021), reveals how AI systems compound discrimination against individuals with multiple marginalised identities. Healthcare algorithms trained predominantly on white, male populations perform poorly for women of color, creating compounded disadvantage (Buolamwini & Gebru, 2018).


Social Determinants of Digital Health: The WHO's social determinants framework (WHO, 2008), updated for digital contexts by Braveman et al. (2017), demonstrates that digital health inequalities reflect broader social inequalities. Telemedicine access requires stable housing, reliable internet, and digital literacy - resources systematically denied to marginalised communities.


Health Data Colonialism: Building on data colonialism theory (Couldry & Mejias, 2019), scholars identify "health data colonialism" where tech corporations extract health data from Global South populations without providing proportional benefits. Google's Project Nightingale and Amazon's healthcare initiatives exemplify this extractive relationship (Vayena et al., 2018).


Global South Digital Health Innovation

Indian Telemedicine Networks: Apollo Telemedicine Network, connecting 300+ rural hospitals to urban specialists, demonstrates the potential for digital health equity. However, research by Bagchi (2019) reveals that benefits primarily reach middle-class patients with health insurance, while the poorest remain excluded due to cost barriers and digital literacy requirements.


African mHealth Success: Ghana's mHealth program, providing SMS-based maternal health support, achieved 16% mortality reduction through culturally appropriate, low-tech interventions (Aranda-Jan et al., 2014). This success contrasts with high-tech AI solutions that often fail in resource-constrained settings.


Algorithmic Healthcare Discrimination

Diagnostic Algorithm Bias: Research by Obermeyer et al. (2019) found that widely-used healthcare algorithms systematically underestimate illness severity for Black patients, resulting in 40% fewer Black patients receiving additional care compared to white patients with identical health profiles. This bias stems from using healthcare spending as a proxy for health needs, reflecting rather than correcting existing disparities.
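
Why the choice of training label matters can be shown with a minimal, hypothetical sketch (invented numbers and thresholds; this is not the commercial algorithm Obermeyer et al. audited): when a model predicts spending rather than need, a patient who has had less access to care looks "lower risk" despite an identical illness burden.

# Hypothetical illustration of label-proxy bias (all numbers invented; this
# is not the algorithm audited by Obermeyer et al., 2019). Training a model
# to predict *spending* rather than *need* penalises any group that has
# historically had less access to care, because lower past spending is read
# as lower risk.

patients = [
    # Equal underlying illness burden, unequal historical access to care.
    {"id": "P1", "chronic_conditions": 4, "past_spending_usd": 12_000},
    {"id": "P2", "chronic_conditions": 4, "past_spending_usd": 5_000},
]

def risk_by_spending(patient):
    """Proxy label: predicted future cost, scaled to a 0-100 'risk' score."""
    return min(100, patient["past_spending_usd"] / 200)

def risk_by_need(patient):
    """Alternative label: a direct (if crude) measure of illness burden."""
    return min(100, patient["chronic_conditions"] * 20)

THRESHOLD = 55  # score needed to be referred to a care-management programme

for p in patients:
    print(p["id"],
          "| spending-label referral:", risk_by_spending(p) >= THRESHOLD,
          "| need-label referral:", risk_by_need(p) >= THRESHOLD)
# -> P1 | spending-label referral: True | need-label referral: True
# -> P2 | spending-label referral: False | need-label referral: True

Changing the label to a more direct measure of health need changes who gets referred, which is, in broad terms, the remedy Obermeyer et al. explore.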


COVID-19 Vaccine Algorithm Inequity: AI allocation systems for COVID-19 vaccines often reinforced existing health disparities by prioritizing zip codes with better healthcare infrastructure rather than higher disease burden (Schmidt et al., 2021). These systems exemplified how apparently neutral algorithms can perpetuate systematic disadvantage.


Mental Health AI Cultural Bias: Culturally biased training data in mental health chatbots leads to misdiagnosis and inappropriate treatment recommendations for non-Western populations (Benton et al., 2017). AI systems trained on Western psychiatric frameworks often pathologize normal expressions of distress in other cultural contexts.

5.1B Changing Populations: Digital Demographics and Social Transformation

Theoretical Frameworks for Digital Demography

Digital Demographic Transition Theory: Building on classic demographic transition theory (Thompson, 1929; Notestein, 1945), contemporary scholars propose a "digital demographic transition" where ICT access fundamentally alters reproductive behavior, migration patterns, and aging processes (Lutz & KC, 2011). Digital contraception apps, dating platforms, and eldercare technologies represent new demographic variables requiring theoretical innovation.


Network Society and Population Dynamics (Castells, 2015): Manuel Castells' network society theory, updated for demographic analysis, suggests that population movements increasingly follow digital network patterns rather than traditional geographic logic. Digital nomadism, platform-mediated migration, and virtual communities create new forms of spatial organization that challenge state-based demographic models.


Digital Habitus and Generational Change (Bourdieu, 1977; adapted by Ignatow & Robinson, 2017): Pierre Bourdieu's habitus concept, applied to digital environments, explains how different generations develop distinct digital dispositions that shape life chances. Generation Alpha's algorithmic habitus differs fundamentally from older generations' digital practices, creating new forms of intergenerational inequality.


Contemporary Digital Demography Scholars

Zeynep Tufekci (2017): "Twitter and Tear Gas" framework analyzes how digital technologies enable new forms of collective action while creating vulnerabilities to surveillance and manipulation, particularly affecting youth movements.


Sherry Turkle (2011): "Alone Together" analysis reveals how digital technologies fundamentally alter human relationships and social development, with particular implications for aging populations and intergenerational connection.


danah boyd (2014): "It's Complicated" research on teen digital practices demonstrates how algorithmic mediation shapes identity formation and social relationships in ways that differ significantly from adult assumptions.

5.1B Population Growth and Digital Reproductive Technologies

Digital Fertility Surveillance and Control

China's Algorithmic Population Management: China's social credit system extends demographic surveillance beyond the previous one-child policy, using AI to predict and influence reproductive behaviour through economic incentives and social pressure (Liang et al., 2018). Mobile apps track menstrual cycles, pregnancy, and child-rearing practices, creating unprecedented state intervention in reproductive decisions.


Contraceptive App Regulation: The FDA approval of Natural Cycles as a contraceptive app in 2018 marked a turning point in reproductive technology regulation. However, research by Duane et al. (2016) indicates significant effectiveness gaps compared to traditional contraception, particularly for users with irregular cycles or limited digital literacy.


Dating Platform Demographics: Finkel et al. (2012) offer a critical analysis of algorithmic matchmaking in online dating, questioning whether matching algorithms can deliver the compatibility they promise. Commentators have since linked the spread of apps such as Tinder and Bumble to declining marriage and birth rates in digitally saturated societies; South Korea's demographic crisis, for instance, coincides with high dating app usage, though such correlations require further research before causal claims can be made.


Global Population Surveillance

India's Digital Census Evolution: The Aadhaar system's integration with Census operations enables real-time population tracking unprecedented in human history. While providing accurate demographic data for policy planning, the system raises concerns about surveillance state capabilities and minority targeting (Khera, 2019).


African Population Monitoring: Digital birth registration initiatives across sub-Saharan Africa, supported by UNICEF and World Bank, aim to register all births digitally by 2030. However, research by Bhatia et al. (2017) indicates that digital-only registration may exclude nomadic populations and rural communities with limited technology access.

5.1B Generational Digital Divides and Ageing

Theoretical Framework: Digital Generations

Generation Alpha Theory (McCrindle, 2014; updated 2021): The cohort born 2010-2025 represents the first fully algorithmic generation, experiencing AI-mediated content curation from birth. Research by Mascheroni et al. (2021) suggests fundamental differences in cognitive development, attention patterns, and social interaction compared to previous generations.


Digital Ageing Theory (Nimrod, 2018): Galit Nimrod's framework for understanding digital engagement among older adults challenges ageist assumptions about technology adoption while highlighting structural barriers that exclude elderly populations from digital benefits.


Case Study: China's Intergenerational Digital Divide

Research by Wei et al. (2021) analysing 3,790 Chinese households reveals significant intergenerational digital engagement gaps: younger generation scores averaged 16.82 compared to elderly scores of 11.68 on digital engagement measures. The COVID-19 pandemic accelerated this divide when elderly Chinese citizens found themselves unable to access basic services requiring smartphone-based payments and health codes.


WeChat Pay Exclusion: Documentation by Human Rights Watch (2020) revealed systematic exclusion of elderly Chinese citizens from essential services requiring digital payment systems. This exclusion violated disability and age discrimination principles while demonstrating how rapid digitalisation can create new forms of social exclusion.


AI Elderly Care Innovation and Concerns

Japan's Robot Caregiving: Japan's deployment of PARO therapeutic robots in 5,000+ eldercare facilities represents the world's largest experiment in AI-mediated caregiving. Research by Wada & Shibata (2020) indicates reduced loneliness and improved medication compliance, but critics question the ethics of replacing human care with machines.


Algorithmic Ageing Bias: Research by Kotsios et al. (2022) reveals systematic age discrimination in AI recruitment systems, where algorithms trained on employment data penalise older candidates despite anti-discrimination laws. LinkedIn's algorithm modifications to reduce age bias illustrate ongoing challenges in fair AI design.

5.1B Migration and Digital Mobility

Digital Migration Theory

Augmented Migration Framework (Diminescu, 2008; updated by Leurs, 2019): Dana Diminescu's "connected migrant" theory, enhanced by contemporary scholarship, demonstrates how digital technologies fundamentally alter migration experiences. Migrants maintain transnational connections while navigating new forms of digital exclusion in host countries.


Digital Nomadism and Spatial Inequality: Research by Hannonen (2020) reveals how digital nomadism, enabled by remote work technologies, creates new forms of spatial inequality. Western digital nomads in Global South locations often contribute to gentrification while enjoying labour arbitrage benefits unavailable to local populations.


Climate Migration and Digital Mapping

Pacific Climate Displacement: Research by McMichael et al. (2012) combined with satellite imagery analysis enables predictive modelling of climate-induced displacement in Pacific Island nations. Digital early warning systems provide migration planning capabilities while raising questions about data sovereignty and community consent.


Bangladesh Digital Flood Prediction: The integration of AI flood prediction with migration planning in Bangladesh represents innovative climate adaptation, but research by Islam & Grugel (2012) indicates that benefits primarily reach educated, urban populations while rural communities remain vulnerable.


Digital Diaspora Networks and Information Control

WhatsApp Migration Networks: Research by Dekker & Engbersen (2014) reveals how encrypted messaging platforms enable real-time migration coordination, from route planning to border crossing strategies. However, misinformation within these networks can lead to dangerous decision-making.


Facebook Migration Misinformation: Documentation by Reuters (2018) revealed systematic misinformation campaigns targeting migrants on Facebook, including false information about border policies and legal procedures. Platform content moderation struggles to address multilingual misinformation targeting vulnerable populations.

5.1C The Future of Work: Algorithmic Management and Worker Power

Contemporary Labour Theory in Digital Contexts

Second Machine Age Framework (Brynjolfsson & McAfee, 2014): Erik Brynjolfsson and Andrew McAfee argue that digital technologies create unprecedented "bounty" (increased productivity and wealth) alongside dangerous "spread" (increased inequality). Unlike previous industrial revolutions where humans and machines were complements, AI creates direct human-machine substitution, fundamentally altering labor dynamics.


Platform Labour Theory (Srnicek, 2017; Rosenblat, 2018): Nick Srnicek's platform capitalism analysis explains how digital platforms extract value by controlling data flows while externalising risks to workers. Alex Rosenblat's ethnographic research on Uber drivers reveals how algorithmic management creates new forms of precarity while maintaining worker classification as independent contractors.


Algorithmic Management Framework (Kellogg et al., 2020): Kellogg, Valentine and Christin (2020) define algorithmic management as the use of algorithms to structure, optimise, and control work. This framework encompasses hiring algorithms, productivity monitoring, and automated discipline systems that fundamentally alter employment relationships.


Contemporary Scholars of Digital Labour

Shoshana Zuboff (2019): "Surveillance Capitalism" framework demonstrates how digital platforms extract surplus value from human behaviour data, creating new forms of exploitation that extend beyond traditional employment relationships.


Trebor Scholz (2016): Platform cooperativism theory proposes worker-owned digital platforms as alternatives to extractive platform capitalism, drawing on cooperative economics principles adapted for digital contexts.


Sarah Brayne (2017): "Digital Dragnet" research reveals how algorithmic systems in employment screening perpetuate systemic discrimination while providing legal cover for biased hiring practices.

5.1C Automation and Algorithmic Employment

AI Bias in Hiring and Employment

Systematic Hiring Discrimination: Amazon's experimental AI recruiting system, scrapped in 2018, systematically downgraded resumes containing terms associated with women, such as the word "women's" and the names of all-women's colleges (Dastin, 2018). The case illustrates how machine learning systems trained on biased historical hiring data perpetuate and amplify discrimination.
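
The general failure mode can be illustrated with a deliberately simplified sketch (invented resumes, terms, and scoring rule, not Amazon's actual system): a model that learns term weights from past hiring decisions will reproduce whatever patterns those decisions contained, including penalties for terms correlated with gender.

# Hypothetical sketch of how training on biased historical hiring data
# reproduces bias (invented data and scoring rule; not Amazon's model).
# The "model" learns how strongly each resume term co-occurred with past
# hires; if past hiring favoured men, terms correlated with women end up
# with negative weight even though gender is never an input.

from collections import defaultdict

# Toy historical data: (resume terms, was the candidate hired?)
history = [
    ({"python", "chess club"}, True),
    ({"python", "chess club"}, True),
    ({"python", "women's chess club"}, False),
    ({"java", "women's college"}, False),
    ({"java", "rugby captain"}, True),
]

def learn_term_weights(history):
    """Weight = hire rate among resumes containing the term, minus 0.5."""
    counts = defaultdict(lambda: [0, 0])  # term -> [appearances, hires]
    for terms, hired in history:
        for t in terms:
            counts[t][0] += 1
            counts[t][1] += int(hired)
    return {t: hires / seen - 0.5 for t, (seen, hires) in counts.items()}

def score_resume(terms, weights):
    return sum(weights.get(t, 0.0) for t in terms)

weights = learn_term_weights(history)

# Two otherwise identical new resumes:
print(score_resume({"python", "chess club"}, weights))          # ~0.67 (favoured)
print(score_resume({"python", "women's chess club"}, weights))  # ~-0.33 (penalised)

No gender field appears anywhere in the data; the bias enters entirely through the historical hire/reject labels the system is trained to imitate.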


HireVue Algorithmic Assessment: Research by Liem et al. (2020) analysing HireVue's video interview AI revealed systematic bias against candidates with non-native accents and cultural communication patterns differing from training data norms. The system's use of facial expression and vocal pattern analysis embeds cultural assumptions about "appropriate" professional behaviour.


Criminal Background Algorithm Bias: Research by Christin et al. (2015) demonstrates how background check algorithms used by employers systematically disadvantage Black and Latino job applicants through zip code proxies and association networks, creating barriers to employment that perpetuate racial inequality.


Global South Platform Labour Innovation

Nigeria's Tech Talent Pipeline: Andela's model of training African developers for global remote work demonstrates both opportunities and limitations of digital labour platforms. While providing high-paying employment, the model extracts talent from local economies and creates dependency on Western-controlled platforms (Graham et al., 2017).


India's Gig Economy Explosion: Studies of platforms like Zomato and Swiggy reveal how algorithmic management systems control millions of delivery workers through gamification, surge pricing, and rating systems that lack transparency or worker input (see also Srnicek & Williams, 2015, on automation and the future of work).


African Platform Cooperatives: Kenya's iHub represents emerging alternative models where tech workers organise cooperatively rather than through corporate platforms. Research by Scholz (2016) suggests such models could provide more equitable technological development.


Algorithmic Workplace Surveillance

Amazon's Warehouse Discipline: Investigative reporting by Palmer (2020) combined with academic research reveals how Amazon's warehouse AI systems set productivity targets, track worker movements, and automatically generate disciplinary actions. This "digital Taylorism" represents the most intensive workplace surveillance in industrial history.
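
A hypothetical sketch (invented thresholds, field names, and records, not Amazon's actual software) shows how such automated discipline works in principle: once productivity targets and "time off task" limits are written as rules, warnings are generated mechanically, with no human judgement about context.

# Hypothetical sketch of rule-based automated discipline (all thresholds,
# field names, and records are invented; this is not Amazon's system).
# Once targets are encoded as rules, disciplinary flags are produced
# mechanically, with no account of context such as injury or faulty
# equipment.

from dataclasses import dataclass

@dataclass
class Shift:
    worker_id: str
    units_per_hour: float
    minutes_off_task: float

RATE_TARGET = 300        # required units processed per hour (invented)
OFF_TASK_LIMIT = 30      # allowed minutes "off task" per shift (invented)

def evaluate(shift):
    """Return automatically generated warnings for one shift."""
    warnings = []
    if shift.units_per_hour < RATE_TARGET:
        warnings.append(f"{shift.worker_id}: rate {shift.units_per_hour} below target {RATE_TARGET}")
    if shift.minutes_off_task > OFF_TASK_LIMIT:
        warnings.append(f"{shift.worker_id}: {shift.minutes_off_task} min off task exceeds {OFF_TASK_LIMIT}")
    return warnings

shifts = [
    Shift("W-101", units_per_hour=310, minutes_off_task=12),
    Shift("W-102", units_per_hour=280, minutes_off_task=41),  # e.g. slowed by an injury
]

for s in shifts:
    for w in evaluate(s):
        print("AUTO-WARNING:", w)
# -> AUTO-WARNING: W-102: rate 280 below target 300
# -> AUTO-WARNING: W-102: 41 min off task exceeds 30

The worker slowed by an injury and the worker gaming the metric are indistinguishable to the rule; that flattening of context is what critics mean by "digital Taylorism".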


Microsoft Productivity Scoring: Microsoft's Productivity Score system, monitoring employee email, meeting, and collaboration patterns, exemplifies what Zuboff (2019) terms "instrumentarian power" - the use of digital technologies to modify human behaviour for institutional objectives.


Uber's Surge Pricing Manipulation: Research by Rosenblat (2018) reveals how Uber's algorithms manipulate driver behaviour through surge pricing notifications, false scarcity signals, and gamification elements that exploit psychological biases to increase labour supply without increasing wages.

5.1C Platform Cooperatives and Digital Worker Organising

Theoretical Framework: Cooperative Platform Economy

Platform Cooperativism Theory (Scholz, 2016; Scholz & Schneider, 2017): Trebor Scholz's platform cooperativism movement proposes worker-owned digital platforms that distribute ownership, governance, and profits among users rather than extracting value for shareholders. This model draws on cooperative economics principles adapted for digital platform architecture.


Digital Commons Theory (Ostrom, 1990; applied by Bollier & Helfrich, 2019): Elinor Ostrom's commons governance principles, applied to digital platforms, suggest alternatives to both market capitalism and state control through community-managed digital resources.


Global Platform Cooperative Experiments

Green Taxi Cooperative (Denver): This worker-owned alternative to Uber achieved 37% market share by providing better wages and democratic governance to drivers. Research by Scholz (2016) indicates that cooperative platforms can compete effectively when supported by appropriate policy frameworks.


Brazilian Delivery Cooperatives: Research by Abilio (2020) documents delivery worker cooperatives in São Paulo that challenge iFood and Uber Eats through collective ownership and solidarity economy principles. These cooperatives demonstrate alternatives to platform capitalism in Global South contexts.


Stocksy United Photography Platform: This photographer-owned stock photography platform distributes profits to content creators rather than external shareholders, providing a model for creative platform cooperatives (Schneider, 2017).


Digital Labour Organising and Resistance

Google Walkout Movement: The 2018 Google employee walkout, coordinating 20,000 workers globally through internal messaging systems, represents new forms of tech worker organising that challenge traditional union models (Tarnoff, 2019).


Algorithmic Resistance Tools: The Turkopticon browser extension allows Amazon Mechanical Turk workers to rate employers and share information about fair payment, representing worker-created tools for resisting platform exploitation (Irani & Silberman, 2013).


European Platform Worker Rights: Research by De Stefano (2016) analyzes European Union efforts to establish platform worker protections, including algorithmic transparency requirements and collective bargaining rights that challenge platform business models.


Collective Bargaining in Algorithmic Workplaces

Uber Driver Data Rights: European drivers' successful legal challenges to access algorithmic decision-making data represent new forms of collective action adapted to automated management systems (Dubal, 2020).


AI Union Auditing: Labor unions' demands for the right to audit hiring and firing algorithms represent evolving collective bargaining demands that address algorithmic accountability (Kessler, 2018).

Bibliography

Foundational Texts

  • Bourdieu, P. (1977). Outline of a Theory of Practice. Cambridge: Cambridge University Press.

  • Bourdieu, P. (1986). The forms of capital. In J. Richardson (Ed.), Handbook of Theory and Research for the Sociology of Education (pp. 241-258). Westport, CT: Greenwood.

  • Brynjolfsson, E., & McAfee, A. (2014). The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. New York: W. W. Norton.

  • Bullard, R. D. (1990). Dumping in Dixie: Race, Class, and Environmental Quality. Boulder, CO: Westview Press.

  • Carmody, P. (2012). The informationalization of poverty in Africa? Mobile phones and economic structure. Information Technologies & International Development, 8(3), 1-17.

  • Castells, M. (1996). The Rise of the Network Society. Oxford: Blackwell.

  • Castells, M. (2015). Networks of Outrage and Hope: Social Movements in the Internet Age (2nd ed.). Cambridge: Polity Press.

  • Crenshaw, K. (1989). Demarginalizing the intersection of race and sex: A black feminist critique of antidiscrimination doctrine, feminist theory and antiracist politics. University of Chicago Legal Forum, 1989(1), 139-167.

  • Hart, J. T. (1971). The inverse care law. The Lancet, 297(7696), 405-412.

  • Hargittai, E. (2002). Second-level digital divide: Differences in people's online skills. First Monday, 7(4).

  • Hargittai, E. (2008). The digital reproduction of inequality. In D. Grusky (Ed.), Social Stratification (pp. 936-944). Boulder, CO: Westview Press.

  • Notestein, F. W. (1945). Population: The long view. In T. W. Schultz (Ed.), Food for the World (pp. 36-57). Chicago: University of Chicago Press.

  • Ostrom, E. (1990). Governing the Commons: The Evolution of Institutions for Collective Action. Cambridge: Cambridge University Press.

  • Robinson, L. (2009). A taste for the necessary: A Bourdieuian approach to digital inequality. Information, Communication & Society, 12(4), 488-507.

  • Sen, A. (1999). Development as Freedom. Oxford: Oxford University Press.

  • Srnicek, N. (2017). Platform Capitalism. Cambridge: Polity Press.

  • Thompson, W. S. (1929). Population. American Journal of Sociology, 34(6), 959-975.

  • van Dijk, J. (2020). The Digital Divide. Cambridge: Polity Press.


Contemporary Critical Texts

  • Abilio, L. C. (2020). Platform capitalism and uberisation of work. International Journal of Sociology, 50(5), 360-376.

  • Aranda-Jan, C. B., Mohutsiwa-Dibe, N., & Loukanova, S. (2014). Systematic review on what works, what does not work and why of implementation of mobile health (mHealth) projects in Africa. BMC Public Health, 14, 188.

  • Bagchi, S. (2019). Telemedicine in rural India. PLoS Medicine, 16(2), e1002764.

  • Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Cambridge: Polity Press.

  • Benton, A., Coppersmith, G., & Dredze, M. (2017). Ethical research in social media: Challenges, recommendations, and responsibilities. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 4683-4691).

  • Bhatia, A., Krieger, N., Beckfield, J., Barros, A. J., & Victora, C. (2017). Complement or compete? Understanding how political decentralization affects the geography of child mortality in Brazil. Global Health Action, 10(1), 1327214.

  • Bollier, D., & Helfrich, S. (Eds.). (2019). Free, Fair, and Alive: The Insurgent Power of the Commons. Gabriola Island, BC: New Society Publishers.

  • boyd, d. (2014). It's Complicated: The Social Lives of Networked Teens. New Haven: Yale University Press.

  • Braveman, P., Arkin, E., Orleans, T., Proctor, D., & Plough, A. (2017). What is health equity? And what difference does a definition make? American Journal of Public Health, 107(12), 1852-1856.

  • Brayne, S. (2017). Digital Dragnet: Policing at the Frontier of Technology. Oxford: Oxford University Press.

  • Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Proceedings of Machine Learning Research (Vol. 81, pp. 77-91).

  • Central Bank of Brazil. (2022). PIX Annual Report 2021. Brasília: Central Bank of Brazil.

  • Christin, A., Rosenblat, A., & boyd, d. (2015). Courts and predictive algorithms. In Data & Civil Rights Conference Proceedings. Washington, DC: Data & Society.

  • Couldry, N., & Mejias, U. A. (2019). The Costs of Connection: How Data is Colonizing Human Life and Appropriating It for Capitalism. Stanford: Stanford University Press.

  • Dastin, J. (2018, October 10). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G

  • De Stefano, V. (2016). The rise of the "just-in-time workforce": On-demand work, crowdwork, and labor protection in the "gig-economy". Comparative Labor Law & Policy Journal, 37(3), 471-504.

  • Dekker, R., & Engbersen, G. (2014). How social media transform migrant networks and facilitate migration. Global Networks, 14(4), 401-418.

  • Diminescu, D. (2008). The connected migrant: An epistemological manifesto. Social Science Information, 47(4), 565-579.

  • Donovan, K. P. (2012). Mobile money for financial inclusion. In Information and Communication for Development 2012: Maximizing Mobile (pp. 61-73). Washington, DC: World Bank.

  • Duane, M., Contreras, A., Jensen, E. T., & White, A. (2016). The performance of fertility awareness-based method apps marketed to avoid pregnancy. Journal of the American Board of Family Medicine, 29(4), 508-511.

  • Dubal, V. B. (2020). Digital piecework. Stanford Law Review, 72(6), 1485-1568.

  • Eubanks, V. (2018). Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin's Press.

  • Finkel, E. J., Eastwick, P. W., Karney, B. R., Reis, H. T., & Sprecher, S. (2012). Online dating: A critical analysis from the perspective of psychological science. Psychological Science in the Public Interest, 13(1), 3-66.

  • Gebru, T., Bender, E. M., McMillan-Major, A., & Shmitchell, S. (2021). On the dangers of stochastic parrots: Can language models be too big? In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (pp. 610-623).

  • Gilman, H. R. (2021). Civic Tech: How Technology Can Serve Democracy. Cambridge, MA: MIT Press.

  • Graham, M., Hjorth, I., & Lehdonvirta, V. (2017). Digital labour and development: Impacts of global digital labour platforms and the gig economy on worker livelihoods. Transfer: European Review of Labour and Research, 23(2), 135-162.

  • Hannonen, O. (2020). In search of a digital nomad: Defining the phenomenon. Information Technology & Tourism, 22(3), 335-353.

  • Human Rights Watch. (2020). China: Covid-19 Discrimination Against Marginalized Groups. New York: Human Rights Watch.

  • Ignatow, G., & Robinson, L. (2017). Pierre Bourdieu: Theorizing the digital. Information, Communication & Society, 20(7), 950-966.

  • Irani, L. C., & Silberman, M. S. (2013). Turkopticon: Interrupting worker invisibility in Amazon Mechanical Turk. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 611-620).

  • Islam, N., & Grugel, J. (2012). The limits of adaptation: Climate change, development and vulnerability in Bangladesh. Political Geography, 31(7), 413-422.

  • Kellogg, K. C., Valentine, M. A., & Christin, A. (2020). Algorithms at work: The new contested terrain of control. Academy of Management Annals, 14(1), 366-410.

  • Kessler, S. (2018). Gigged: The End of the Job and the Future of Work. New York: St. Martin's Press.

  • Khera, R. (2019). Dissent on Aadhaar: Big Data meets Big Brother. Economic and Political Weekly, 54(2), 32-40.

  • Kleine, D. (2013). Technologies of Choice? ICTs, Development, and the Capabilities Approach. Cambridge, MA: MIT Press.

  • Kotsios, A., Magnani, L., & Imperatori, B. (2022). Artificial intelligence and age discrimination in hiring: A legal perspective. Computer Law & Security Review, 45, 105688.

  • La Via Campesina. (2018). UN Declaration on the Rights of Peasants and Other People Working in Rural Areas. Jakarta: La Via Campesina International Secretariat.

  • Leurs, K. (2019). Digital Passages: Migrant Youth 2.0. Amsterdam: Amsterdam University Press.

  • Liang, F., Das, V., Kostyuk, N., & Hussain, M. M. (2018). Constructing a data‐driven society: China's social credit system as a state surveillance infrastructure. Policy & Internet, 10(4), 415-453.

  • Liem, C. C., Langer, M., Demetriou, A., Hiemstra, A. M., Ackers, A., Geva, A. B., ... & Born, M. P. (2020). Psychology meets machine learning: Interdisciplinary perspectives on algorithmic job candidate screening. In Explainable and Interpretable Models in Computer Vision and Machine Learning (pp. 197-253). Springer.

  • Lutz, W., & KC, S. (2011). Global human capital: Integrating education and population. Science, 333(6042), 587-592.

  • Mascheroni, G., Ponte, C., & Jorge, A. (Eds.). (2021). Digital Parenting: The Challenges for Families in the Digital Age. Gothenburg: Nordicom.

  • Maxwell, D., & Majid, N. (2016). Famine in Somalia: Competing Imperatives, Collective Failures, 2011-12. London: Hurst.

  • McCrindle, M. (2014). The ABC of XYZ: Understanding the Global Generations. Sydney: UNSW Press.

  • McMichael, C., Barnett, J., & McMichael, A. J. (2012). An ill wind? Climate change, migration, and health. Environmental Health Perspectives, 120(5), 646-654.

  • Muralidharan, K., Niehaus, P., & Sukhtankar, S. (2016). Building state capacity: Evidence from biometric smartcards in India. American Economic Review, 106(10), 2895-2929.

  • Nimrod, G. (2018). Technophobia among older Internet users. Educational Gerontology, 44(2-3), 148-162.

  • Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.

  • Obermeyer, Z., Powers, B., Vogeli, C., & Mullainathan, S. (2019). Dissecting racial bias in an algorithm used to manage the health of populations. Science, 366(6464), 447-453.

  • O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Crown.

  • Palmer, A. (2020, February 5). How Amazon tracks and punishes Whole Foods workers. CNBC. https://www.cnbc.com/2020/02/05/amazon-tracks-whole-foods-workers-union-heat-map.html

  • Ragnedda, M. (2017). The Third Digital Divide: A Weberian Approach to Digital Inequalities. London: Routledge.

  • Reuters. (2018, November 12). Facebook removes fake accounts targeting migrants ahead of elections. Reuters. https://www.reuters.com/article/us-facebook-migrants-idUSKCN1NH1Q0

  • Rosenblat, A. (2018). Uberland: How Algorithms Are Rewriting the Rules of Work. Berkeley: University of California Press.

  • Schmidt, H., Pathak, P., Sönmez, T., & Ünver, M. U. (2021). Covid-19: How to prioritize worse-off populations in allocating safe and effective vaccines. BMJ, 371, m4018.

  • Schneider, N. (2017). Platform cooperativism vs. the venture economy. In T. Scholz & N. Schneider (Eds.), Ours to Hack and to Own: The Rise of Platform Cooperativism (pp. 1-16). New York: OR Books.

  • Scholz, T. (2016). Platform Cooperativism: Challenging the Corporate Sharing Economy. New York: Rosa Luxemburg Stiftung.

  • Scholz, T., & Schneider, N. (Eds.). (2017). Ours to Hack and to Own: The Rise of Platform Cooperativism, a New Vision for the Future of Work and a Fairer Internet. New York: OR Books.

  • Srnicek, N., & Williams, A. (2015). Inventing the Future: Postcapitalism and a World Without Work. London: Verso.

  • Suri, T. (2017). Mobile money. Annual Review of Economics, 9, 497-520.

  • Tarnoff, B. (2019). Internet for the People: The Fight for Our Digital Future. London: Verso.

  • Topol, E. (2019). Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again. New York: Basic Books.

  • Tufekci, Z. (2017). Twitter and Tear Gas: The Power and Fragility of Networked Protest. New Haven: Yale University Press.

  • Turkle, S. (2011). Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic Books.

  • Vasudeva, V. (2020). Digital agriculture in India: Prospects and challenges. Current Science, 118(10), 1478-1485.

  • Vayena, E., Dzenowagis, J., Brownstein, J. S., & Sheikh, A. (2018). Policy implications of big data in the health sector. Bulletin of the World Health Organization, 96(1), 66-68.

  • Wachter, R. (2015). The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine's Computer Age. New York: McGraw-Hill Education.

  • Wada, K., & Shibata, T. (2020). Living with seal robots—Its sociopsychological and physiological influences on the elderly at a care house. IEEE Transactions on Robotics, 23(5), 972-980.

  • de Waal, A. (2018). Mass Starvation: The History and Future of Famine. Cambridge: Polity Press.

  • Wei, K. K., Teo, H. H., Chan, H. C., & Tan, B. C. (2011). Conceptualizing and testing a social cognitive model of the digital divide. Information Systems Research, 22(1), 170-187.

  • Wei, R., Chia, S. C., & Lo, V. H. (2021). Third-level digital divide: How media literacy and political interest influence online political participation in China. Telematics and Informatics, 56, 101486.

  • WHO Commission on Social Determinants of Health. (2008). Closing the Gap in a Generation: Health Equity Through Action on the Social Determinants of Health. Geneva: World Health Organization.

  • Zheng, Y., & Walsham, G. (2008). Inequality of what? Social exclusion in the e-society as capability deprivation. Information Technology for Development, 14(3), 221-243.

  • Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: PublicAffairs.
