Case study: Taking a deep dive into Digital Ability


As digital devices and services become more embedded in daily routines, the Digital Ability needed to navigate them is crucial to ensuring fair and equitable access to essential services and online opportunities [1].

Just as digital tools constantly evolve, so too do the skills required to use them effectively. The set of online abilities a person needs in order to be digitally included is a moving target [2]. Without digital skills, a person cannot access and use the internet effectively. And as new devices are released, keeping digital skills up to date requires ongoing investment. Digital Ability is thus not a static set of skills, but an ongoing process of development and maintenance [3].

Researchers have used a variety of methods to measure and validate the components – including the basic through to advanced ‘skills’, ‘literacies’, and ‘capabilities’ – that underpin the concept of Digital Ability [4].

To understand the distribution of these skills, and where the gaps are, the ADII measures the level of Digital Ability held by Australians by considering what they do online, and their confidence in undertaking basic tasks through to more advanced activities. We do this by using a tailored version of the Internet Skills Scale (ISS).

The Internet Skills Scale

Developed in 2014 by leading digital inclusion researchers Van Deursen, Helsper and Eynon, the ISS offers a survey instrument for measuring digital skills that has been successfully used around the world. 

Where some approaches to measuring digital skills can be overly simple, the ISS identifies five skills domains, ranging from basic to advanced [5]: operational, informational navigation, social, creative, and mobile. In 2019, a sixth domain, consisting of Internet of Things-specific skills, was added [6].

Our modification of the ISS involved slightly condensing it and converting some questions from negative to positive statements [7]. Reflecting the increased utility and uptake of mobile devices, we chose not to distinguish between general digital skills and mobile-specific skills. And in recognition of the growing prevalence of IoT devices and technologies, and their broader consequences, we refer to ‘Internet of Things’ skills as ‘Automation’ skills.

For the ADII, we use the ISS to generate an index dimension, rather than a scale. A person with the highest Digital Ability score can perform the full range of tasks across each component, while those with the lowest score have basic to no operational skills, reducing their capacity to get online.

Table 1: Digital Ability components

Basic operational: Including downloading and opening files, connecting to the internet, and setting passwords.

Advanced operational: Including saving to the cloud, determining what is safe to download, customising devices and connections, and adjusting privacy settings.

Information navigation: Including searching and navigating, verifying trustworthy information, and managing third party data collection.

Social: Including deciding what to share, how, and with whom, as well as managing and monitoring contacts and communicating with others.

Creative: Including editing, producing, and posting content, as well as a broad understanding of the rules that may apply to these activities.

Automation: Including connecting, operating, and managing smart devices and IoT technologies.
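As a minimal sketch of how component scores might roll up into a single dimension score (assuming each Table 1 component is scored 0–100 and weighted equally; the exact ADII aggregation is not detailed here, and the component values below are illustrative, not ADII data):

```python
# Sketch: averaging equally weighted component scores into a dimension
# score. Component names follow Table 1; the values are hypothetical.
components = {
    "basic_operational": 85.0,
    "advanced_operational": 70.0,
    "information_navigation": 68.0,
    "social": 72.0,
    "creative": 55.0,
    "automation": 40.0,
}

def dimension_score(scores: dict[str, float]) -> float:
    """Mean of component scores, rounded to one decimal place."""
    return round(sum(scores.values()) / len(scores), 1)

print(dimension_score(components))  # 65.0
```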

In this case study, we look at Digital Ability scores outside of the total Index. When a dimension is added to the overall Index, it is equally weighted to derive the overall Index score. This means a person’s Digital Ability score can be higher or lower than their total Index score.
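The equal weighting described above can be sketched as follows; the dimension names and scores here are illustrative assumptions rather than ADII data:

```python
# Sketch: the overall Index as the equally weighted mean of dimension
# scores. Because dimensions are averaged, a person's Digital Ability
# score can sit above or below their total Index score.
# All values below are hypothetical examples.
dimensions = {"Access": 90.0, "Affordability": 60.0, "Digital Ability": 64.4}

overall_index = round(sum(dimensions.values()) / len(dimensions), 1)
print(overall_index)  # 71.5 -- higher than this person's Digital Ability (64.4)
```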

Digital Ability across Australia

In 2021, the national average score for Digital Ability was 64.4 [8], which is a very slight increase (up 0.8) from the 2020 score of 63.6. Almost all Digital Ability components increased at the national level between 2020 and 2021.

Table 2: National Digital Ability scores

                        2020   2021   Gap
Digital Ability         63.6   64.4   +0.8
Operational basic
Operational advanced
Information navigation

These gains, however, have not been evenly shared by everyone in Australia.

Digital Ability declines with age...

Digital Ability scores align closely with age, with young adults aged 18–34 receiving a score of 81.6 — 17.2 points higher than the national average (64.4), and 54.4 points higher than that of Australians aged 75 and over (27.2). This age gap is evident across each Digital Ability component and widens as tasks become more complex.

Table 3: 2021 Digital Ability Age-Gap

                        National   18–34   75+    Age gap
Digital Ability         64.4       81.6    27.2   54.4
Operational basic
Operational advanced
Information navigation

While the gap between the youngest and oldest Australians is by far the most marked, the impact of age on Digital Ability is not confined to the extremes of the age range. There is a significant drop in Digital Ability after the age of 55: 55–64-year-olds score 56.5 on this dimension (7.9 points below the national average), and 65–74-year-olds score only 41.8 (22.6 points below the national). Continued work towards closing the age gap and enhancing the Digital Ability of senior Australians remains critical.
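The age gaps quoted in this section are simple point differences from the national average; using the 2021 scores reported here:

```python
# Point gaps between age-group Digital Ability scores and the 2021
# national average, using figures quoted in this case study.
# A negative gap indicates a score above the national average.
NATIONAL = 64.4
by_age = {"18-34": 81.6, "55-64": 56.5, "65-74": 41.8, "75+": 27.2}

gaps = {group: round(NATIONAL - score, 1) for group, score in by_age.items()}
print(gaps)  # {'18-34': -17.2, '55-64': 7.9, '65-74': 22.6, '75+': 37.2}

# The 54.4-point age gap in Table 3 is the youngest-to-oldest difference.
print(round(by_age["18-34"] - by_age["75+"], 1))  # 54.4
```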

…but improves as education and income levels rise

As with the Index as a whole, Digital Ability scores improve as education and income levels rise.

While those with a bachelor’s degree or above have a Digital Ability score of 74.7 (10.3 points higher than the national score), people who did not complete secondary school record a score of 36.3. This is 28.1 points lower than the national score and 38.4 points lower than those with a bachelor’s degree or higher.

Those in the lowest income quintile (earning under $33,800 per annum) have a Digital Ability score of just 45.8; 18.6 points lower than the national score, and 33.8 points lower than the highest income quintile (79.6).

Australians receiving income support record a Digital Ability score of 52.3 (12.1 points less than the national score, and 17 points less than those who do not receive income support). Australians who are not in the labour force score only 50.6 for Digital Ability – 13.8 points lower than the national score, 6 points less than unemployed Australians (those who had actively looked for work in the past four weeks; 56.6), and 23 points less than those currently employed (73.6).

At the national level, women have slightly lower Digital Ability scores than men, but this pattern is reversed in regional Australia

Gender has a minimal impact on Digital Ability at the national level [9]. Women registered a Digital Ability score of 64.0 in 2021, slightly lower than that recorded for men (65.4). This disparity plays out across each of the Digital Ability components except Social. Here, women receive a score of 64.8, 1.7 points higher than the national average for this component, and 2.9 points higher than the male score of 61.9.

Table 4: 2021 Digital Ability gender gap

                        National   Male   Female   Gender gap
Digital Ability         64.4       65.4   64.0     1.4
Operational basic
Operational advanced
Information navigation

These dynamics play out differently in regional Australia. Here, women score much higher than their male counterparts across all Digital Ability measures. Women in regional Australia have a Digital Ability score of 62.2, compared to 57.5 recorded for men. Notably however, and aligned with the metropolitan-regional gap, these scores remain lower than the national average for Digital Ability (2.2 and 6.9 points lower, respectively), and the metropolitan average of 66.7.

Households with children are more digitally able than households without

All household types with children have higher Digital Ability scores than the national average. Couples with children have a Digital Ability score of 74.0 (9.6 points higher than the national average), one parent families score 69.0 on this dimension (4.6 points higher than the national), and multi-family/group/other households score 65.3 (0.9 points higher than the national).

Digital Ability scores diminish in households without children, with the most significant gap seen for single persons who recorded a Digital Ability score of 52.4 in 2021. This is 12 points lower than the national score, and 21.6 points lower than couple with children households.

This is particularly concerning when considered in the wake of ongoing COVID-19 restrictions. While digital technologies might have provided single householders a critical lifeline to social interaction, their Social skills score of 52.7 – 10.3 points lower than the national average and 19.2 points lower than couple with children households – suggests this might not have been easy or possible for all.

Mobile-only users have a below-average Digital Ability score, but this is improving

Mobile-only users, 9.6% of the Australian population in 2021, register Digital Ability scores that are substantially lower than the national average across all components.

Mobile-only users did, however, record an increase in Digital Ability, up 7.7 points from 2020 to reach 52.9 in 2021.

Table 5: 2021 Digital Ability: Mobile-only users

                        2020   2021   Gap
Digital Ability         45.2   52.9   +7.7
Operational basic
Operational advanced
Information navigation

What can we say about Digital Ability in Australia in 2021?

Maintaining and improving digital skills takes time [10]. These skills also improve with use, which means people who use the internet regularly and in a variety of ways – for example, for work, education, or recreation – are more likely to score higher on this measure. With these skills come the opportunities to benefit from digital technologies. Australians with high Digital Ability scores are therefore better enabled to manage their health, access education and services, participate in cultural activities, organise their finances, follow news and media, and connect with family, friends, and the wider world. For those Australians lagging on this dimension – particularly older Australians, those with lower education and income levels, those living in regional areas, single householders, and mobile-only users – these opportunities may be out of reach. Significant and coordinated efforts to address these Digital Ability gaps will be critical if we are to ensure a more equitable post-COVID-19 digital economy.

References and footnotes

[1] E Hargittai, “Second-Level Digital Divide: Differences in People’s Online Skills,” First Monday 7, no. 4 (2002).

N Selwyn, “Reconsidering Political and Popular Understandings of the Digital Divide,” New Media & Society 6, no. 3 (2004): 341–62.

L D Stanley, “Beyond Access: Psychosocial Barriers to Computer Literacy Special Issue: ICTs and Community Networking,” The Information Society 19, no. 5 (2003): 407–16.

J A G M van Dijk, “Digital Divide Research, Achievements and Shortcomings,” Poetics 34, no. 4-5 (2006): 221–35.

M Warschauer, “Reconceptualising the Digital Divide,” First Monday 7, no. 7 (2002).

[2] A J A M van Deursen, E J Helsper, and R Eynon, “Development and Validation of the Internet Skills Scale (ISS),” Information, Communication & Society 19, no. 6 (2016): 804–823.

[3] P Walton, T Kop, D Spriggs, and B Fitzgerald, “Digital Inclusion: Empowering All Australians,” Australian Journal of Telecommunications and the Digital Economy 1, no. 1 (2013): 1–17.

[4] H Bonfadelli, “The Internet and Knowledge Gaps: A Theoretical and Empirical Investigation,” European Journal of Communication 17, no. 1 (2002): 65–84.

U Bunz, “The Computer-Email-Web (CEW) Fluency Scale – Development and Validation,” International Journal of Human-Computer Interaction 17, no. 4 (2004): 479–506.

U Bunz, C Curry, and W Voon, “Perceived Versus Actual Computer-Email-Web Fluency,” Computers in Human Behavior 23, no. 5 (2007): 2321–2344.

A J A M van Deursen and J A G M van Dijk, Digital Skills: Unlocking the Information Society (New York: Palgrave Macmillan, 2014).

A J A M van Deursen, J A G M van Dijk, and O Peters, “Proposing a Survey Instrument for Measuring Operational, Formal, Information and Strategic Internet Skills,” International Journal of Human-Computer Interaction 28, no. 12 (2012): 827–837.

J van Dijk and K Hacker, “The Digital Divide as a Complex and Dynamic Phenomenon,” The Information Society 19, no. 4 (2003): 315–326.

M S Eastin and R LaRose, “Internet Self-Efficacy and the Psychology of the Digital Divide,” Journal of Computer-Mediated Communication 6, no. 1 (2000).

P Gilster, Digital Literacy (New York: Wiley, 1997).

E Hargittai, “Survey Measures of Web-Oriented Digital Literacy,” Social Science Computer Review 23, no. 3 (2005): 371–379.

E Hargittai and Y P Hsieh, “Succinct Survey Measures of Web-Use Skills,” Social Science Computer Review 30, no. 1 (2012): 95–107.

E J Helsper, “A Corresponding Fields Model for the Links Between Social and Digital Exclusion,” Communication Theory 22, no. 4 (2012): 403–426.

E J Helsper and R Eynon, “Distinct Skill Pathways to Digital Engagement,” European Journal of Communication 28, no. 6 (2013): 696–713.

E Litt, “Measuring Users’ Internet Skills: A Review of Past Assessments and a Look Toward the Future,” New Media & Society 15, no. 4 (2013): 612–630.

D Potosky, “The Internet Knowledge (iKnow) measure,” Computers in Human Behavior 23, no. 6 (2007): 2760–2777.

B H Spitzberg, “Preliminary Development of a Model and Measure of Computer-Mediated Communication (CMC) Competence,” Journal of Computer-Mediated Communication 11, no. 2 (2006): 629–666.

[5] A J A M van Deursen, E J Helsper, and R Eynon, “Development and Validation of the Internet Skills Scale (ISS),” Information, Communication & Society 19, no. 6 (2016): 804–823.

[6] A J A M van Deursen, A van der Zeeuw, P de Boer, G Jansen, and T van Rompay, “Digital Inequalities in the Internet of Things: Differences in Attitudes, Material Access, Skills, and Usage,” Information, Communication & Society 24, no. 2 (2021): 258–276.

[7] A J A M van Deursen, E J Helsper, and R Eynon, Measuring Digital Skills. From Digital Skills to Tangible Outcomes Project Report, 2014.

[8] Please note, total figures may not add up due to rounding to one decimal place.

[9] Please note, the ADII draws on a national sample size ranging from 2,287 to 2,798, which does not provide sufficient non-binary respondents to generate reliable data. The Index therefore does not provide a score for non-binary people.

[10] A Marshall, A Dale, H Babacan, and M Dezuanni, Connectivity and Digital Inclusion in Far North Queensland’s Agricultural Communities: Policy-Focused Report, 2019. Accessed January 7, 2020.

M Elliott, Out of the Maze: Building Digitally Inclusive Communities, 2018. Wellington: Vodafone New Zealand Foundation, Internet NZ, and The Workshop.

Department of Internal Affairs, The Digital Inclusion Blueprint, Te Mahere mō te Whakaurunga Matihiko, 2019. Wellington: Department of Internal Affairs. Accessed September 17, 2021.
