Digital Digest: Selecting an Assessment for Digital Literacy
Types of Assessments
Learners enrolling in adult education programs come with a wide range of digital skills: they may know how to use an Android phone but not an iOS device or a computer; they may know how to access Facebook or WhatsApp in their native language, but not in English; they may know how to use technology to communicate with friends and family, but need support when using it for online learning or for communicating with their doctor or their child's teachers. Educators rely on assessments to understand skill development needs, inform instruction, and measure learning. Meanwhile, states need data to understand the extent of their population's digital skills as they plan human and workforce services, including the use of newly available federal digital inclusion resources such as the Digital Equity Act. With more clarity on existing assessments and their purposes, instructors, program leads, and policymakers can better support learners in developing digital resilience. With this need in mind, the Digital Resilience in the American Workforce (DRAW) landscape scan, slated to come out this summer, explored the range of assessments currently available, how practitioners and other stakeholders use assessments, and opportunities for more strategic use of assessments to advance digital resilience.
Assessments of digital literacy skills come in various forms and serve different purposes: self-assessments and inventories; performance- or competency-based assessments; and formative, summative, or diagnostic assessments. Beyond informing instruction, assessment and associated skills validation (e.g., badges, certificates, micro-credentials, and credentials) can help learners communicate their digital skills to current and future employers, opening opportunities for higher wages and career advancement. Instructors can also use assessments to evaluate their own digital skills. In Massachusetts, for example, teachers assess their skills along a continuum of expertise: whether they can perform a skill for themselves, show others how to perform a task using the skill, and, finally, integrate that skill into instruction. The DRAW landscape scan revealed various assessments linked to different frameworks and standards, each of which can serve a different purpose. This variety creates an opportunity for strategic use of assessment but may also create confusion.
Assessment and associated skills validation can help learners communicate their digital skills to current and future employers, opening opportunities for higher wages and career advancement.
The DRAW landscape scan found that many instructors and other stakeholders, including employers and workforce system partners, lack clarity on which frameworks and assessments to use for which purposes. This echoes prior research on the use of assessments – for example, Digital US notes that "digital skills mean different things to different people. Skill definitions and assessments vary depending on the skills they cover, causing confusion between educators, employers, and learners." Similarly, the Digital Blindspot report from the Rework America Business Network found that limited use of assessments and a lack of alignment in digital skills frameworks across stakeholders hinder the ability to accelerate learning and to identify, share, and scale what works.
With an array of assessments to choose from and confusion about how to use them, many adult education programs rely on the Northstar Digital Literacy Assessment. A program of Literacy Minnesota, Northstar defines, assesses, and teaches the basic skills needed to use computers and the internet in daily life, employment, and higher education. Our scan found it to be the most widely used assessment in adult education: it is a useful diagnostic tool for showing which skills learners need to develop, it is easy to administer, and teachers are comfortable using it. However, like many assessments built for wide-scale use, most of its modules were not designed to measure proficiency in applying skills in real contexts. Additionally, a certain level of English language proficiency and literacy is currently needed to complete the assessment, although Northstar is working to translate it into other languages.
Outside of the classroom, assessments play a role in signaling and certifying digital skills mastery. Given the importance of digital literacy for work and learning, some of the experts interviewed by the DRAW team recommended a comprehensive digital badging system that would make it easier for learners to demonstrate core digital competencies to employers and enable educational institutions to share “transferable” credentials. Northstar already integrates badges and certification, and research indicates that these signaling strategies build learner self-confidence and motivation.
Selecting an Assessment
Assessments help practitioners understand learner skill development needs and inform effective delivery of instruction. These assessments should emphasize learners’ prior knowledge and their ability to transfer that knowledge to new settings. Most adult learners, including those with emerging English or literacy skills, do have digital skills that can be built upon. Building on these skills can accelerate learning, and acknowledging what learners already know can help them feel more confident in filling in any skills gaps.
Our compilation of assessments and related research lists the digital-related assessments that currently exist but does not address their efficacy and viability, since the definitions, standards, and frameworks for digital skills and resilience have not yet been determined. Our scan identified a need for guidance on determining when digital literacy assessments are "a good fit." One valuable resource is the International Telecommunication Union's comprehensive, practical guidebook and analysis of national digital skills assessments, which examines existing work and the advantages and disadvantages of digital skills assessment tools that can be employed as part of a national-level assessment. To further support educators, the DRAW team created a checklist to guide the selection of an assessment based on purpose and context.
Assessments have the potential to be powerful tools for supporting the development of digital resilience. The DRAW scan revealed a need for better understanding of existing assessments and how to use them, as well as for new, asset-based assessments that measure digital resilience. An aligned and strategic approach to assessment would allow educators, program leaders, researchers, and policymakers to tailor instruction to learners' needs, understand their progress, target resources where they are most needed, and signal mastery of skills to employers and other stakeholders.
DRAW is an initiative from Jobs for the Future (JFF), World Education, and Safal Partners that is funded by the U.S. Department of Education’s Office of Career, Technical, and Adult Education under Contract GS10F0094X. The U.S. Department of Education does not endorse any of the assessments described in this blog.