The design of Skills Assessment and Anticipation (SAA) exercises is fundamentally driven by their intended purpose, which in turn dictates the choice of unit of analysis, geographical and sectoral scope, and time horizon. In the context of higher education, these design features are critical for ensuring that academic provision aligns with evolving labour market needs and for reducing the skills mismatches that currently affect a significant share of the EU workforce.
Core Design Features of SAA Exercises
According to the sources, the following features are central to designing an effective SAA exercise:
- Unit of Analysis: Most SAA exercises focus on occupations (13 of the 17 reviewed) because they are relevant to a wide range of policies and data on them are more readily available. However, about half also analyze skills (8 of 17), and a minority examine qualifications (5 of 17). While occupational analysis is common, it risks missing shifts in skill requirements within a single job. Exercises specifically targeting higher education often look directly at qualifications to inform prospective students and institutional provision.
- Geographical and Sectoral Coverage: High-performing systems often provide a national overview while offering granular information at the sectoral or regional level. Some exercises, like Estonia’s OSKA, use a bottom-up approach, starting with sectoral analysis that is later aggregated. Most exercises aim to cover all education levels to minimize costs, though some focus exclusively on higher education or vocational education.
- Time Horizon: Exercises range from assessing current needs to long-term projections (6+ years). Short-term forecasts are generally more accurate and useful for immediate training interventions, whereas long-term forecasts (like Sweden's 15-20 year study) support strategic planning but are more susceptible to unforeseen structural changes.
- Frequency of Updates: Most exercises are updated annually, particularly those using quantitative methods like online job vacancy (OJV) data. Frequent updates allow for continuous monitoring, but they can be resource-intensive and may outpace the ability of policy makers to react.
Implementation and Methodological Design
The sources emphasize that the choice of data and methodology is a key design consideration to ensure the results are robust and actionable:
- Methodology: Quantitative methods (e.g., simulation models and big data analysis) offer consistency and comparability across sectors and time. However, they often lack the granularity required to update specific curricula. Therefore, mixed methods—combining quantitative data with qualitative input from expert groups or stakeholders—are considered best practice.
- Data Sources: Common sources include administrative records, labour force surveys, and OJV data. Qualitative information is often used to validate quantitative results or to identify "new" skills that are not yet visible in historical datasets.
Context of Higher Education Adaptation
In the broader context of adapting higher education, SAA results are used to inform policy in several ways:
- Informing Provision: SAA results help determine the number of study places (quotas) and the content of programs.
- Guidance and Incentives: Governments use SAA data to provide career guidance for students and to create financial incentives (like the Human Capital Initiative in Ireland) that encourage the supply of skills in high-demand fields.
- Specific Challenges for HE: Identifying skills supply in higher education is more complex than in Vocational Education and Training (VET). VET skills are often aligned with learning outcomes in a National Qualification Framework, whereas higher education skills are often defined at the institutional level, making them harder to document and track formally. Efforts are currently underway to use new technologies to extract these skills from program descriptors.
Ultimately, high-performing SAA systems move from being a single "exercise" to becoming a coordinated system. This involves a core SAA exercise supplemented by complementary studies with different time horizons, together with strong stakeholder engagement to ensure the information is clearly presented and actually used by education providers.
The implementation of Skills Assessment and Anticipation (SAA) exercises involves selecting appropriate data sources and methodologies to estimate current and future skill needs. In the context of adapting higher education, these methods must navigate the specific challenges of identifying high-level skills supply and demand to ensure that academic offerings remain relevant to the labour market.
Data Sources for SAA Implementation
Most SAA exercises rely on a combination of quantitative and qualitative data:
- Quantitative Data: All reviewed exercises use quantitative information, with most leveraging at least two different sources:
  - Administrative and Survey Data: Social security records and national labour force surveys provide representative data on employment by occupation but may lag in identifying real-time shortages.
  - Online Job Vacancy (OJV) Data: This is a major innovation, providing high-frequency, granular data on skills demand and new occupations.
  - Graduate Employability Surveys: Particularly relevant for higher education, these surveys (as seen in Hungary and Italy) track how recent graduates enter the labour market and whether they possess the required skills.
- Qualitative Data: This information is gathered through expert groups, stakeholder interviews, and validation exercises. It is crucial for identifying "new" skills or technological shifts not yet visible in historical datasets.
Methodological Approaches
SAA exercises utilize three primary methodological frameworks:
- Quantitative Methods: These include simulation models (like time-series or stock-and-flow models) to project long-term needs, and big data analysis for real-time demand. They offer consistent, transparent results across sectors but require high levels of statistical expertise.
- Qualitative Methods: Easier to implement than complex econometric models, these methods rely on expert observations to identify specific skill needs and mismatches. However, they can be subjective and may lack the broad comparability of quantitative models.
- Mixed Methods: This is considered best practice: qualitative input is used to validate the assumptions or results of quantitative models. In France's "Occupations 2030," for instance, expert input validates the assumptions of the quantitative projection model.
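To make the stock-and-flow logic mentioned above concrete, the sketch below projects job openings for a single occupation from an employment stock, a net growth rate, and a replacement rate. All figures are invented for illustration; real simulation models work from detailed administrative and survey data, not two scalar rates.

```python
# Illustrative stock-and-flow projection of job openings for one occupation.
# All numbers are hypothetical examples, not source data.

def project_openings(stock, growth_rate, replacement_rate, years):
    """Project total job openings over a horizon.

    Expansion demand comes from net employment growth; replacement
    demand comes from workers leaving the occupation (retirement,
    mobility), applied to each year's updated stock.
    """
    openings = 0.0
    for _ in range(years):
        expansion = stock * growth_rate          # new jobs created (may be negative)
        replacement = stock * replacement_rate   # vacancies left by leavers
        openings += max(expansion, 0) + replacement
        stock += expansion                       # carry the new stock forward
    return stock, openings

# Hypothetical occupation: 50,000 employed, 2% annual growth, 3% replacement.
final_stock, total_openings = project_openings(50_000, 0.02, 0.03, years=5)
print(round(final_stock), round(total_openings))  # → 55204 13010
```

Mixed-methods designs would then submit such projected figures, and the assumptions behind them, to expert panels for validation.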
Implementation Challenges in Higher Education
Implementing SAA specifically for higher education adaptation presents unique difficulties compared to Vocational Education and Training (VET):
- Identifying Skills Supply: While VET skills are often aligned with learning outcomes in a National Qualification Framework (NQF), higher education skills are frequently defined at the institutional level.
- Lack of Formalization: Because higher education institutions (HEIs) often have significant autonomy, their programme descriptors may not be standardized.
- Innovative Solutions: To bridge this gap, some systems are now using new technologies (such as AI) to extract and identify skills supply directly from higher education programme descriptors and curricula documentation.
- Graduate Tracking: Some countries, like Hungary, use a Graduate Career Tracking System that matches enrolment data with administrative labour market information to see exactly which qualifications are in high demand based on wages and employment rates.
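The skill-extraction idea above can be sketched in miniature. Production systems use NLP models against full taxonomies such as ESCO; the toy version below only matches a hand-made dictionary of surface forms (the taxonomy entries and the descriptor text are invented examples).

```python
# Minimal sketch of extracting skills supply from a programme descriptor
# by matching against a skills taxonomy. Real systems rely on NLP models;
# this keyword approach only illustrates the idea.
import re

# Hypothetical mini-taxonomy: canonical skill -> surface forms to match.
TAXONOMY = {
    "data analysis": ["data analysis", "statistical analysis"],
    "machine learning": ["machine learning", "deep learning"],
    "project management": ["project management"],
}

def extract_skills(descriptor):
    """Return the canonical skills whose surface forms appear in the text."""
    text = descriptor.lower()
    found = set()
    for skill, forms in TAXONOMY.items():
        if any(re.search(r"\b" + re.escape(f) + r"\b", text) for f in forms):
            found.add(skill)
    return found

descriptor = (
    "Graduates acquire competences in statistical analysis and "
    "deep learning, applied to real-world projects."
)
print(sorted(extract_skills(descriptor)))  # → ['data analysis', 'machine learning']
```

Matching canonical skills rather than raw phrases is what makes extracted supply comparable with demand-side signals such as OJV data.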
By coordinating these implementation methods, high-performing systems move beyond simple "exercises" to create a coherent data infrastructure that supports policy makers in adjusting study quotas and updating curricula.
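The record-linkage logic behind a graduate tracking system like Hungary's can be sketched as follows. All records here are invented; real systems link pseudonymised enrolment and administrative labour market registers at national scale.

```python
# Sketch of linking enrolment records to labour market outcomes per
# qualification, in the spirit of graduate career tracking systems.
# All records below are invented examples.

enrolment = [
    {"graduate_id": 1, "qualification": "Nursing"},
    {"graduate_id": 2, "qualification": "History"},
    {"graduate_id": 3, "qualification": "Nursing"},
]
labour_market = {
    1: {"employed": True, "monthly_wage": 2400},
    2: {"employed": False, "monthly_wage": 0},
    3: {"employed": True, "monthly_wage": 2600},
}

def outcomes_by_qualification(enrolment, labour_market):
    """Employment rate and mean wage of the employed, per qualification."""
    stats = {}
    for rec in enrolment:
        lm = labour_market[rec["graduate_id"]]  # join on the graduate identifier
        s = stats.setdefault(rec["qualification"], {"n": 0, "employed": 0, "wages": []})
        s["n"] += 1
        if lm["employed"]:
            s["employed"] += 1
            s["wages"].append(lm["monthly_wage"])
    return {
        q: {
            "employment_rate": s["employed"] / s["n"],
            "mean_wage": sum(s["wages"]) / len(s["wages"]) if s["wages"] else None,
        }
        for q, s in stats.items()
    }

print(outcomes_by_qualification(enrolment, labour_market))
```

Aggregates of this kind (employment rates and wages by qualification) are what feed decisions on study quotas and programme updates.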
The institutional setup of Skills Assessment and Anticipation (SAA) is a critical success factor in ensuring that labour market insights effectively lead to the adaptation of higher education. High-performing systems move beyond isolated activities to create a coordinated SAA system involving diverse stakeholders, robust governance frameworks, and sustainable funding.
Stakeholder Involvement
The sources emphasize that engaging the right institutions and stakeholders—including social partners, education providers, and public bodies—ensures results are validated and tailored to user needs. Stakeholders participate in various stages:
- Design and Implementation: Entities like the Union of Chambers of Commerce in Italy (Excelsior) or the AlmaLaurea University Consortium not only set up and implement exercises but also use the results to update academic programs.
- Providing Inputs: Ministries, Public Employment Services (PES), and higher education institutions (HEIs) provide data on job vacancies, student enrollment, and demographic trends.
- Validation: In systems like the Netherlands' POA and Estonia's OSKA, social partners and experts review results to provide feedback and ensure "on the ground" accuracy.
- Extending Results: Groups like the Expert Group on Future Skills Needs in Ireland use core quantitative results as a starting point for deeper qualitative analysis to better align education with employer needs.
Governance Frameworks
Effective governance ensures that stakeholder contributions are coordinated and that there is a clear process for making policy recommendations. The sources identify two primary approaches:
- Project-level Governance: Focused on recurring data operations or one-off research projects (e.g., the Dutch POA or Slovenia’s Labour Market Platform), where governance is often established through contractual agreements between project owners and operators.
- Multi-stakeholder Governance: Dedicated bodies coordinate across multiple exercises and policy areas. Examples include Estonia's OSKA Coordination Council and Finland's Skills Anticipation Forum, which involve representatives from ministries of education and labour, regional bodies, and employer/employee associations.
The Specific Case of Higher Education
Historically, higher education stakeholders have been less involved in SAA governance compared to vocational education and training (VET). This is due to institutional autonomy, where HEIs have more freedom to set curricula, and a broader funding base that makes them less dependent on state directives. However, some countries are shifting toward more targeted engagement:
- Finland includes student unions and the Conference of University Rectors in its SAA steering group.
- Australia established an Education and Training Advisory Group in 2024 to ensure tertiary sector input reaches strategic decision-makers.
- Sweden mandates the Council for Higher Education to cooperate with other public bodies to develop a coherent data infrastructure on skills supply.
Funding and Resources
Most whole-of-economy SAA exercises are publicly funded, typically by the Ministry of Labour or the PES, reflecting their role in workforce planning. However, ministries of education often contribute when the goal is to update study places or curricula. Some systems, like Italy's AlmaLaurea, use a mix of university consortium funds and ministry support. EU funding, such as the European Social Fund Plus (ESF+), also supports the development of these systems in countries like Estonia and Slovenia.
From Exercise to System
A key evolution noted in the sources is the move from a single "exercise" to a coordinated SAA system. In these systems, a core quantitative exercise is often supplemented by thematic or sectoral studies. For example, the results of Ireland’s core SAA unit are further developed by regional skill fora to tailor education and training to specific local needs. This coordination avoids the duplication of work and ensures that all actors use consistent definitions and data, making results comparable across the entire education and labour market landscape.
The impact of Skills Assessment and Anticipation (SAA) on policy is realized when its insights are translated into actionable strategies for higher education (HE) providers, students, and government bodies. The sources highlight that while SAA results are used for diverse purposes, including migration policy and industrial planning, their primary impact in the context of higher education is felt through provision planning, financial incentives, and learner guidance.
Core Policy Use Cases
The sources identify several key areas where SAA results directly inform policy and practice:
- Updating Curricula: Insights on emerging skill needs help institutions revise existing programs or develop new ones to ensure graduates possess industry-relevant skills.
- Capacity Planning: Governments use SAA data to decide whether to increase or reduce the number of study places in specific fields based on projected demand.
- Career Guidance: One of the most frequent uses of SAA is providing prospective students with information on labour market prospects to support informed educational choices.
- Workforce Planning: While less common than education-focused uses, SAA results can also help private companies with their long-term talent management and recruitment strategies.
Policy Levers for Higher Education Adaptation
Unlike Vocational Education and Training (VET), which often has direct links to SAA, higher education typically requires indirect policy approaches due to institutional autonomy.
1. Regulatory Pathways
- Occupational Standards: Governments increasingly require HE programs to define learning outcomes aligned with occupational standards. In Estonia, the Qualifications Authority (Kutsekoda) ensures that OSKA results are integrated into occupational standards that HE programs must reference.
- Mandatory Quotas: In some systems, public authorities adjust state-funded study quotas based on SAA results. This is common for regulated professions (e.g., health and teaching) in the Netherlands, but countries like Denmark and Portugal apply these caps to a broader range of undergraduate programs.
2. Financial Incentives
- Subsidizing Provision: Initiatives like Ireland's Human Capital Initiative (HCI) use competitive funding to encourage HEIs to develop new programs in areas identified as high-priority skills needs, such as ICT and engineering.
- Stimulating Learner Demand: Policies may include tuition fee waivers or reductions for students entering high-demand fields. For example, Hungary provides state-funded places for doctoral students in mathematics, natural sciences, and engineering to address high talent demand.
- International Attraction: Countries like Scotland and Denmark use SAA results to refine talent attraction strategies, targeting international students to fill domestic skill gaps in growth sectors.
Supporting Learner Choices through Information
For SAA to impact student behavior, the data must be accessible and user-friendly.
- Career Education: High-quality guidance services in schools use SAA outputs to help students navigate early career uncertainty.
- Online Portals: Many countries maintain sophisticated platforms for this purpose. The French ONISEP portal is noted for its comprehensive integration of labour market trends into educational guidance for both students and professionals. In New Zealand, HEIs are required to publish Key Information for Students (KIS) for every program, including labour market outcomes.
Dissemination and Accessibility
The actual impact of an SAA exercise depends on how well its results are communicated. High-performing systems adapt complex data into various formats:
- Thematic and Sectoral Reports: Regular publications tailored to specific industries or regions.
- Innovative Formats: Estonia uses web-based "trend cards" to interactively display how trends like the green transition affect jobs and skills. Finland utilizes "needs cards" that succinctly present forecast data through graphics.
- Data Products: Some systems, like France's Occupations 2030, provide online visualization tools that allow researchers and policymakers to extract and compare regional skill needs.