5.3.C Prompt Templates

Lesson details

Estimated time: 20 minutes

Label: 5.3.C


Learning objectives

  • Use and adapt prompt templates for environmental scanning, trend synthesis, scenario building, and foresight-to-operations mapping in a higher education context.

  • Revise vague prompts by specifying role, institutional type, scope, output structure, and explicit links to teaching, research, and partnerships.

  • Reflect on how successive prompt refinements change the strategic usefulness of AI-generated foresight outputs.

Prompt Templates

Scenario Planning: Environmental Scan Brief

Troubleshooting Prompts

If your scan prompts are vague, over-simplified, or not well aligned with higher education contexts, try these alternatives:

  • “Act as a strategy officer for a UK university. Using the latest global higher education news, policy updates, and edtech reports, produce a one-page environmental scanning brief with sections on policy, technology, funding, and student expectations. Label strong vs weak signals.”

  • “You are supporting a university foresight workshop. Summarise this week’s major developments affecting higher education in policy, technology, and labour markets. For each, explain why it matters and suggest one reflective question for senior leadership.”

  • “For a research-intensive university focused on global health, generate an environmental scan across three domains: research funding, international collaboration, and AI in teaching. Distinguish confirmed trends from emerging uncertainties and flag any significant risks.”

  • “Prepare a briefing note for an academic board meeting. Synthesise recent HE policy consultations, AI-in-education debates, and microcredential developments into 3–5 key themes. For each theme, list likely implications for curriculum, assessment, and staff capability.”

  • “Create a concise ‘signal log’ for a university’s digital education team. Extract 5–7 signals from current news and sector reports that relate to AI-enabled learning, student data governance, and online assessment. Classify each as strong or weak and suggest next monitoring steps.”

  • “Working from current sector news and foresight reports, generate an environmental scan for a small specialist institution (e.g., public health). Highlight niche risks and opportunities that might be missed in generic HE analyses, and propose questions for a follow-up scenario exercise.”

Prompt Revision Lab

Initial version:
“Give me an environmental scan of higher education.”

Revised version:
“Act as a strategic analyst for a mid-sized UK university. Using up-to-date global higher education news and policy sources, produce a two-page environmental scan with sections on: policy/regulation, funding, technology (especially AI), student demand, and labour market trends. For each section, identify 2–3 key signals and briefly explain their possible impact on our teaching, research, and partnerships.”

Explanation of improvement:
The revised prompt specifies the role, the institutional type, the scope of the scan, and the structure of the output, and makes the linkage to teaching, research, and partnerships explicit. This narrows the AI’s focus, reduces generic commentary, and aligns the scan with the institutional intelligence goals described in the chapter.

Second refinement
“Act as a strategic analyst for a mid-sized UK university specialising in public and global health. Using up-to-date global higher education and health policy news, produce a two-page environmental scan with sections on: (1) policy/regulation (including AI and data governance), (2) funding for health-related research and programmes, (3) technology and AI in teaching and assessment, (4) student demand and mobility patterns. For each section, identify 2–3 strong signals and 1–2 weak signals, briefly explain their possible impact on our teaching, research, and partnerships, and suggest one reflective question for senior leadership.”

Short example outputs (indicative, truncated)

  • “Policy/regulation – Strong signal: expansion of national AI assessment regulations may require redesign of coursework in years 1–2…”

  • “Technology and AI – Weak signal: early pilots of multimodal AI analytics in comparable institutions could reshape how we monitor student engagement…”

Reflection question
“How did specifying institutional type, domain focus, signal categories (strong/weak), and explicit links to teaching, research, and partnerships change the relevance and depth of the scan compared with your initial, generic prompt?”
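
If you find yourself reusing this kind of scanning prompt across institutions or planning cycles, it can help to treat it as a parameterised template rather than retyping it each time. The Python sketch below is purely illustrative and optional, aimed at teams comfortable with a little scripting: the wording follows the second refinement above, while the function name, parameter names, and defaults are invented for this example, and the finished prompt would simply be pasted into your preferred AI assistant or sent through whatever client you already use.

# Minimal, illustrative sketch: the revised environmental-scan prompt as a reusable template.
# The function and parameter names are assumptions for this example, not part of any tool.

SCAN_TEMPLATE = (
    "Act as a strategic analyst for a {institution_type}. "
    "Using up-to-date global higher education news and policy sources, produce a "
    "{length} environmental scan with sections on: {sections}. "
    "For each section, identify {strong} strong signals and {weak} weak signals, "
    "briefly explain their possible impact on our teaching, research, and partnerships, "
    "and suggest one reflective question for senior leadership."
)

def build_scan_prompt(institution_type, sections, length="two-page",
                      strong="2-3", weak="1-2"):
    """Fill the template with institution-specific details before sending it to a model."""
    return SCAN_TEMPLATE.format(
        institution_type=institution_type,
        sections="; ".join(sections),
        length=length,
        strong=strong,
        weak=weak,
    )

if __name__ == "__main__":
    prompt = build_scan_prompt(
        institution_type="mid-sized UK university specialising in public and global health",
        sections=[
            "policy/regulation (including AI and data governance)",
            "funding for health-related research and programmes",
            "technology and AI in teaching and assessment",
            "student demand and mobility patterns",
        ],
    )
    print(prompt)  # Paste into your preferred AI assistant or send via your own API client.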

Trend Synthesis and Thematic Mapping for Foresight

Troubleshooting Prompts

If your “trend analysis” prompts produce shallow lists or ignore higher education context, use these alternatives:

  • “Using recent reports and commentary on higher education, identify 5–7 emerging trends that could shape curriculum design and delivery in the next five years. For each trend, summarise the pattern, give an example, and explain why it matters for programme-level planning.”

  • “Act as a trend analyst supporting a university teaching and learning committee. From current HE and labour market evidence, generate a thematic map of trends across: student expectations, AI in assessment, flexible learning, and global mobility. Describe how these trends interact and where there are contradictions.”

  • “You are preparing a briefing for a quality assurance review. Synthesise recent trends in AI-augmented pedagogy and microcredentials, focusing on their implications for accreditation, assessment integrity, and inclusive practice. Provide 3–5 questions the review panel should ask.”

  • “Produce a concise ‘trend packet’ for faculty deans. Cluster key higher education trends into three themes: participation and access, digital and AI innovation, and funding/governance. For each cluster, explain likely risks, opportunities, and blind spots if the trend is ignored.”

  • “From current global HE data, construct three contrasting narrative trends (e.g., ‘Platform University’, ‘Regulated AI Ecosystem’, ‘Reskilling at Scale’). For each, describe how teaching, research, and student services might change, and suggest early indicators to monitor.”

Prompt Revision Lab

Initial version:
“List current trends in higher education.”

Revised version
“Act as a higher education trend analyst preparing a briefing for a university executive team. Identify 6–8 current trends affecting higher education across student demand, AI and digital learning, funding, and regulation. For each trend, provide: a short description, one concrete example from a real or plausible institution, and a brief comment on how it might affect curriculum, assessment, or academic workload.”

Explanation of improvement
The revised version moves from an unstructured list to a focused, multi-dimensional brief. It specifies audience, number of trends, domains, and required components (description, example, implications), directly reflecting the chapter’s emphasis on moving from data overload to actionable insights.

Second refinement
“Act as a higher education trend analyst preparing a briefing for a university executive team. Identify 6–8 current trends affecting higher education across student demand, AI and digital learning, funding, and regulation. For each trend, provide: (a) a short description, (b) one concrete example from a real or plausible institution, (c) likely implications for curriculum, assessment, or academic workload, and (d) 1–2 reflective questions the executive team should discuss. Conclude with a short paragraph highlighting how these trends might interact in the next five years.”

Short example outputs (indicative, truncated)

  • “Trend: Microcredential Expansion – Example: A national university consortium launching stackable short courses in data science for health professionals… Implications: pressure to redesign full programmes, potential workload spikes in curriculum review…”

  • “Trend: AI in Assessment – Reflective questions: How do we assure integrity while leveraging AI for feedback? Where are students’ equity risks most acute?”

Reflection question

“When you added explicit domains, institutional implications, and reflective questions, how did the AI’s output change in terms of strategic usefulness and alignment with the chapter’s focus on institutional intelligence?”
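
If your team keeps a running trend log between briefings, an optional extension is to ask the model for a machine-readable version of its answer alongside the narrative, and to check it before filing. The short Python sketch below assumes the model has been asked to return its trends as a JSON list containing the four components from the revised prompt; the field names and the parse_trend_log helper are illustrative assumptions, not features of any particular tool.

import json

# Illustrative sketch: light validation of a JSON trend list returned by a model that was
# asked to output each trend with the four components from the revised prompt.
# The field names below are assumptions chosen for this example.

REQUIRED_FIELDS = {"description", "example", "implications", "reflective_questions"}

def parse_trend_log(model_output):
    """Parse and lightly validate a JSON trend list before adding it to a shared log."""
    trends = json.loads(model_output)
    for i, trend in enumerate(trends, start=1):
        missing = REQUIRED_FIELDS - set(trend)
        if missing:
            raise ValueError(f"Trend {i} is missing fields: {sorted(missing)}")
    return trends

if __name__ == "__main__":
    sample = """[
      {"description": "Microcredential expansion",
       "example": "A national consortium launching stackable short courses for health professionals",
       "implications": "Pressure to redesign full programmes and review curricula",
       "reflective_questions": ["How will credit transfer and quality assurance work?"]}
    ]"""
    for trend in parse_trend_log(sample):
        print(trend["description"], "-", trend["implications"])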

Scenario Matrix and Narrative Pack (with Table Output)

Troubleshooting Prompts

If your scenario prompts are vague or produce generic stories, especially when you need structured formats such as tables and matrices, try these alternatives:

  • “Act as a foresight facilitator for a research-intensive university. Using PESTLE/STEEPC thinking, propose a 2x2 scenario matrix based on two critical uncertainties affecting our institution over the next 10–15 years. Present the four scenarios in a table with columns for scenario name, brief description, key drivers, and early indicators.”

  • “Using current trends in AI regulation and global student mobility, construct four divergent 2035 higher education scenarios for a UK university. First, present them in a comparison table (scenario name, regulatory context, student mobility, digital learning model, key risks), then provide short narrative summaries.”

  • “As an AI-supported foresight assistant, develop three narrative-rich scenarios describing the future of AI in assessment (optimistic, disruptive, constrained). For each scenario, include quotes from fictional stakeholders (students, staff, regulators) to make implications vivid.”

  • “Generate a scenario workshop pack for faculty leadership. Include: (a) a simple 2x2 matrix description, (b) a comparison table of the four futures (curriculum, staffing, partnerships, technology), and (c) 3–4 discussion questions to test current assumptions.”

  • “Create a set of scenarios for the future of international partnerships in 2040, focusing on public health and global research. Present the scenarios in both tabular form (key axes, risks, opportunities, equity implications) and short stories that could be read aloud in a workshop.”

Prompt Revision Lab

Initial version
“Write some scenarios about the future of universities.”

Revised version
“Act as a strategic foresight facilitator working with a mid-sized UK university. Using current trends in AI, funding, and global mobility, construct a 2x2 scenario framework based on two critical uncertainties. Present your work in two parts: (1) a table with four scenarios (name, brief description, main drivers, key risks, early indicators); (2) a short narrative (150–200 words) for each scenario describing what daily life feels like for staff and students.”

Explanation of improvement
The revised prompt clarifies the institutional context, the drivers, the structure (2x2 framework), and the required formats (table plus narratives). It also foregrounds lived experience, aligning with the chapter’s emphasis on narrative-rich but analytically grounded scenarios.

Second refinement
“Act as a strategic foresight facilitator working with a mid-sized UK university specialising in global health. Using current trends in AI, research funding, and international mobility, construct a 2x2 scenario framework based on two critical uncertainties you identify. Present your work in three parts: (1) a clear explanation of the two uncertainties and why they matter; (2) a comparison table with four scenarios (name, brief description, main drivers, key risks, early indicators, equity implications); (3) a short narrative (150–200 words) for each scenario focusing on teaching, research, and student experience in 2035.”

Short example outputs (indicative, truncated)

  • “Uncertainty 1: Stringency of global AI regulation in education. Uncertainty 2: Openness of international mobility and research collaboration…”

  • Table row: “Scenario: Regulated Renaissance – Drivers: strict but supportive AI regulation, strong public funding… Equity implications: targeted support for under-represented learners through transparent AI tools…”

Reflection question

“How did specifying the table structure, column headings, and narrative focus (teaching, research, student experience, equity) change the usefulness and clarity of the AI-generated scenarios for your own planning or workshop design?”
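
If you plan to reuse the comparison table in workshop handouts or slides, it can be more reliable to capture the four scenarios as structured rows and generate the table yourself rather than depending on the model’s formatting. The Python sketch below is a minimal illustration: the column names mirror the revised prompt, and the single example row is placeholder content adapted from the indicative output above rather than a real analysis.

import csv
import sys

# Minimal, illustrative sketch: a 2x2 scenario matrix held as structured rows and exported
# as a CSV table for workshop handouts. Column names mirror the revised prompt; the content
# here is placeholder text, not a real analysis.

COLUMNS = ["name", "description", "main_drivers", "key_risks",
           "early_indicators", "equity_implications"]

scenarios = [
    {
        "name": "Regulated Renaissance",
        "description": "Strict but supportive AI regulation with strong public funding.",
        "main_drivers": "Supportive regulation; sustained public investment",
        "key_risks": "Compliance overhead slows curriculum innovation",
        "early_indicators": "New national AI-in-education standards announced",
        "equity_implications": "Targeted support for under-represented learners via transparent AI tools",
    },
    # ...three more rows would follow, one for each quadrant of the 2x2 matrix.
]

writer = csv.DictWriter(sys.stdout, fieldnames=COLUMNS)
writer.writeheader()
writer.writerows(scenarios)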

Foresight-to-Operations Alignment Mapper

Troubleshooting Prompts

If prompts that aim to connect foresight outputs to operational strategy feel too abstract or disconnected from real planning, use these alternatives:

  • “Act as an AI assistant to a university planning office. Given a set of foresight outputs (environmental scans, trend analyses, and scenarios), map 5–7 concrete operational actions for the next three years. For each action, specify the responsible unit, indicative timeline, and a possible KPI.”

  • “You are helping translate a ‘Future of Research Collaboration 2035’ scenario into next year’s institutional plan. Generate a table that links scenario implications to operational domains (infrastructure, HR, partnerships, governance) and propose 1–2 actions per domain.”

  • “Using the idea of a foresight–operations alignment loop, outline a practical process for a university to revisit its strategic plan annually. Show how environmental scanning, trend analysis, scenario updates, and stress-testing feed into the revision of KPIs and budget priorities.”

  • “Create a set of questions a faculty dean could ask when reviewing operational plans against recent foresight findings (e.g., AI in assessment, demographic shifts). Group questions under curriculum, staffing, infrastructure, and partnerships.”

  • “Act as a facilitator preparing a cross-unit workshop on linking foresight to operations. Generate a structured agenda where participants use AI-generated scans and scenarios to propose concrete changes to policies, resource allocation, or programme portfolios.”

Prompt Revision Lab

Initial version
“Explain how foresight can be linked to operations.”

Revised version
“Act as a planning advisor for a UK university that has just completed a foresight exercise on AI and higher education. Using the themes from environmental scanning, trend analysis, and scenario work, propose 6–8 concrete operational actions for the next three years. For each action, specify: the foresight insight it’s based on, the operational domain (e.g., curriculum, staffing, infrastructure, partnerships), the primary responsible unit, and one draft KPI. Present the result as a structured narrative rather than a table.”

Explanation of improvement
The revised version requests specific, actionable outputs (actions, domains, responsibilities, KPIs) explicitly linked back to different foresight inputs. It also constrains the format, helping ensure the AI produces an alignment map rather than an abstract essay. This mirrors the chapter’s focus on translating foresight into operational reality and institutional intelligence.

Second refinement
“Act as a planning advisor for a UK university that has just completed a foresight exercise on AI and higher education. Using the themes from environmental scanning, trend analysis, scenario building, and stress-testing simulations, propose 6–8 concrete operational actions for the next three years. For each action, specify: (a) the foresight insight or scenario it stems from, (b) the operational domain (curriculum, assessment, staffing, infrastructure, partnerships, governance), (c) the primary responsible unit, (d) one draft KPI and indicative baseline/target, and (e) a short note on potential risks or equity implications. Present the output as a clearly structured narrative with subheadings for each action.”

Short example outputs (indicative, truncated)

  • “Action 1 – AI-Ready Curriculum Review: Foresight source: scenario ‘AI-Led Personalisation’; Domain: curriculum; Responsible: Education Committee; KPI: % of programmes with AI literacy outcomes (baseline 10%, target 50% in three years)… Equity note: ensure AI literacy activities are inclusive and accessible.”

  • “Action 4 – Resilient Digital Infrastructure: Foresight source: stress-test on digital disruption; Domain: infrastructure; KPI: average recovery time from critical digital incident…”

Reflection question
“When you required explicit links between foresight insights, operational domains, responsibilities, KPIs, and equity considerations, how did that change the AI’s ability to produce outputs you could realistically use in planning and governance processes?”
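
Because the revised prompt asks for the same five elements for every action, the resulting alignment map translates naturally into a structured record that a planning office can track and update over time. The Python sketch below is illustrative only: the field names follow the prompt’s (a) to (e) structure, and the example action is adapted from the indicative output above.

from dataclasses import dataclass, asdict

# Illustrative sketch: an alignment-map record whose fields follow the (a)-(e) structure of
# the revised prompt. The class name, field names, and example content are assumptions.

@dataclass
class AlignmentAction:
    title: str
    foresight_source: str    # (a) the insight or scenario the action stems from
    domain: str              # (b) operational domain
    responsible_unit: str    # (c) who owns delivery
    kpi: str                 # (d) draft KPI with indicative baseline/target
    risk_equity_note: str    # (e) potential risks or equity implications

actions = [
    AlignmentAction(
        title="AI-Ready Curriculum Review",
        foresight_source="Scenario: AI-Led Personalisation",
        domain="Curriculum",
        responsible_unit="Education Committee",
        kpi="% of programmes with AI literacy outcomes (baseline 10%, target 50% in three years)",
        risk_equity_note="Ensure AI literacy activities are inclusive and accessible.",
    ),
]

# Print each action as a labelled block, mirroring the 'structured narrative' format.
for action in actions:
    for field, value in asdict(action).items():
        print(f"{field}: {value}")
    print()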


Framework alignment

This lesson sits within: CloudPedagogy AI Capability Framework (2026 Edition)
Domains: Awareness; Co-Agency; Applied Practice & Innovation; Ethics, Equity & Impact; Decision-Making & Governance; Reflection, Learning & Renewal


Previous: 5.3.B Learning Activities | Next: 5.3.D Innovative Use Cases