
Annual Security Review Template

Structured template for conducting and documenting annual cybersecurity reviews. Includes metrics dashboards, risk trending, and executive summary format.

01

Purpose and Scope

The annual security review serves as the definitive record of the district's cybersecurity posture over the preceding twelve months. It consolidates incident data, infrastructure changes, policy updates, compliance status, and risk trends into a single document that informs strategic planning and budget decisions. Without a structured annual review, districts rely on fragmented impressions of their security state, making it difficult to identify patterns, justify investments, or demonstrate due diligence to insurers and regulators.

The review should encompass all technology systems under district management, including on-premises infrastructure, cloud services, the iboss SASE platform, student information systems, learning management systems, financial systems, and communication platforms. Scope exclusions should be explicitly documented and justified. For example, if the district contracts with a third-party food service vendor that operates its own point-of-sale systems, the review should note that those systems are excluded and covered under the vendor's own security program, verified through the vendor risk management process.

The primary audience for the annual review is the governance committee and the board of education, but sections should be written to serve multiple stakeholders. The executive summary targets board members who need a high-level understanding in five minutes. The metrics dashboard serves the governance committee for trend analysis. The detailed sections serve the IT team as a baseline for the coming year's planning. Distribute the document on a need-to-know basis, as it contains sensitive information about security gaps and incident details that could be exploited if disclosed publicly.

02

Executive Summary Template

The executive summary must fit on a single page and convey the district's cybersecurity status to a reader who may spend no more than five minutes reviewing it. Open with the overall risk posture rating using a four-level scale: Strong, Adequate, Needs Improvement, or Critical Concern. Accompany the rating with a one-sentence justification and a trend arrow indicating whether posture has improved, remained stable, or degraded compared to the prior year.

Follow the risk posture with the top three achievements of the year. These should highlight measurable improvements such as reducing mean time to detect from 72 hours to 18 hours, achieving 98 percent staff training completion, or successfully deploying iboss Zero Trust Network Access across all remote staff. Each achievement should be stated in one sentence with a quantifiable metric where possible.

Next, present the top three risks or concerns. Frame each risk with its current rating from the risk register and its trend direction. For example, a risk might read: Unpatched legacy systems in three elementary buildings remain rated High and trending upward due to vendor end-of-life announcement. Follow risks with a budget summary showing total cybersecurity spending versus approved budget, and a compliance summary showing status against each applicable framework. Close with a year-over-year comparison table showing five key metrics side by side for the current and prior year, enabling the reader to immediately grasp the direction of travel.

  • Overall risk posture rating with trend indicator (Strong / Adequate / Needs Improvement / Critical Concern)
  • Top 3 cybersecurity achievements with quantifiable metrics
  • Top 3 risks or concerns with risk register rating and trend direction
  • Budget summary: total spend versus approved budget, variance explanation
  • Compliance status: green/yellow/red for each applicable framework
  • Year-over-year comparison table of 5 key performance metrics
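The year-over-year comparison table described above can be sketched in a few lines. This is a minimal illustration; the metric names, values, and arrow conventions are assumptions, not prescribed by the template.

```python
# Sketch of the executive summary's year-over-year comparison table.
# Metric names and values are illustrative examples only.
def trend_arrow(current, prior, lower_is_better=False):
    """Return an arrow indicating whether the metric improved."""
    if current == prior:
        return "→"
    improved = (current < prior) if lower_is_better else (current > prior)
    return "↑" if improved else "↓"

metrics = [
    # (name, prior year, current year, lower_is_better)
    ("Mean time to detect (hours)",      72,   18,  True),
    ("Training completion (%)",          91,   98,  False),
    ("Patch compliance (%)",             83,   94,  False),
    ("Critical findings open",            7,    3,  True),
    ("Incidents escalated to committee",  5,    6,  True),
]

for name, prior, current, lower in metrics:
    arrow = trend_arrow(current, prior, lower)
    print(f"{name:36} {prior:>6} {current:>6}  {arrow}")
```

Keeping the comparison to five metrics, as the bullet list suggests, preserves the at-a-glance quality of the summary page.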

03

Security Metrics Dashboard

The metrics dashboard translates raw security data into meaningful indicators of program effectiveness. Organize metrics into four categories: threat, operational, user, and compliance. Threat metrics include total incidents detected, percentage blocked automatically versus requiring human intervention, incidents escalated to the governance committee, and confirmed breaches if any. Present these as both absolute numbers and rates to provide context. A raw figure of 200,000 blocked threats per month is meaningless without the corresponding block rate and whether human-escalated events are increasing or decreasing.
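Deriving rates from the raw counts might look like the following sketch; the figures are hypothetical and exist only to show the arithmetic.

```python
# Sketch: turn raw threat counts into rates so absolute numbers have
# context. All figures are hypothetical.
def threat_rates(total_detected, blocked_automatically, escalated):
    """Return (automatic block rate, human escalation rate)."""
    block_rate = blocked_automatically / total_detected
    escalation_rate = escalated / total_detected
    return block_rate, escalation_rate

block, esc = threat_rates(total_detected=200_000,
                          blocked_automatically=198_400,
                          escalated=35)
print(f"Block rate: {block:.1%}, escalation rate: {esc:.3%}")
```

Tracking these two rates month over month shows whether automation is keeping pace with threat volume even as absolute counts fluctuate.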

Operational metrics measure the efficiency of the security program. Key indicators include system uptime percentage across critical services, patch compliance rate measured as the percentage of systems patched within the defined window, mean time to detect threats from initial compromise to identification, and mean time to respond from detection to containment. These metrics should be tracked monthly and presented as twelve-month trend lines in the annual review. For iboss-specific operational metrics, include platform availability, policy update deployment time, and SSL inspection coverage percentage.
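The patch compliance rate mentioned above is a simple ratio, but it depends on consistently recording release and apply dates per system. A minimal sketch, assuming a 30-day patch window and invented hostnames and dates:

```python
from datetime import date

# Sketch: patch compliance rate = share of systems patched within the
# defined window. The 30-day window, hostnames, and dates are
# illustrative assumptions.
PATCH_WINDOW_DAYS = 30

systems = [
    # (hostname, patch released, patch applied or None if unpatched)
    ("es1-lab",  date(2024, 3, 1), date(2024, 3, 12)),
    ("hs-admin", date(2024, 3, 1), date(2024, 4, 15)),
    ("ms-sis",   date(2024, 3, 1), None),
]

compliant = sum(
    1 for _, released, applied in systems
    if applied is not None and (applied - released).days <= PATCH_WINDOW_DAYS
)
rate = compliant / len(systems)
print(f"Patch compliance: {rate:.0%}")
```

Computing the rate monthly from the same source data produces the twelve-month trend line the annual review calls for.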

User metrics assess the human element of the security program. Track annual security awareness training completion by role category, separating administrators, teachers, support staff, and seasonal employees. Report phishing simulation results including click rates, reporting rates, and trend direction across simulation rounds. Identify departments or buildings with consistently high click rates for targeted intervention. Compliance metrics track the district's status against each regulatory framework and internal policy requirement. Report total audit findings by severity, findings remediated within the defined window, open findings with age, and policy exceptions granted with their expiration dates. Present each metric category with a twelve-month trend and a year-over-year comparison to establish a performance baseline.
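Identifying buildings with consistently high click rates can be sketched as below; the buildings, counts, and the 10 percent intervention threshold are invented for illustration.

```python
# Sketch: per-building phishing simulation results, flagging groups
# above a click-rate threshold for targeted intervention. The data
# and the 10% threshold are invented examples.
CLICK_RATE_THRESHOLD = 0.10

results = [
    # (building, emails sent, clicks, user reports)
    ("Lincoln Elementary", 120, 18, 22),
    ("District Office",     45,  2, 19),
    ("Washington Middle",   90,  6, 30),
]

flagged = []
for building, sent, clicks, reports in results:
    click_rate = clicks / sent
    report_rate = reports / sent
    if click_rate > CLICK_RATE_THRESHOLD:
        flagged.append(building)
    print(f"{building:22} click {click_rate:.0%}  report {report_rate:.0%}")

print("Targeted intervention:", flagged)
```

Reporting the report rate alongside the click rate matters: a building where many users report the simulation is demonstrating the desired behavior even if a few still click.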

04

Infrastructure Assessment Section

Document all significant changes to the district's technology infrastructure during the review period. This includes new network segments, building additions or renovations with network implications, wireless infrastructure upgrades, WAN circuit changes, and cloud service adoptions or migrations. For each change, note whether a security assessment was conducted prior to deployment and summarize the findings. This section creates an auditable record that the district considered security implications before implementing infrastructure changes.

Include a summary of vulnerability management activities. Report the total number of vulnerability scans conducted, the scan coverage percentage across district assets, and the remediation rate within defined SLA windows. Break down discovered vulnerabilities by severity: critical, high, medium, and low. Track the average time to remediate critical and high vulnerabilities and compare against the prior year. If penetration testing was conducted, summarize the scope, methodology, and key findings without disclosing specific exploit details in the review document. Reference the full penetration test report separately under restricted distribution.
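The severity-tiered remediation rate described above might be computed as follows; the SLA windows and finding data are assumptions for illustration.

```python
# Sketch: remediation rate within SLA, broken down by severity.
# SLA windows (in days) and the findings list are illustrative.
SLA_DAYS = {"critical": 7, "high": 30, "medium": 90, "low": 180}

# (severity, days to remediate, or None if still open)
findings = [
    ("critical", 3), ("critical", 12),
    ("high", 21), ("high", 28), ("high", 45),
    ("medium", 60), ("low", None),
]

def sla_rate(findings, severity):
    """Fraction of findings of this severity remediated within SLA."""
    subset = [days for sev, days in findings if sev == severity]
    if not subset:
        return None
    met = sum(1 for days in subset
              if days is not None and days <= SLA_DAYS[severity])
    return met / len(subset)

for sev in SLA_DAYS:
    rate = sla_rate(findings, sev)
    if rate is not None:
        print(f"{sev:8} within SLA: {rate:.0%}")
```

Open findings count against the rate, which keeps long-unremediated items visible rather than letting them drop out of the denominator.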

iboss platform utilization metrics deserve their own subsection given the platform's central role in the district's security architecture. Report on total users protected, policies enforced, traffic volume inspected, SSL decryption coverage, threat categories blocked, and ZTNA application access patterns. Include iboss platform uptime and any service degradation events. Compare utilization metrics against licensing capacity to inform renewal planning and identify whether the district is under-utilizing capabilities that could address identified risks.
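The utilization-versus-capacity comparison can be sketched like this. All figures are hypothetical and not drawn from any real iboss deployment; the 50 and 90 percent thresholds are arbitrary planning cues.

```python
# Sketch: compare platform utilization against licensed capacity to
# inform renewal planning. All figures and thresholds are
# hypothetical assumptions.
licensed = {"protected_users": 6000, "ztna_apps": 50}
utilized = {"protected_users": 5240, "ztna_apps": 18}

def utilization(licensed, utilized):
    """Fraction of licensed capacity in use, per capability."""
    return {k: utilized[k] / licensed[k] for k in licensed}

for key, pct in utilization(licensed, utilized).items():
    note = ("consider expanding use" if pct < 0.5
            else "near capacity" if pct > 0.9 else "healthy")
    print(f"{key:16} {pct:.0%} utilized ({note})")
```

Low utilization of an already-licensed capability is often the cheapest remediation available for a risk identified elsewhere in the review.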

05

Policy Review Summary

The annual review should document the status of every cybersecurity-related policy, administrative regulation, and IT procedure in the district's policy hierarchy. For each document, record its last review date, any changes made during the review period, and its next scheduled review date. Policies should be reviewed at least annually, with administrative regulations and IT procedures reviewed semi-annually or whenever a significant change in the threat landscape or technology environment warrants an update.

List all policies that were updated during the year and provide a brief summary of what changed and why. Common triggers for policy updates include new regulatory requirements, lessons learned from incidents, changes in technology infrastructure, and insurance underwriter feedback. Separately list any new policies created during the year and the governance committee action that authorized their creation. Identify policies that are overdue for review and include a remediation timeline.

Maintain a policy exception log that records every approved deviation from established policy. For each exception, document the requesting party, the specific policy requirement being excepted, the business justification, compensating controls in place, the approval authority, the approval date, and the expiration date. Exceptions should be time-limited and reviewed at each governance committee meeting. The annual review should report on total exceptions granted, exceptions expired and closed, exceptions renewed, and any exceptions that resulted in a security incident. A high number of standing exceptions may indicate that a policy is impractical and should be revised rather than routinely excepted.
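A minimal exception log with expiration tracking might look like the sketch below. The fields mirror the list above; the entries and the 60-day review horizon are invented examples.

```python
from datetime import date

# Sketch of a policy exception log with expiration tracking. Entries
# and the 60-day lookahead are invented examples.
exceptions = [
    {"policy": "MFA required for staff accounts",
     "requested_by": "Food Services", "approved": date(2024, 1, 10),
     "expires": date(2024, 7, 10)},
    {"policy": "Encryption at rest for file servers",
     "requested_by": "Facilities", "approved": date(2023, 9, 1),
     "expires": date(2024, 3, 1)},
]

def expiring(exceptions, as_of, horizon_days=60):
    """Exceptions already expired or due within the horizon."""
    return [e["policy"] for e in exceptions
            if (e["expires"] - as_of).days <= horizon_days]

print(expiring(exceptions, as_of=date(2024, 6, 1)))
```

Surfacing already-expired exceptions alongside soon-to-expire ones ensures that a lapsed exception is either formally renewed or closed, rather than silently becoming a standing one.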

06

Incident Review Section

Aggregate all security incidents from the review period and categorize them by type: malware, phishing, unauthorized access, data exposure, denial of service, policy violation, and physical security. For each category, report total count, severity distribution, and trend compared to the prior year. Present this data in a table format that allows the governance committee to quickly identify which incident types are increasing and which are being effectively controlled.
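The incident table described above can be generated from two years of counts. This is a sketch with invented counts; severity distribution would add a third dimension to the same structure.

```python
# Sketch: incident counts by category with year-over-year trend, in
# the table form suggested above. Counts are illustrative.
def trend(prior, current):
    """Classify the year-over-year direction of an incident count."""
    return "up" if current > prior else "down" if current < prior else "flat"

categories = [
    # (type, prior year, current year)
    ("Phishing",            34, 41),
    ("Malware",             12,  7),
    ("Unauthorized access",  3,  3),
    ("Policy violation",    20, 14),
]

print(f"{'Type':20} {'Prior':>5} {'Curr':>5}  Trend")
for name, prior, current in categories:
    print(f"{name:20} {prior:>5} {current:>5}  {trend(prior, current)}")
```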

Evaluate incident response effectiveness using quantitative metrics. Mean time to detect measures the interval between initial compromise or policy violation and its identification by district staff or automated systems. Mean time to respond measures the interval between detection and containment. Mean time to recover measures the interval from containment to full restoration of normal operations. Track each metric by incident severity level, as critical incidents should have shorter response targets than low-severity events. Compare actual performance against the response time targets defined in the district's incident response plan and flag any categories where targets were consistently missed.
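The three intervals defined above are straightforward timestamp arithmetic once each incident record carries the four milestone times. A sketch with invented timestamps and an assumed four-hour response target for critical incidents:

```python
from datetime import datetime

# Sketch: detect/respond/recover intervals for one incident, compared
# against a severity-specific target. Timestamps and the target table
# are invented assumptions.
TARGET_HOURS = {"critical": {"respond": 4}, "low": {"respond": 72}}

def hours(start, end):
    """Elapsed hours between two timestamps."""
    return (end - start).total_seconds() / 3600

incident = {
    "severity": "critical",
    "compromised": datetime(2024, 2, 3, 8, 0),
    "detected":    datetime(2024, 2, 3, 14, 0),
    "contained":   datetime(2024, 2, 3, 17, 0),
    "recovered":   datetime(2024, 2, 4, 9, 0),
}

mttd = hours(incident["compromised"], incident["detected"])
mttr = hours(incident["detected"], incident["contained"])
mtt_recover = hours(incident["contained"], incident["recovered"])

target = TARGET_HOURS[incident["severity"]]["respond"]
status = "met" if mttr <= target else "missed"
print(f"Detect {mttd}h, respond {mttr}h (target {target}h, {status}), "
      f"recover {mtt_recover}h")
```

Averaging these per-incident intervals within each severity level yields the mean-time metrics, and comparing them against the incident response plan's targets flags consistently missed categories.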

Include a lessons learned summary drawing from post-incident reviews conducted throughout the year. Identify common root causes, recurring gaps in detection or response capabilities, and process improvements implemented as a result of incident analysis. Present a year-over-year trend analysis showing whether the district is experiencing more or fewer incidents, whether severity distribution is shifting, and whether response times are improving. For any incident that triggered board notification, parent notification, or regulatory reporting, include an anonymized summary with outcome and status of any resulting remediation actions.

07

Vendor and Third-Party Review

K-12 districts typically rely on dozens of third-party vendors that process student data, making vendor risk management a governance imperative. The annual review should report the total number of vendors in the district's technology inventory, new vendors added during the year, and vendors decommissioned. For each new vendor, confirm that a risk assessment was completed prior to contract execution and that a Data Processing Agreement is on file.

Summarize the results of vendor risk reassessments conducted during the year. Risk assessments should evaluate each vendor's security posture, data handling practices, incident response capabilities, and business continuity planning. Categorize vendors by risk tier: Tier 1 vendors process sensitive student data or provide critical infrastructure services and require annual assessment with evidence review; Tier 2 vendors have limited data access or provide non-critical services and require biennial assessment; Tier 3 vendors have no data access and require assessment at contract renewal. Report the percentage of Tier 1 vendors assessed during the year and any vendors that failed to meet minimum security requirements.

Document the status of Data Processing Agreements across the vendor portfolio. Report the percentage of data-processing vendors with current DPAs, DPAs executed during the year, and DPAs pending execution. Note any vendors operating without a DPA and the remediation timeline. If any vendor experienced a security incident that affected or could have affected district data during the year, summarize the event, the vendor's response, and any actions the district took in response. This section provides the governance committee with confidence that third-party risk is being actively managed rather than accepted by default.
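DPA coverage and Tier 1 assessment completion reduce to two percentages over the vendor inventory. A sketch with invented vendor records, treating Tier 1 and Tier 2 as the data-processing population:

```python
# Sketch: DPA coverage and Tier 1 assessment metrics for the vendor
# section. Vendor records are invented examples; the assumption that
# Tiers 1-2 constitute the data-processing population follows the
# tier definitions above.
vendors = [
    {"name": "SIS Vendor", "tier": 1, "dpa": True,  "assessed": True},
    {"name": "LMS Vendor", "tier": 1, "dpa": True,  "assessed": False},
    {"name": "Sign Maker", "tier": 3, "dpa": False, "assessed": False},
]

data_vendors = [v for v in vendors if v["tier"] in (1, 2)]
dpa_coverage = sum(v["dpa"] for v in data_vendors) / len(data_vendors)

tier1 = [v for v in vendors if v["tier"] == 1]
tier1_assessed = sum(v["assessed"] for v in tier1) / len(tier1)

print(f"DPA coverage: {dpa_coverage:.0%}, "
      f"Tier 1 assessed: {tier1_assessed:.0%}")
```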

08

Recommendations and Roadmap

Close the annual review with prioritized recommendations for the coming year. Each recommendation should be tied to a specific finding from the review, whether it is a risk register item, an incident trend, a compliance gap, or an infrastructure need. Use a prioritization framework that considers risk reduction impact, implementation complexity, resource requirements, and alignment with district strategic goals. Present the top five to eight recommendations in priority order with estimated cost, responsible party, and target completion quarter.

Budget implications should be presented clearly. For each recommendation, state whether it can be accomplished within the current budget, requires reallocation of existing funds, or requires new budget authority. Where new funding is needed, provide a cost-benefit analysis that quantifies the risk reduction achieved relative to the investment. Reference insurance premium impacts where applicable, as many security improvements directly reduce underwriter risk assessments and can offset their cost through premium reductions.

The implementation roadmap should span twelve months and align with the district's fiscal year and governance calendar. Map each recommendation to a target quarter, identify dependencies between initiatives, and note any recommendations that require board approval before proceeding. The roadmap becomes the working plan for the IT team and the governance committee's primary tracking tool for the coming year. At each quarterly governance meeting, report progress against the roadmap, noting completed items, items on track, delayed items with revised timelines, and any new recommendations arising from emerging risks or incidents.

  • Prioritize recommendations by risk reduction impact, implementation complexity, and cost
  • Tie each recommendation to a specific finding from the review
  • Include cost-benefit analysis for items requiring new budget authority
  • Reference cyber insurance premium impacts where applicable
  • Map recommendations to fiscal year quarters with dependencies identified
  • Identify items requiring board approval and align with board meeting calendar
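One way to operationalize the prioritization framework above is a weighted score over the listed factors. The weights, the 1-5 scales, and the example recommendations below are assumptions for the governance committee to tune, not a prescribed model.

```python
# Sketch of a weighted prioritization score over the factors listed
# above. Weights, 1-5 scales, and example items are assumptions.
WEIGHTS = {"risk_reduction": 0.5, "complexity": 0.2,
           "cost": 0.2, "strategic_fit": 0.1}

def priority_score(risk_reduction, complexity, cost, strategic_fit):
    """Higher is better; complexity and cost are inverted (6 - x)
    so that cheaper, simpler work scores higher."""
    return (WEIGHTS["risk_reduction"] * risk_reduction
            + WEIGHTS["complexity"] * (6 - complexity)
            + WEIGHTS["cost"] * (6 - cost)
            + WEIGHTS["strategic_fit"] * strategic_fit)

recommendations = [
    ("Segment legacy elementary networks", priority_score(5, 3, 3, 4)),
    ("Expand phishing simulations",        priority_score(3, 1, 1, 3)),
]
for name, score in sorted(recommendations, key=lambda r: -r[1]):
    print(f"{score:.1f}  {name}")
```

A transparent score keeps the five-to-eight recommendation ranking defensible when budget discussions force trade-offs between competing initiatives.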