You Completed the DIR Training. Now What?
If you have completed the Texas DIR AI Awareness Training - or if your agency is in the process of rolling it out - you have already taken the most important first step. Your employees understand what AI is, what risks it carries, and how to use it within the boundaries of agency policy. That one hour of training satisfies a legal mandate and, more importantly, lays a foundation.
But it is only a foundation.
The DIR compliance training is designed to be a broad baseline for all Texas government employees - from the front desk clerk to the executive director. It covers essential concepts: AI accuracy, hallucination risks, data privacy, bias, and legal compliance. It is, by design, an awareness course.
What it is not designed to be is a skills course. And skills are where the real productivity gains - and the real career advantages - live.
In 2026, AI tools are no longer optional in most professional environments. They are embedded in email platforms, document editors, budget software, HR systems, and communication tools. Government employees who know how to use these tools with skill and judgment are not just more productive. They are less likely to make costly errors, more capable of serving constituents effectively, and better positioned for advancement.
This article explains the gap between compliance and competence - and what Texas government employees and their agencies can do to close it.
The Difference Between AI Awareness and AI Literacy
The Texas Legislature's intent in passing Government Code Section 2054.5193 was clear: ensure that every government employee who relies on a computer understands the basics of AI - what it is, what it can and cannot do, and what rules govern its use. The awareness training accomplishes that goal well.
AI literacy is a different concept entirely. The distinction is similar to the difference between a driver's education class and actually learning to drive. Driver's ed teaches the rules of the road. Behind the wheel is where you develop judgment, skill, and confidence.
| AI Awareness Training (DIR) | AI Literacy (Skills Training) |
|---|---|
| What AI is and how it works | How to write effective prompts for your specific tasks |
| Risks: hallucination, bias, privacy | How to evaluate AI output for accuracy and fit |
| Agency policy and authorized use | How to integrate AI tools into your daily workflow |
| Legal compliance (Texas and federal law) | How to protect data while getting value from AI |
| General principles of responsible AI use | Job-specific use cases and decision frameworks |
Both are necessary. Neither replaces the other. But organizations that stop at awareness training are leaving significant productivity and risk-reduction value on the table.
The AI Skills Gap in Texas Government
A consistent finding across public sector surveys in 2025 and 2026 is that most government employees have access to AI tools but are underusing them because they do not know how to use them well. They have heard about ChatGPT, Microsoft Copilot, or Google Gemini. Some have experimented informally. But very few have received structured training on how to apply these tools effectively to their actual work.
This creates several problems that agencies are beginning to recognize:
Underuse by cautious employees. Employees who completed awareness training know the risks. Without the skills to manage those risks, some default to not using AI at all - missing productivity gains their peers in private industry are already capturing.
Risky use by overconfident employees. On the other end, some employees use AI tools without understanding their limitations. They submit AI-generated content without proper verification, upload sensitive data to unapproved tools, or rely on AI outputs without applying professional judgment.
Inconsistent use across departments. Without skills training, AI use in a government agency is ad hoc. One analyst in the budget office has learned to use AI effectively for data summarization. Two doors down, another analyst has no idea how to do the same thing. This creates efficiency disparities and knowledge silos.
Skills training addresses all three problems. It turns aware employees into capable ones.
Five AI Skills Every Government Employee Needs After Compliance Training
After completing the DIR awareness requirement, these are the practical skills that produce the greatest impact for government employees across all departments and roles.
1. Effective Prompting
Most employees' first frustration with AI tools is that the output is generic, vague, or not quite right. The problem is usually the prompt - the instruction given to the AI. Effective prompting is a skill. It involves being specific about the task, providing relevant context, specifying the desired format or tone, and knowing how to refine the prompt when the first response misses the mark.
A government communications officer who knows how to write a specific, contextual prompt will get a usable first draft of a press release. One who does not will get a generic template that requires more editing than writing from scratch would have taken.
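The elements of an effective prompt - task, context, audience, format, tone - can be made into a repeatable habit. Below is a minimal sketch of that structure in Python; the field names and the press-release example are illustrative, not part of any DIR guidance or tied to any specific AI tool.

```python
# A sketch of a structured prompt builder. The five fields are
# illustrative assumptions, not an official template.

def build_prompt(task, context, audience, output_format, tone):
    """Assemble a prompt that states the task, supplies context,
    and pins down format and tone, rather than asking vaguely."""
    return (
        f"Task: {task}\n"
        f"Context: {context}\n"
        f"Audience: {audience}\n"
        f"Format: {output_format}\n"
        f"Tone: {tone}\n"
        "If any required information is missing, ask before drafting."
    )

# Contrast a vague request with a specific one for the press-release case:
vague = "Write a press release about our new program."
specific = build_prompt(
    task="Draft a one-page press release announcing a new online permit portal",
    context="Launches March 1; replaces in-person applications; no fee change",
    audience="Local press and county residents",
    output_format="Headline, two short paragraphs, and a contact line",
    tone="Plain language, eighth-grade reading level, no jargon",
)
print(specific)
```

The point is not the code itself but the discipline it encodes: every field the template forces you to fill in is a detail a generic prompt would have left for the AI to guess.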
2. Output Verification and Quality Control
The DIR training teaches employees that AI can hallucinate - that it can produce confident-sounding content that is factually wrong. The skill is knowing how to systematically verify AI output. This means knowing which types of content require verification (facts, statistics, legal references, dates, names), which sources to use for verification, and how to document the verification process when producing official records.
This is not the same as being suspicious of everything AI produces. It is professional judgment about risk - the same judgment a good editor applies to any draft document.
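A first verification pass can even be partially mechanized: scan an AI draft for the categories of claims that always need checking against an authoritative source. The sketch below uses a few illustrative regular-expression heuristics; it is an assumption about how such a triage step might look, not a complete verification tool, and a human reviewer still does the actual checking.

```python
import re

# A minimal "what needs checking" pass over AI-drafted text.
# The patterns are rough illustrative heuristics only.

CHECKS = {
    "statistic": re.compile(r"\b\d+(\.\d+)?\s*(%|percent)"),
    "dollar amount": re.compile(r"\$\s?\d[\d,]*"),
    "date": re.compile(r"\b(19|20)\d{2}\b"),
    "legal reference": re.compile(r"\bSection\s+\d+[\w.]*", re.IGNORECASE),
}

def flag_for_verification(text: str) -> list[str]:
    """Return the categories of claims in the draft that a reviewer
    should verify against an authoritative source before release."""
    return [name for name, pattern in CHECKS.items() if pattern.search(text)]

draft = ("The program served 12,400 residents in 2024, a 15% increase, "
         "as authorized under Section 2054.5193.")
print(flag_for_verification(draft))
```

In practice the flagged categories would feed a review checklist in the document's routing slip, so that verification is documented rather than assumed.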
3. Data Classification and Safe Use
One of the most common compliance errors government employees make with AI tools is submitting data to AI platforms that have not been approved for that data classification level. Knowing how to classify information - public, sensitive, confidential, restricted - and knowing which AI tools are approved for which classification levels is a critical operational skill.
A procurement officer who can correctly classify contract data and route it to the appropriate AI tool avoids an incident. One who cannot creates one.
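The classify-then-route decision can be captured as a simple lookup that fails closed. The sketch below is a hypothetical illustration: the four levels mirror the ones named above, but the tool names and the approval table are placeholders an agency would replace with its own approved-tools list from IT.

```python
# A minimal data-handling check. Tool names and the approval table
# are hypothetical; an agency would substitute its own approved list.

APPROVED_TOOLS = {
    "public":       {"copilot-gov", "internal-llm"},
    "sensitive":    {"internal-llm"},
    "confidential": set(),   # no AI tools approved at this level
    "restricted":   set(),   # never leaves agency systems
}

def may_use_ai(classification: str, tool: str) -> bool:
    """Return True only if the tool is approved for this data level.
    Unknown classifications fail closed (treated as not approved)."""
    return tool in APPROVED_TOOLS.get(classification.lower(), set())

print(may_use_ai("public", "copilot-gov"))        # approved
print(may_use_ai("confidential", "copilot-gov"))  # blocked
```

The design choice worth noting is the fail-closed default: if an employee cannot classify the data, the answer is no until someone who can classify it says otherwise.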
4. Workflow Integration
AI tools add the most value when they are integrated into specific workflows rather than used on an ad hoc basis. This skill involves identifying which repetitive or time-consuming tasks in your role are good candidates for AI assistance, building consistent processes for using AI within those tasks, and evaluating whether AI is actually saving time and improving quality or just adding a step.
The employees who get the most out of AI tools are not the ones who use them most often. They are the ones who use them most strategically.
5. Documenting and Disclosing AI Use
Government accountability standards require transparency about how decisions and documents are produced. As AI becomes a standard part of the workflow, employees need to know when and how to disclose AI use - in reports, in public communications, in procurement documents, and in records. This is both an ethical skill and an emerging compliance requirement at the federal and state levels.
Developing consistent internal practices for documenting AI use before it becomes a formal requirement puts agencies ahead of the compliance curve.
High-Impact Roles: Where Job-Specific AI Training Pays Off Most
While the five skills above apply across all government roles, certain functions have particularly high potential for AI-driven productivity gains - and particularly significant risks if AI is used without proper skills training. Here is a role-by-role breakdown of where the opportunity and risk are greatest.
Human Resources
HR professionals in government agencies are using AI for job posting drafting, application screening assistance, policy document updates, and employee communication. The risks here are significant: AI bias in hiring processes has been well-documented, and HR data is among the most sensitive in any organization. HR teams that receive skills training specific to their function learn how to use AI to improve consistency and efficiency while maintaining human oversight and avoiding discriminatory outcomes.
Accounting and Finance
Finance staff are using AI for budget narrative drafting, grant report summarization, audit documentation, and financial analysis. The compliance stakes in government finance are extremely high - incorrect figures in public budget documents carry legal and reputational consequences. Finance-specific AI training focuses on verification protocols, appropriate use of AI for analytical tasks versus transactional ones, and documentation standards for AI-assisted financial work.
IT and Cybersecurity
IT teams face a dual challenge: they need to understand AI deeply enough to manage it as part of agency infrastructure, and they need to understand how malicious actors are using AI to develop new attack vectors. AI literacy for IT professionals goes well beyond general use - it includes threat modeling, understanding AI system vulnerabilities, evaluating AI vendor security postures, and building internal AI governance processes.
Procurement and Contracting
Procurement officers are using AI to analyze vendor proposals, review contract language, research comparable pricing, and draft solicitation documents. Given that procurement decisions involve public funds and are subject to transparency requirements, the ability to use AI tools skillfully while maintaining proper documentation and human oversight is essential. Procurement-specific training addresses contract data classification, vendor evaluation frameworks, and public records compliance for AI-assisted procurement.
Communications and Public Information
Public information officers, communications directors, and social media managers are among the heaviest users of AI tools in government. They use AI for press release drafting, social media content, constituent response templates, translation assistance, and public report summaries. Communications-specific AI training focuses on accuracy verification for public-facing content, tone and accessibility standards, disclosure requirements, and crisis communication protocols where AI should and should not be used.
How Agencies Are Building Internal AI Champions
The most effective approach emerging across state and local governments is the AI champion model: identifying one or two employees per department who receive deeper AI training, then empowering them to serve as internal resources for their colleagues.
This model works because AI tool adoption is inherently contextual. The best person to explain how AI can help a property tax assessment workflow is someone who understands that workflow - not an outside consultant. When agencies invest in role-specific AI training for high-potential employees and give them time to share that knowledge internally, the return on training investment multiplies.
The AI champion model also creates a feedback loop. Champions identify where AI is working and where it is not. They surface compliance issues before they become incidents. They help update internal policy as tools evolve. Over time, they become a significant organizational asset.
Several forward-thinking Texas agencies have already begun building AI champion programs alongside their DIR compliance training rollouts. The combination - broad awareness for all staff, deep skills for designated champions - is emerging as a best practice for government AI governance.
What AI Literacy Looks Like in Practice
It is worth being concrete about what AI literacy training actually teaches - and what it does not. Good AI literacy training is not about a specific tool. Tools change. ChatGPT, Copilot, Gemini, Claude - the landscape shifts constantly. Training tied to a single tool becomes obsolete quickly.
Effective AI literacy training builds transferable skills and frameworks that apply regardless of which tools are in use. These include:
- Prompt construction principles - how to specify context, format, tone, and constraints effectively
- Verification frameworks - systematic approaches to checking AI output for accuracy and appropriateness
- Data handling decision trees - a consistent process for deciding what data can and cannot be used with AI tools
- Output evaluation rubrics - criteria for assessing whether AI output meets professional standards before use
- Documentation templates - practical tools for recording AI use in official work products
- Scenario-based judgment exercises - realistic practice cases that build decision-making skill, not just knowledge recall
The best indicator that AI literacy training has worked is not a quiz score. It is behavioral change: employees who ask better questions of AI tools, verify outputs with professional judgment, and know when not to use AI at all.
What Texas Government Employees and Agencies Should Do Next
If your agency has completed the DIR compliance training - or is in the process of deploying it - here are concrete next steps to build on that foundation.
For individual employees: Identify your role and consider where AI tools could add value to your day-to-day work. Look for role-specific AI literacy training that teaches practical skills, not just concepts. Start applying what you learn in low-stakes tasks before moving to mission-critical workflows.
For HR directors and training coordinators: Survey your workforce to identify which departments have the highest AI tool usage and the lowest skills confidence. Prioritize role-specific training for those areas. Consider an AI champion program as a cost-effective way to scale skills development without training every employee individually.
For IT and security teams: Develop an approved AI tools list with corresponding data classification guidelines before widespread skills training begins. Employees who learn to use AI effectively will want to use it - and they need to know which tools are safe.
For agency leadership: Frame AI skills development as a workforce investment, not a compliance cost. The agencies that are investing in AI literacy now are building a workforce advantage that will compound over time. The cost of not developing these skills - in errors, in inefficiency, in risk - is higher than the cost of the training.
Compliance Is the Floor, Not the Ceiling
The Texas DIR AI awareness training requirement is one of the most forward-thinking workforce development mandates in state government. It reflects a genuine understanding that AI is a transformative force in public sector work - and that employees need to understand it to use it responsibly.
But the statute sets a floor. The most effective agencies in Texas - the ones that are delivering better constituent services, making more accurate decisions, and retaining skilled staff - will be the ones that treat the DIR requirement as a starting point and build genuine AI literacy on top of it.
The gap between awareness and competence is real, and it is closable. It requires intentional investment in role-specific training, practical skills development, and an organizational culture that treats AI as a tool that requires skill to use well - not just a risk to be managed.
Your workforce has already been introduced to AI. Now teach them to use it.