Course Outline
Foundations of Secure Local AI
- What local and on-prem AI mean in regulated environments
- Cloud AI versus internal deployment for sensitive workloads
- Common enterprise use cases for private assistants and workflow support
- Core components of a secure local AI architecture
Ollama and Open Model Basics
- How Ollama fits into a local development stack
- Pulling, running, and managing models locally
- Choosing models based on size, quality, hardware, and license
- Matching model options to practical business tasks
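As a concrete sketch of the basics above: Ollama serves a local REST API (default `http://localhost:11434`), and models can be pulled and run either from the `ollama` CLI or over that API. The snippet below builds request bodies for the `/api/pull` and `/api/generate` endpoints; the model tag `llama3` is only an example, and the helper names are ours, not part of Ollama.

```python
import json
import urllib.request

OLLAMA = "http://localhost:11434"  # Ollama's default local endpoint

def pull_body(model: str) -> dict:
    """JSON body for POST /api/pull (downloads a model into the local store)."""
    return {"name": model, "stream": False}

def generate_body(model: str, prompt: str) -> dict:
    """JSON body for POST /api/generate (single-turn completion)."""
    return {"model": model, "prompt": prompt, "stream": False}

def post(path: str, body: dict) -> dict:
    """POST a JSON body to the local Ollama server and decode the reply."""
    req = urllib.request.Request(
        OLLAMA + path,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Requires a running Ollama server and enough disk/RAM for the model.
    post("/api/pull", pull_body("llama3"))
    reply = post("/api/generate", generate_body("llama3", "Say hello."))
    print(reply.get("response", ""))
```

The same pull/run cycle is available interactively via `ollama pull` and `ollama run` on the command line; the API form is what internal applications typically build on.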
Preparing the On-Prem Environment
- Host, workstation, and server preparation
- Installing and configuring Ollama for local inference
- Using containers and internal development tooling
- Verifying API access and basic operational readiness
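One way to verify API access, as in the last point above, is a readiness probe: Ollama's `GET /api/tags` endpoint lists locally pulled models, so a probe can fetch it and confirm both that the server answers and that the required model is present. A minimal sketch, assuming the default port; `llama3` is a placeholder tag and the function names are ours.

```python
import json
import urllib.request

def installed_models(tags_response: dict) -> list:
    """Extract model names from a GET /api/tags response body."""
    return [m["name"] for m in tags_response.get("models", [])]

def is_ready(required_model: str, base_url: str = "http://localhost:11434") -> bool:
    """True if the Ollama server answers and the required model is pulled."""
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=5) as resp:
            tags = json.loads(resp.read())
    except OSError:
        return False  # server down or unreachable
    return any(name.startswith(required_model) for name in installed_models(tags))

if __name__ == "__main__":
    print("ready" if is_ready("llama3") else "not ready")
```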
Working with Local Models Effectively
- Running prompts and shaping outputs with system instructions
- Reusing templates for consistent enterprise tasks
- Managing model versions and internal artifacts
- Basic performance tuning for CPU and GPU deployments
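The first two points above can be made concrete with a small reusable template: a fixed system instruction plus a per-task prompt, assembled into the message-list shape Ollama's `/api/chat` endpoint expects. The system text and helper names below are illustrative, not prescribed; `options.temperature` is a real Ollama request option, lowered here for more consistent enterprise output.

```python
SYSTEM = (
    "You are an internal assistant. Answer only from the provided context, "
    "and say 'I don't know' when the context is insufficient."
)  # illustrative system instruction; adapt to your own policies

def chat_messages(task_prompt: str, system: str = SYSTEM) -> list:
    """Build the message list for POST /api/chat: system first, then user."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": task_prompt},
    ]

def summarise_request(model: str, text: str) -> dict:
    """A reusable 'summarise' task built on the shared template."""
    return {
        "model": model,
        "messages": chat_messages(f"Summarise in two sentences:\n\n{text}"),
        "stream": False,
        "options": {"temperature": 0.2},  # lower temperature for repeatable output
    }
```

Keeping templates like this in version control alongside model tags is one simple way to manage the "model versions and internal artifacts" point above.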
Building Practical Agentic Workflows
- What makes a workflow agentic in a controlled setting
- Simple patterns for planning, tool use, and response loops
- Designing task-focused assistants for internal operations
- Adding human review, fallback logic, and error handling
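The planning, tool-use, and fallback points above can be sketched as a minimal loop. The model is stubbed with a scripted function here (no real inference); the `TOOL`/`FINAL` protocol, tool registry, and escalation strings are all illustrative conventions, not a standard.

```python
from typing import Callable

# Tool registry: the only named actions the assistant is allowed to take.
TOOLS = {
    "lookup_policy": lambda q: f"[policy text matching '{q}']",  # stub tool
}

def run_agent(ask_model: Callable[[str], str], task: str, max_steps: int = 3) -> str:
    """Minimal plan/act loop: the model either calls a tool ('TOOL name: arg')
    or answers ('FINAL: ...'). Anything else escalates to human review."""
    context = task
    for _ in range(max_steps):
        decision = ask_model(context)
        if decision.startswith("FINAL:"):
            return decision[len("FINAL:"):].strip()
        if decision.startswith("TOOL "):
            name, _, arg = decision[len("TOOL "):].partition(":")
            tool = TOOLS.get(name.strip())
            if tool is None:
                return "ESCALATE: unknown tool requested"  # fallback logic
            context += "\nObservation: " + tool(arg.strip())
            continue
        return "ESCALATE: unparseable model output"  # human review path
    return "ESCALATE: step budget exhausted"

def scripted_model(context: str) -> str:
    """Scripted stand-in for a local model, for demonstration only."""
    if "Observation:" in context:
        return "FINAL: Leave requests need manager approval."
    return "TOOL lookup_policy: annual leave"
```

Note that every non-conforming output routes to an `ESCALATE` result rather than being acted on, which is the essential property in a controlled setting.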
Private Retrieval Workflows
- Retrieval-augmented generation basics for internal knowledge access
- Preparing documents for chunking, indexing, and search
- Connecting a local vector store to an Ollama-based application
- Improving relevance and answer quality with better retrieval patterns
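The chunking and retrieval steps above reduce to a small pipeline. This sketch uses a toy bag-of-words similarity so it runs standalone; a real deployment would swap `embed` for a local embedding model and the list scan for a vector store, as the bullets describe. Chunk sizes and function names are illustrative.

```python
import math
from collections import Counter

def chunk(text: str, size: int = 40, overlap: int = 10) -> list:
    """Split text into overlapping word-window chunks for indexing."""
    words = text.split()
    step = size - overlap
    return [" ".join(words[i:i + size])
            for i in range(0, max(len(words) - overlap, 1), step)]

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; stands in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, chunks: list, k: int = 2) -> list:
    """Rank chunks by similarity to the query and keep the top k."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]
```

The retrieved chunks would then be injected into the prompt context before calling the model, which is the "RAG" pattern named above.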
Security, Governance, and Compliance Practices
- Data handling boundaries and privacy considerations
- Access control, logging, and audit support
- Prompt safety, output controls, and guardrails
- Governance checkpoints for regulated deployment and operation
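An output-control guardrail of the kind listed above can be as simple as a redaction pass plus a review flag. The patterns and topic list below are deliberately crude illustrations; production systems would use vetted detectors and policy-specific rules.

```python
import re

# Illustrative patterns only; real deployments need vetted detectors.
REDACTIONS = [
    (re.compile(r"\b\d{16}\b"), "[REDACTED-CARD]"),          # bare 16-digit numbers
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[REDACTED-EMAIL]"),
]

BLOCKED_TOPICS = ("password", "credential")  # crude topical guardrail

def guard_output(text: str):
    """Redact sensitive patterns and flag outputs that need human review."""
    for pattern, replacement in REDACTIONS:
        text = pattern.sub(replacement, text)
    needs_review = any(topic in text.lower() for topic in BLOCKED_TOPICS)
    return text, needs_review
```

Logging both the redacted output and the review flag gives the audit trail referred to in the access-control and logging point above.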
Enterprise Integration Patterns
- Exposing local AI capabilities through internal APIs
- Integrating assistants with internal applications and services
- Supporting chat-assistant, batch-processing, and workflow-automation use cases
- Keeping solutions inside controlled network boundaries
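One way to keep an internal AI endpoint inside controlled network boundaries, per the last point, is to gate every request on the caller's address before any model work happens. The ranges below are the standard RFC 1918 private blocks used purely as examples; substitute your organisation's actual allocations, and note the model call is stubbed.

```python
import ipaddress

# Example internal ranges; replace with your organisation's allocations.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_internal(client_ip: str) -> bool:
    """True if the caller's address falls inside an approved internal range."""
    try:
        addr = ipaddress.ip_address(client_ip)
    except ValueError:
        return False  # malformed address: deny by default
    return any(addr in net for net in ALLOWED_NETWORKS)

def handle_request(client_ip: str, prompt: str) -> dict:
    """Gate an internal AI endpoint on network boundary before doing any work."""
    if not is_internal(client_ip):
        return {"status": 403, "error": "external callers are not permitted"}
    return {"status": 200, "answer": f"(model reply to: {prompt})"}  # stubbed model call
```

In practice this check would sit behind, not instead of, network-level controls such as firewalls and private routing.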
Evaluating Local AI Solutions
- Assessing quality, reliability, and consistency
- Testing against business, policy, and safety requirements
- Comparing model options for specific enterprise tasks
- Establishing a practical improvement cycle for internal teams
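A practical improvement cycle, as in the last point, starts with a repeatable test harness: fixed prompts, required and forbidden phrases, and a pass rate that can be compared across model options. This keyword-based check is a deliberately simple stand-in for fuller evaluation; all names and cases below are illustrative.

```python
def score_case(output: str, must_contain: list, must_avoid: list) -> bool:
    """Pass iff the output mentions every required phrase and no forbidden one."""
    text = output.lower()
    return (all(p.lower() in text for p in must_contain)
            and not any(p.lower() in text for p in must_avoid))

def evaluate(ask_model, cases: list) -> float:
    """Run each test case through the model and return the pass rate."""
    passed = sum(
        score_case(ask_model(c["prompt"]), c["must_contain"], c.get("must_avoid", []))
        for c in cases
    )
    return passed / len(cases)

if __name__ == "__main__":
    cases = [{"prompt": "What is the refund window?",
              "must_contain": ["30 days"], "must_avoid": ["guarantee"]}]
    stub = lambda prompt: "Refunds are accepted within 30 days."  # stand-in model
    print(f"pass rate: {evaluate(stub, cases):.0%}")
```

Rerunning the same cases after each model or prompt change turns ad-hoc comparisons into the improvement cycle the bullet describes.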
Hands-On Implementation Lab
- Building a private assistant with Ollama and an open model
- Adding retrieval over approved internal documents
- Introducing simple agentic actions and safety controls
- Reviewing deployment, operations, and governance checkpoints
Adoption Planning and Next Steps
- Reviewing key design and deployment decisions
- Identifying common pitfalls in regulated AI projects
- Planning pilot use cases and stakeholder alignment
- Defining a roadmap for secure local AI adoption
Requirements
- Basic understanding of AI concepts and software development
- Familiarity with command-line tools, containers, or local development environments
- Basic scripting or programming experience
Audience
- Developers and technical teams building private AI solutions on internal infrastructure
- Security, compliance, and platform professionals supporting AI in regulated environments
- Technical leaders in finance, healthcare, government, and defense evaluating on-prem AI adoption
21 Hours
Custom Corporate Training
Training solutions designed exclusively for businesses.
- Customised Content: We adapt the syllabus and practical exercises to the real goals and needs of your project.
- Flexible Schedule: Dates and times adapted to your team's agenda.
- Format: Online (live), In-company (at your offices), or Hybrid.
Price per private group, online live training, starting from £4800 + VAT*
Contact us for an exact quote and to hear our latest promotions.