IoT Training Courses

IoT Course Outlines

Each outline below gives the course code, name, duration and an overview.
apakar: Apache Karaf (21 hours)

Apache Karaf training is for developers and system administrators who need to discover and understand how to use Apache Karaf as an operating environment in the best possible way. Developers will learn best practices for designing applications that integrate fully into the system, while administrators gain operational experience.

Outline:
- Installing Apache Karaf: prerequisites, obtaining Apache Karaf
- Commanding the runtime: command review, common commands, remote console access, the Apache Karaf client, custom command creation (Karaf command archetype, Karaf custom command project), the JMX console
- Configuration and tuning: startup properties, logging properties (file logging, console logging), system properties, configuring Karaf, environment variables, configuring hot deployment, console configuration commands, the web console, failover configuration, startup properties
- Provisioning: Apache Maven repositories, the Karaf system repository, Apache Karaf features
- Deploying applications: deploying bundles, building a bundle, deploying a bundle using Maven, the file handler, HTTP or hot deployment, deploying a features descriptor, deploying non-OSGi JARs (wrap), deploying a WAR, deploying Spring/Blueprint, creating and deploying a Karaf Archive
- Deploying production-grade Apache Karaf: offline repositories, improving application logging, installing Karaf as a service, master-slave failover, child instances, basic security configuration, managing roles, password encryption, locking down JMX access
- Apache Karaf Cellar: node discovery, cluster groups, cloud discovery
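The "custom command creation" topic above is easier to picture with a concrete illustration. The following is a minimal sketch of what a custom console command might look like with the annotation-based shell API used by Karaf 4.x; the scope, command name and the HelloCommand class are made up for this example and are not part of the course material:

```java
import org.apache.karaf.shell.api.action.Action;
import org.apache.karaf.shell.api.action.Argument;
import org.apache.karaf.shell.api.action.Command;
import org.apache.karaf.shell.api.action.lifecycle.Service;

// Registers a "demo:hello" command with the Karaf console once the bundle is deployed.
@Service
@Command(scope = "demo", name = "hello", description = "Prints a greeting to the console")
public class HelloCommand implements Action {

    // Optional positional argument, e.g. "demo:hello Karaf".
    @Argument(index = 0, name = "name", description = "Who to greet", required = false)
    String name = "world";

    @Override
    public Object execute() throws Exception {
        System.out.println("Hello, " + name + "!");
        return null;
    }
}
```

Packaged as an OSGi bundle (for example from the Karaf command archetype covered in the course) and dropped into the deploy folder, the command becomes available in the console. Older Karaf 2.x/3.x releases wire commands differently (via Blueprint), which is why this should be read as a sketch rather than a drop-in recipe.
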
bdbiga: Big Data Business Intelligence for Govt. Agencies (35 hours)

Advances in technology and the increasing amount of information are transforming how business is conducted in many industries, including government. Government data generation and digital archiving rates are on the rise due to the rapid growth of mobile devices and applications, smart sensors and devices, cloud computing solutions, and citizen-facing portals. As digital information expands and becomes more complex, information management, processing, storage, security, and disposition become more complex as well. New capture, search, discovery, and analysis tools are helping organizations gain insights from their unstructured data. The government market is at a tipping point, realizing that information is a strategic asset, and government needs to protect, leverage, and analyze both structured and unstructured information to better serve and meet mission requirements. As government leaders strive to evolve data-driven organizations to successfully accomplish their missions, they are laying the groundwork to correlate dependencies across events, people, processes, and information.

High-value government solutions will be created from a mashup of the most disruptive technologies: mobile devices and applications, cloud services, social business technologies and networking, and Big Data and analytics. IDC predicts that by 2020, the IT industry will reach $5 trillion, approximately $1.7 trillion larger than today, and that 80% of the industry's growth will be driven by these 3rd Platform technologies. In the long term, these technologies will be key tools for dealing with the complexity of increased digital information.

Big Data is one of the intelligent industry solutions and allows government to make better decisions by taking action based on patterns revealed by analyzing large volumes of data, related and unrelated, structured and unstructured. But accomplishing these feats takes far more than simply accumulating massive quantities of data. “Making sense of these volumes of Big Data requires cutting-edge tools and technologies that can analyze and extract useful knowledge from vast and diverse streams of information,” Tom Kalil and Fen Zhao of the White House Office of Science and Technology Policy wrote in a post on the OSTP Blog. The White House took a step toward helping agencies find these technologies when it established the National Big Data Research and Development Initiative in 2012. The initiative included more than $200 million to make the most of the explosion of Big Data and the tools needed to analyze it.

The challenges that Big Data poses are nearly as daunting as its promise is encouraging. Storing data efficiently is one of these challenges. As always, budgets are tight, so agencies must minimize the per-megabyte price of storage and keep the data within easy access so that users can get it when they want it and how they need it. Backing up massive quantities of data heightens the challenge. Analyzing the data effectively is another major challenge. Many agencies employ commercial tools that enable them to sift through the mountains of data, spotting trends that can help them operate more efficiently. (A recent study by MeriTalk found that federal IT executives think Big Data could help agencies save more than $500 billion while also fulfilling mission objectives.) Custom-developed Big Data tools are also allowing agencies to address the need to analyze their data. For example, the Oak Ridge National Laboratory’s Computational Data Analytics Group has made its Piranha data analytics system available to other agencies. The system has helped medical researchers find a link that can alert doctors to aortic aneurysms before they strike. It is also used for more mundane tasks, such as sifting through résumés to connect job candidates with hiring managers.

Each session is 2 hours.

Day 1, Session 1: Business overview of why Big Data business intelligence matters in Govt. agencies
- Case studies from NIH and DoE
- Big Data adoption rates in Govt. agencies and how they are aligning their future operations around Big Data predictive analytics
- Broad-scale application areas in DoD, NSA, IRS, USDA, etc.
- Interfacing Big Data with legacy data
- Basic understanding of the enabling technologies in predictive analytics
- Data integration and dashboard visualization
- Fraud management
- Business rule and fraud detection generation
- Threat detection and profiling
- Cost-benefit analysis for a Big Data implementation

Day 1, Session 2: Introduction to Big Data (part 1)
- Main characteristics of Big Data: volume, variety, velocity and veracity
- MPP architecture for volume
- Data warehouses: static schema, slowly evolving datasets; MPP databases like Greenplum, Exadata, Teradata, Netezza, Vertica, etc.
- Hadoop-based solutions: no conditions on the structure of the dataset; typical pattern: HDFS, MapReduce (crunch), retrieve from HDFS; batch oriented, suited for analytical/non-interactive workloads
- Velocity: CEP streaming data; typical choices are CEP products (e.g. Infostreams, Apama, MarkLogic, etc.); less production-ready: Storm/S4
- NoSQL databases (columnar and key-value): best suited as an analytical adjunct to a data warehouse/database

Day 1, Session 3: Introduction to Big Data (part 2): NoSQL solutions
- KV store: Keyspace, Flare, SchemaFree, RAMCloud, Oracle NoSQL Database (OnDB)
- KV store: Dynamo, Voldemort, Dynomite, SubRecord, Mo8onDb, DovetailDB
- KV store (hierarchical): GT.m, Caché
- KV store (ordered): TokyoTyrant, Lightcloud, NMDB, Luxio, MemcacheDB, Actord
- KV cache: Memcached, Repcached, Coherence, Infinispan, eXtremeScale, JBossCache, Velocity, Terracotta
- Tuple store: Gigaspaces, Coord, Apache River
- Object database: ZopeDB, db4o, Shoal
- Document store: CouchDB, Cloudant, Couchbase, MongoDB, Jackrabbit, XML databases, ThruDB, CloudKit, Persevere, Riak (Basho), Scalaris
- Wide columnar store: BigTable, HBase, Apache Cassandra, Hypertable, KAI, OpenNeptune, Qbase, KDI
- Varieties of data: introduction to data cleaning issues in Big Data
- RDBMS: static structure/schema, does not promote an agile, exploratory environment
- NoSQL: semi-structured, with enough structure to store data without an exact schema before storing it
- Data cleaning issues

Day 1, Session 4: Introduction to Big Data (part 3): Hadoop
- When to select Hadoop?
- Structured data: enterprise data warehouses/databases can store massive data (at a cost) but impose structure (not good for active exploration)
- Semi-structured data: hard to handle with traditional solutions (DW/DB); warehousing the data is a huge effort and remains static even after implementation
- For variety and volume of data crunched on commodity hardware: Hadoop; commodity hardware is needed to create a Hadoop cluster
- Introduction to MapReduce/HDFS: MapReduce distributes computing over multiple servers, while HDFS makes the data available locally to the computing process (with redundancy)
- Data can be unstructured/schema-less (unlike an RDBMS); it is the developer's responsibility to make sense of the data
- Programming MapReduce means working with Java (pros/cons) and manually loading data into HDFS (see the word-count sketch after this outline)

Day 2, Session 1: Big Data ecosystem for building a Big Data ETL: the universe of Big Data tools, and which one to use when
- Hadoop vs. other NoSQL solutions
- For interactive, random access to data: HBase (a column-oriented database) on top of Hadoop; random access to data but with restrictions imposed (max ~1 PB); not good for ad-hoc analytics, good for logging, counting and time series
- Sqoop: import from databases into Hive or HDFS (JDBC/ODBC access)
- Flume: stream data (e.g. log data) into HDFS

Day 2, Session 2: Big Data management system
- Moving parts, compute nodes that start and fail: ZooKeeper for configuration, coordination and naming services
- Complex pipelines and workflows: Oozie to manage workflows, dependencies and daisy-chaining
- Deployment, configuration, cluster management, upgrades, etc. (sysadmin): Ambari
- In the cloud: Whirr

Day 2, Session 3: Predictive analytics in business intelligence (part 1): fundamental techniques and machine-learning-based BI
- Introduction to machine learning
- Learning classification techniques
- Bayesian prediction: preparing a training file
- Support Vector Machine
- KNN p-Tree algebra and vertical mining
- Neural networks
- The Big Data large-variable problem: random forest (RF)
- The Big Data automation problem: multi-model ensemble RF; automation through Soft10-M
- Text analytic tool: Treeminer
- Agile learning
- Agent-based learning
- Distributed learning
- Introduction to open source tools for predictive analytics: R, RapidMiner, Mahout

Day 2, Session 4: Predictive analytics ecosystem (part 2): common predictive analytic problems in Govt. agencies
- Insight analytics
- Visualization analytics
- Structured predictive analytics
- Unstructured predictive analytics
- Threat/fraudster/vendor profiling
- Recommendation engine
- Pattern detection
- Rule/scenario discovery: failure, fraud, optimization
- Root cause discovery
- Sentiment analysis
- CRM analytics
- Network analytics
- Text analytics
- Technology-assisted review
- Fraud analytics
- Real-time analytics

Day 3, Session 1: Real-time and scalable analytics over Hadoop
- Why common analytic algorithms fail in Hadoop/HDFS
- Apache Hama for bulk synchronous distributed computing
- Apache Spark for cluster computing and real-time analytics
- CMU Graphics Lab2: a graph-based asynchronous approach to distributed computing
- The KNN p-Algebra-based approach from Treeminer for reduced hardware cost of operation

Day 3, Session 2: Tools for eDiscovery and forensics
- eDiscovery over Big Data vs. legacy data: a comparison of cost and performance
- Predictive coding and technology-assisted review (TAR)
- Live demo of a TAR product (vMiner) to understand how TAR works for faster discovery
- Faster indexing through HDFS: velocity of data
- NLP (natural language processing): various techniques and open source products
- eDiscovery in foreign languages: technology for foreign-language processing

Day 3, Session 3: Big Data BI for cyber security: a full 360-degree view, from speedy data collection to threat identification
- Understanding the basics of security analytics: attack surface, security misconfiguration, host defenses
- Network infrastructure / large data pipe / response ETL for real-time analytics
- Prescriptive vs. predictive: fixed rule-based vs. auto-discovery of threat rules from metadata

Day 3, Session 4: Big Data in USDA: applications in agriculture
- Introduction to IoT (Internet of Things) for agriculture: sensor-based Big Data and control
- Introduction to satellite imaging and its application in agriculture
- Integrating sensor and image data for soil fertility, cultivation recommendations and forecasting
- Agriculture insurance and Big Data
- Crop loss forecasting

Day 4, Session 1: Fraud prevention BI from Big Data in Govt.: fraud analytics
- Basic classification of fraud analytics: rule-based vs. predictive analytics
- Supervised vs. unsupervised machine learning for fraud pattern detection
- Vendor fraud and overcharging on projects
- Medicare and Medicaid fraud: fraud detection techniques for claim processing
- Travel reimbursement fraud
- IRS refund fraud
- Case studies and a live demo will be given wherever data is available

Day 4, Session 2: Social media analytics: intelligence gathering and analysis
- Big Data ETL API for extracting social media data
- Text, image, metadata and video
- Sentiment analysis from social media feeds
- Contextual and non-contextual filtering of social media feeds
- Social media dashboards to integrate diverse social media
- Automated profiling of social media profiles
- A live demo of each analytic will be given using the Treeminer tool

Day 4, Session 3: Big Data analytics in image processing and video feeds
- Image storage techniques in Big Data: storage solutions for data exceeding petabytes
- LTFS and LTO
- GPFS-LTFS (a layered storage solution for big image data)
- Fundamentals of image analytics
- Object recognition
- Image segmentation
- Motion tracking
- 3-D image reconstruction

Day 4, Session 4: Big Data applications in NIH: emerging areas of bio-informatics
- Meta-genomics and Big Data mining issues
- Big Data predictive analytics for pharmacogenomics, metabolomics and proteomics
- Big Data in the downstream genomics process
- Application of Big Data predictive analytics in public health
- Big Data dashboards for quick access and display of diverse data: integration of existing application platforms with a Big Data dashboard, Big Data management, a case study of Big Data dashboards (Tableau and Pentaho), using Big Data apps to push location-based services in Govt., and tracking systems and management

Day 5, Session 1: How to justify a Big Data BI implementation within an organization
- Defining the ROI of a Big Data implementation
- Case studies of saving analyst time on data collection and preparation: gains in productivity
- Case studies of revenue gain from saving licensed database costs
- Revenue gain from location-based services
- Savings from fraud prevention
- An integrated spreadsheet approach to calculating approximate expense vs. revenue gain/savings from a Big Data implementation
Day 5, Session 2: Step-by-step procedure for replacing a legacy data system with a Big Data system
- Understanding a practical Big Data migration roadmap
- What important information is needed before architecting a Big Data implementation
- The different ways of calculating the volume, velocity, variety and veracity of data
- How to estimate data growth
- Case studies

Day 5, Session 4: Review of Big Data vendors and their products; Q/A session
- Accenture, APTEAN (formerly CDC Software), Cisco Systems, Cloudera, Dell, EMC, GoodData Corporation, Guavus, Hitachi Data Systems, Hortonworks, HP, IBM, Informatica, Intel, Jaspersoft, Microsoft, MongoDB (formerly 10gen), Mu Sigma, NetApp, Opera Solutions, Oracle, Pentaho, Platfora, Qliktech, Quantum, Rackspace, Revolution Analytics, Salesforce, SAP, SAS Institute, Sisense, Software AG/Terracotta, Soft10 Automation, Splunk, Sqrrl, Supermicro, Tableau Software, Teradata, Think Big Analytics, Tidemark Systems, Treeminer, VMware (part of EMC)
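Day 1, Session 4 above describes MapReduce as distributing computation over multiple servers while HDFS keeps the data local to the compute nodes, programmed in Java. As a concrete anchor for that discussion, here is the classic word-count job, a minimal sketch against the standard org.apache.hadoop.mapreduce API (input and output HDFS paths are passed on the command line; the class name is illustrative):

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: runs where the HDFS blocks live, emits (word, 1) for every token in its split.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
      for (String token : value.toString().split("\\s+")) {
        if (!token.isEmpty()) {
          word.set(token);
          context.write(word, ONE);
        }
      }
    }
  }

  // Reducer: receives all counts for one word and sums them.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable v : values) {
        sum += v.get();
      }
      context.write(key, new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Submitted with "hadoop jar", the mappers run next to the data and the reducers aggregate per-word counts, which illustrates the crunch-in-place, retrieve-from-HDFS pattern the outline refers to.
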
iotemi: IoT (Internet of Things) for Entrepreneurs, Managers and Investors (21 hours)

Estimates of the market value of the Internet of Things (IoT) are massive, since by definition the IoT is an integrated and diffused layer of devices, sensors, and computing power that overlays entire consumer, business-to-business, and government industries. The IoT will account for an increasingly huge number of connections: 1.9 billion devices today, and 9 billion by 2018. That year, it will be roughly equal to the number of smartphones, smart TVs, tablets, wearable computers, and PCs combined. In the consumer space, many products and services have already crossed over into the IoT, including kitchen and home appliances, parking, RFID, lighting and heating products, and a number of applications in the Industrial Internet.

However, the underlying technologies of the IoT are nothing new, as M2M communication has existed since the birth of the Internet. What has changed in the last couple of years is the emergence of a number of inexpensive wireless technologies, coupled with the overwhelming adoption of smartphones and tablets in every home. The explosive growth of mobile devices has led to the present demand for IoT. Due to the unbounded opportunities in the IoT business, a large number of small and medium-sized entrepreneurs have jumped on the bandwagon of the IoT gold rush. With the emergence of open source electronics and IoT platforms, the cost of developing an IoT system and then managing its sizable production is increasingly affordable. Existing electronic product owners are experiencing pressure to integrate their devices with the Internet or a mobile app.

This training is intended as a technology and business review of an emerging industry, so that IoT enthusiasts and entrepreneurs can grasp the basics of IoT technology and business.

Course objectives

The main objective of the course is to introduce emerging technological options, platforms and case studies of IoT implementation in home and city automation (smart homes and cities), the Industrial Internet, healthcare, Govt., mobile cellular and other areas. It covers:
- A basic introduction to all the elements of IoT: mechanical, electronics/sensor platform, wireless and wireline protocols, mobile-to-electronics integration, mobile-to-enterprise integration, data analytics and the total control plane
- M2M wireless protocols for IoT (WiFi, Zigbee/Z-Wave, Bluetooth, ANT+): when and where to use which one?
- Mobile/desktop/web apps for registration, data acquisition and control; available M2M data acquisition platforms for IoT: Xively, Omega, NovoTech, etc.
- Security issues and security solutions for IoT
- Open source and commercial electronics platforms for IoT: Raspberry Pi, Arduino, ARM mbed LPC, etc.
- Open source and commercial enterprise cloud platforms for IoT: Ayla, iO Bridge, Libelium, Axeda, Cisco fog cloud
- Studies of the business and technology of some common IoT devices: home automation, smoke alarms, vehicles, military, home health, etc.

Target audience
- Investors and IoT entrepreneurs
- Managers and engineers whose company is venturing into the IoT space
- Business analysts and investors

Pre-requisites
- Basic knowledge of business operations, devices, electronics systems and data systems
- Basic understanding of software and systems
- Basic understanding of statistics (at Excel level)

Outline:

1. Day 1, Session 1: Business overview of why IoT is so important
- Case studies from Nest, CISCO and top industries
- IoT adoption rates in North America and how companies are aligning their future business models and operations around IoT
- Broad-scale application areas: smart house and smart city, Industrial Internet, smart cars, wearables, home healthcare
- Business rule generation for IoT
- The 3-layered architecture of Big Data: physical (sensors), communication, and data intelligence

2. Day 1, Session 2: Introduction to IoT: all about sensors (electronics)
- Basic function and architecture of a sensor: sensor body, sensor mechanism, sensor calibration, sensor maintenance, cost and pricing structure, legacy and modern sensor networks; all the basics about sensors
- Development of sensor electronics: IoT vs. legacy, and open source vs. traditional PCB design styles
- Development of sensor communication protocols, from their history to the modern day: legacy protocols like Modbus, relay and HART through to modern-day Zigbee, Z-Wave, X10, Bluetooth, ANT, etc.
- Business drivers for sensor deployment: FDA/EPA regulation, fraud/tampering detection, supervision, quality control and process management
- Different kinds of calibration techniques (manual, automated, in-field, primary and secondary calibration) and their implications for IoT
- Powering options for sensors: battery, solar, Witricity, mobile and PoE
- Hands-on training with single silicon and other sensors: temperature, pressure, vibration, magnetic field, power factor, etc.

3. Day 1, Session 3: Fundamentals of M2M communication: sensor networks and wireless protocols
- What is a sensor network? What is an ad-hoc network?
- Wireless vs. wireline networks
- WiFi (the 802.11 family, N to S): application of the standards and common vendors
- Zigbee and Z-Wave: the advantage of low-power mesh networking; long-distance Zigbee; introduction to the different Zigbee chips
- Bluetooth/BLE: low power vs. high power, speed of detection, BLE classes; introduction to Bluetooth vendors and a review of them
- Creating networks with wireless protocols, such as a piconet with BLE
- Protocol stacks and packet structures for BLE and Zigbee
- Other long-distance RF communication links
- LOS vs. NLOS links
- Capacity and throughput calculation
- Application issues in wireless protocols: power consumption, reliability, PER, QoS, LOS
- Hands-on training with sensor networks: a BLE-based piconet, a Zigbee network with master/slave communication, and data hubs (microcontroller- and single-board-computer-based data hubs, e.g. BeagleBone)

4. Day 1, Session 4: Review of electronics platforms, production and cost projection
- PCB vs. FPGA vs. ASIC design: how to make the decision
- Prototyping electronics vs. production electronics
- QA certification for IoT (CE/CSA/UL/IEC/RoHS/IP65): what they are and when they are needed
- A basic introduction to multi-layer PCB design and its workflow
- Electronics reliability: the basic concepts of FIT and early mortality rate
- Environmental and reliability testing: basic concepts
- Basic open source platforms: Arduino, Raspberry Pi, BeagleBone; when are they needed? RedBack, Diamond Back

5. Day 2, Session 1: Conceiving a new IoT product: the product requirement document for IoT
- State of the present art and a review of existing technology in the marketplace
- Suggestions for new features and technologies based on market analysis and patent issues
- Detailed technical specs for the new product: system, software, hardware, mechanical, installation, etc.
- Packaging and documentation requirements
- Servicing and customer support requirements
- High-level design (HLD) for understanding the product concept
- Release plan for phase-wise introduction of the new features
- Skill set for the development team and a proposed project plan: cost and duration
- Target manufacturing price

6. Day 2, Session 2: Introduction to mobile app platforms for IoT
- Protocol stack of a mobile app for IoT
- Mobile-to-server integration: what are the factors to look out for?
- What intelligent layers can be introduced at the mobile app level?
- iBeacon in iOS
- Windows Azure
- Linkafy mobile platform for IoT
- Axeda
- Xively

7. Day 2, Session 3: Machine learning for intelligent IoT
- Introduction to machine learning
- Learning classification techniques
- Bayesian prediction: preparing a training file
- Support Vector Machine
- Image and video analytics for IoT
- Fraud and alert analytics through IoT
- Biometric ID integration with IoT
- Real-time analytics / stream analytics
- Scalability issues of IoT and machine learning
- Architectural implementations of machine learning for IoT

8. Day 2, Session 4: Analytic engine for IoT
- Insight analytics
- Visualization analytics
- Structured predictive analytics
- Unstructured predictive analytics
- Recommendation engine
- Pattern detection
- Rule/scenario discovery: failure, fraud, optimization
- Root cause discovery

9. Day 3, Session 1: Security in IoT implementation
- Why security is absolutely essential for IoT
- Mechanisms of security breaches in the IoT layers
- Privacy-enhancing technologies
- Fundamentals of network security
- Encryption and cryptography implementation for IoT data (see the sketch after this outline)
- Security standards for the available platforms
- European legislation for security in IoT platforms
- Secure booting
- Device authentication
- Firewalling and IPS
- Updates and patches

10. Day 3, Session 2: Database implementation for IoT: cloud-based IoT platforms
- SQL vs. NoSQL: which one is good for your IoT application?
- Open source vs. licensed databases
- Available M2M cloud platforms: Axeda, Xively, Omega, NovoTech, Ayla, Libelium, CISCO M2M platform, AT&T M2M platform, Google M2M platform

11. Day 3, Session 3: A few common IoT systems
- Home automation
- Energy optimization in the home
- Automotive OBD
- IoT lock
- Smart smoke alarm
- BAC (blood alcohol content) monitoring for drug abusers under probation
- Pet cam for pet lovers
- Wearable IoT
- Mobile parking ticketing system
- Indoor location tracking in retail stores
- Home health care
- Smart sports watch

12. Day 3, Session 4: Big Data for IoT
- The 4 Vs: volume, velocity, variety and veracity of Big Data
- Why Big Data is important for IoT
- Big Data vs. legacy data in IoT
- Hadoop for IoT: when and why?
- Storage techniques for image, geospatial and video data
- Distributed databases
- Parallel computing basics for IoT
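Day 3, Session 1 above raises encryption of IoT data. As an illustration of the kind of topic covered there, here is a minimal, self-contained sketch of protecting a sensor reading with AES-GCM using the standard javax.crypto API; the JSON payload, device ID, key handling and class name are illustrative assumptions, not part of the course material:

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class SensorPayloadCrypto {

    public static void main(String[] args) throws Exception {
        // Hypothetical sensor reading serialized as JSON before upload to an IoT back end.
        String reading = "{\"deviceId\":\"probe-42\",\"tempC\":21.7,\"ts\":1502700000}";

        // In a real deployment the key would be provisioned per device (e.g. in a secure
        // element), not generated on the fly as done here to keep the example self-contained.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(128);
        SecretKey key = keyGen.generateKey();

        // AES-GCM needs a fresh 12-byte IV (nonce) per message; it travels with the ciphertext.
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(reading.getBytes(StandardCharsets.UTF_8));

        // Server side: decrypt with the same key and IV to verify the round trip.
        Cipher decipher = Cipher.getInstance("AES/GCM/NoPadding");
        decipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        String recovered = new String(decipher.doFinal(ciphertext), StandardCharsets.UTF_8);
        System.out.println(recovered);
    }
}
```

GCM also authenticates the ciphertext, so a tampered payload fails decryption, which is one reason authenticated encryption modes are commonly preferred for device-to-cloud traffic.
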
quarks: Apache Quarks (21 hours)

Apache Quarks provides a programming model and a micro-kernel-style runtime that can be embedded in gateways and small-footprint edge devices (for example, Raspberry Pis or smartphones), enabling local, real-time analytics on the continuous streams of data coming from equipment, vehicles, systems, appliances, devices and sensors of all kinds.

Audience: This course is directed at developers and engineers seeking to utilize Apache Quarks in their Internet of Things projects.

Outline:
- Getting started with Apache Quarks: setting up your environment, creating a simple application, further examples
- Using the Console: visualizing and monitoring your application, adding the console web app to your application, the ConsoleWaterDetector sample and its application scenario, detecting zero tuple counts, topology graph controls, counters
- Quarks Cookbook: writing a source function, detecting a sensor value out of expected range, applying different processing against a single stream, splitting a stream to apply different processing and combining the results into a single stream, using an external configuration file for filter ranges, changing a filter's range, changing a polled source stream's period, using an adaptable deadtime filter, dynamically enabling analytic flows, running analytics on several tuples in parallel, running several analytics on a tuple concurrently
- Sample programs (see the sketch after this outline)
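To give a flavour of the "writing a source function" and "detecting a sensor value out of expected range" recipes listed above, here is a minimal sketch in the style of the Quarks getting-started examples: it simulates a temperature sensor, polls it once per second, and keeps only out-of-range readings. The class names are made up, and the quarks.* package names follow the incubating Quarks releases (the project was later renamed Apache Edgent, which changed the packages), so treat this as an approximation rather than a copy-paste recipe:

```java
import java.util.Random;
import java.util.concurrent.TimeUnit;

import quarks.function.Supplier;
import quarks.providers.direct.DirectProvider;
import quarks.topology.TStream;
import quarks.topology.Topology;

public class TempSensorApp {

    // A simulated temperature source; on a real device this would read the actual sensor.
    static class TempSensor implements Supplier<Double> {
        private static final long serialVersionUID = 1L;
        private final Random rnd = new Random();
        private double current = 65.0;

        @Override
        public Double get() {
            current += rnd.nextGaussian();  // random walk around the last value
            return current;
        }
    }

    public static void main(String[] args) {
        DirectProvider dp = new DirectProvider();             // embedded, single-JVM runtime
        Topology topology = dp.newTopology("TempSensorApp");

        // Poll the sensor once per second to form a continuous stream of readings.
        TStream<Double> readings = topology.poll(new TempSensor(), 1, TimeUnit.SECONDS);

        // Keep only readings outside the expected 50-80 range, i.e. the ones worth reporting.
        TStream<Double> outOfRange = readings.filter(r -> r < 50.0 || r > 80.0);

        outOfRange.print();       // a real application would publish to a back end instead
        dp.submit(topology);
    }
}
```

On a real device the Supplier would wrap the physical sensor, and the filtered stream would typically be published to a back end (for example over MQTT) rather than printed, which is exactly the pattern the cookbook recipes build on.
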

Upcoming Courses

Course | Course Date | Course Price [Remote / Classroom]
IoT (Internet of Things) for Entrepreneurs, Managers and Investors - Sheffield | Mon, 2017-08-14 09:30 | £3900 / £4500
IoT (Internet of Things) for Entrepreneurs, Managers and Investors - Southampton | Mon, 2017-08-14 09:30 | £3900 / £4650
Big Data Business Intelligence for Govt. Agencies - Exeter - The Senate | Mon, 2017-08-14 09:30 | £5500 / £7000

