AI AT THE EDGE: DESIGNING INTELLIGENT EMBEDDED SYSTEMS FOR REAL-TIME, CLOUD-FREE DECISION MAKING
Talk Abstract
Key takeaways:
1. Edge AI Defined: Understanding the concept of running AI/ML algorithms directly on embedded systems, without relying on cloud infrastructure.
2. Benefits of Local Intelligence: Reduced latency, enhanced privacy, real-time responsiveness, lower bandwidth usage, and increased reliability.
3. Model Optimization: Techniques such as quantization, pruning, and knowledge distillation for fitting complex models onto resource-constrained devices (see the sketch after this list).
4. Deployment Frameworks: Introduction to tools such as TensorFlow Lite, ONNX Runtime, and Edge Impulse for building and deploying models on embedded platforms.
5. Use Cases: Real-world applications including predictive maintenance, smart surveillance, autonomous vehicles, wearable health monitors, and industrial automation.
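As a taste of the model-optimization and deployment-framework material, here is a minimal sketch of post-training full-integer quantization with the TensorFlow Lite converter, followed by a quick sanity check in the Python TFLite interpreter. The model file name, input shape, and random calibration data are placeholders for illustration only; on a real microcontroller the quantized .tflite file would typically be executed with the TensorFlow Lite for Microcontrollers runtime rather than the Python interpreter shown here.

```python
import numpy as np
import tensorflow as tf

# Placeholder: path to a trained Keras model (assumed, not from the talk).
model = tf.keras.models.load_model("keyword_spotting.h5")

def representative_data_gen():
    """Yield calibration samples so the converter can pick int8 ranges.
    Random data is used here purely as a stand-in; real inputs should be used."""
    for _ in range(100):
        yield [np.random.rand(1, 49, 40, 1).astype(np.float32)]  # assumed input shape

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]          # enable quantization
converter.representative_dataset = representative_data_gen
# Restrict to int8 kernels so the model can run on integer-only MCUs/accelerators.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("keyword_spotting_int8.tflite", "wb") as f:
    f.write(tflite_model)

# Quick sanity check on the development machine with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
sample = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], sample)
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))
```

Full-integer quantization like this typically shrinks the model to roughly a quarter of its float32 size and allows inference without a floating-point unit, which is why it is a common first step before deploying to embedded targets.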