
Pet-Cam Startup Reduces AI Costs with AWS Chips

AWS ML Blog · 1 min brief

In brief

  • Furbo, a Taiwan-based startup known for its pet cameras, has found a cost-effective solution using Amazon's Inferentia2 chips.
  • Previously relying on expensive GPU instances, Furbo switched to EC2 Inf2 instances, cutting costs while maintaining real-time monitoring accuracy.
  • The move let them scale their vision-language models across hundreds of thousands of devices with minimal code changes, keeping pet-behavior detection reliable for owners worldwide.
  • Furbo's system now routes images through two layers of EC2 Auto Scaling groups: the first handles API requests, and the second runs model inference on Inf2 instances.
    • This setup keeps the service scalable and efficient, showing that purpose-built AI chips can match GPUs in performance at a lower cost.
  • Looking ahead, this approach could set a precedent for other startups seeking to balance cost and performance in real-time AI applications.
  • Furbo's success highlights how leveraging specialized hardware like Inferentia2 can unlock new possibilities for pet-tech innovation.
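The two-tier design above can be sketched in miniature: a front tier accepts requests and hands them off, and a back tier drains the queue and runs inference. This is a stdlib-only toy illustration of the pattern, not Furbo's implementation — in their real system each tier is an EC2 Auto Scaling group, the second running compiled models on Inf2 instances, and all names and the mocked model here are hypothetical.

```python
# Toy sketch (stdlib only) of a two-tier request/inference split.
# In the real deployment each tier is an EC2 Auto Scaling group;
# this just shows the decoupling, with the "model" mocked out.
import queue
import threading

requests = queue.Queue()  # stands in for the hand-off between tiers
results = {}

def api_tier(image_id: str) -> None:
    """First tier: accept an API request and enqueue it for inference."""
    requests.put(image_id)

def inference_tier() -> None:
    """Second tier: pull requests and run the (mocked) model."""
    while True:
        image_id = requests.get()
        if image_id is None:  # sentinel: shut down the worker
            break
        # A real deployment would invoke a compiled model on an Inf2 chip here.
        results[image_id] = f"behavior-label-for-{image_id}"
        requests.task_done()

worker = threading.Thread(target=inference_tier)
worker.start()
for i in range(3):
    api_tier(f"frame-{i}")
requests.join()    # wait until every enqueued frame is processed
requests.put(None)  # stop the worker
worker.join()
print(results["frame-0"])  # -> behavior-label-for-frame-0
```

Because the tiers only share a queue, each side can be scaled (or swapped onto different hardware) independently — the property the Auto Scaling setup exploits.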

Terms in this brief

Inferentia2
A type of AI chip designed by Amazon Web Services (AWS) for machine learning inference. It's optimized to handle real-time AI tasks efficiently and cost-effectively, making it suitable for scaling applications like Furbo's pet camera system.

Read full story at AWS ML Blog
