Eco

Overview

Eco is a distributed AI framework for training and serving various AI models at the edge. With your private group of mobile and edge devices (mobile phones, tablets, embedded devices, etc.), Eco provides a unified framework to orchestrate them as an assembled resource pool for efficient edge AI services.
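
To make the "assembled resource pool" idea concrete, the sketch below shows one way a model could be split across a pool of heterogeneous devices in proportion to each device's compute capability. This is a conceptual illustration only: the Eco codebase has not been released yet (see the FAQ), so the names Device and partition_layers are hypothetical and do not reflect Eco's actual API.

```python
# Conceptual sketch, not the Eco API: illustrates splitting a model's layers
# across a pool of heterogeneous edge devices in proportion to their capability.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    flops: float  # rough compute capability, e.g. peak GFLOPS

def partition_layers(num_layers: int, pool: list[Device]) -> dict[str, range]:
    """Assign contiguous layer ranges proportionally to each device's capability."""
    total = sum(d.flops for d in pool)
    assignment, start = {}, 0
    for i, d in enumerate(pool):
        # The last device takes the remainder so rounding never drops a layer.
        share = num_layers - start if i == len(pool) - 1 else round(num_layers * d.flops / total)
        assignment[d.name] = range(start, start + share)
        start += share
    return assignment

if __name__ == "__main__":
    pool = [Device("phone", 8.0), Device("tablet", 4.0), Device("edge-server", 20.0)]
    print(partition_layers(32, pool))
    # {'phone': range(0, 8), 'tablet': range(8, 12), 'edge-server': range(12, 32)}
```

When a device joins, leaves, or slows down, the same routine can simply be re-run over the updated pool, which is the intuition behind the elasticity features listed below.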

Features

😊 Optimized Computation

  • Language models
  • Vision perceptrons
  • Graph nets

βš’οΈ Heterogeneity Awareness

  • Mobile phones
  • Embedded devices
  • Edge servers

πŸ„ Resilient Elasticity

  • Device breakdown
  • Load variation
  • Bandwidth fluctuation

Video

Publications

  • Generative Language Models
    • INFOCOM'24 Galaxy: A Resource-Efficient Collaborative Edge AI System for In-Situ Transformer Inference.
    • MobiCom'24 Asteroid: Resource-Efficient Hybrid Pipeline Parallelism for Collaborative DNN Training on Heterogeneous Edge Devices.
    • WCM'24 Implementation of Big AI Models for Wireless Networks with Collaborative Edge Computing.
  • Convolutional Vision Models
    • ICPP'22 Eco-FL: Adaptive Federated Learning with Efficient Edge Collaborative Pipeline Training.
    • TON'20 CoEdge: Cooperative DNN Inference with Adaptive Workload Partitioning over Heterogeneous Edge Devices.
    • TWC'20 Edge AI: On-Demand Accelerating Deep Neural Network Inference via Edge Computing.
  • Graph Neural Networks
    • TON'23 Serving Graph Neural Networks with Distributed Fog Servers for Smart IoT Services.
    • JSAC'22 GNN at the Edge: Cost-Efficient Graph Neural Network Processing over Distributed Edge Servers.
    • WWW'22 Fograph: Enabling Real-Time Deep Graph Inference with Fog Computing.

Team

Frequently Asked Questions

1. Why collaborative edge AI?

Typical edge environments comprise a diverse collection of accompanying trusted edge devices with untapped idle resources. As reported, each family today owns on average more than 10 connected smart devices, a number expected to rise to 50 by 2025. This motivates orchestrating them through a unified framework that leverages distributed edge resources to improve edge AI performance.

2. What is the difference between running distributed AI on edge and cloud?

Supporting efficient distributed AI over a collection of edge devices is non-trivial. Compared to the well-provisioned, powerful cloud, edge devices are far more resource-constrained, with limited computing capability and scarce memory. Moreover, the edge devices surrounding people are heterogeneous (e.g., high-end mobile phones, smartwatches, and laptops), and their connections are often dynamic (e.g., due to mobility across network operators). These edge-specific characteristics pose unique challenges to distributed AI over such devices and call for a tailored solution beyond cloud implementations. Eco is designed to fill this gap.

3. Where can I access the codebase?

We plan to release the codebase of Eco later. Stay tuned!

4. What is the next plan?

We are currently working on supporting Eco for large language models and multimodal models. You are welcome to join us if interested!

5. Can I join the project?

Of course! Feedback and discussions are welcome. Do not hesitate to contact the project leads if you have any questions.