On-device model training framework
- Project
- 19045 STACK
- Type
- New system
- Description
In this work, we show that limited memory is the main constraint for on-device DNN training on mobile devices, and we propose Sage, a framework for efficiently managing memory resources during on-device DNN training. Sage constructs a flexible computation graph for DNN gradient evaluation and reduces the graph's memory footprint through operator- and graph-level optimizations. At run time, Sage employs a hybrid of gradient checkpointing and micro-batching to dynamically adjust its memory use to the available system memory budget.
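To give a sense of the two run-time techniques named above, the following is a minimal, hypothetical PyTorch sketch that combines gradient checkpointing with micro-batching (gradient accumulation). It is not Sage's actual implementation; the model, layer sizes, micro-batch count, and segment count are illustrative assumptions.

```python
# Hedged sketch: gradient checkpointing + micro-batching in PyTorch.
# This is NOT the Sage implementation; all shapes and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint_sequential

model = nn.Sequential(
    nn.Linear(256, 512), nn.ReLU(),
    nn.Linear(512, 512), nn.ReLU(),
    nn.Linear(512, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

def train_step(inputs, targets, num_micro_batches=4, num_segments=2):
    """One optimizer step whose peak memory is reduced by
    (1) splitting the batch into micro-batches and accumulating gradients, and
    (2) checkpointing model segments so intermediate activations are
        recomputed during the backward pass instead of being stored."""
    optimizer.zero_grad()
    for x, y in zip(inputs.chunk(num_micro_batches),
                    targets.chunk(num_micro_batches)):
        # checkpoint_sequential keeps only segment-boundary activations
        logits = checkpoint_sequential(model, num_segments, x, use_reentrant=False)
        loss = loss_fn(logits, y) / num_micro_batches  # average over micro-batches
        loss.backward()  # gradients accumulate across micro-batches
    optimizer.step()

# Example usage with a synthetic batch
inputs = torch.randn(32, 256)
targets = torch.randint(0, 10, (32,))
train_step(inputs, targets)
```

In such a scheme, lowering the micro-batch count or the number of checkpointed segments trades compute for memory, which is the kind of knob a run-time controller can tune against an available memory budget.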
- Contact
- JeongGil Ko
- jeonggil.ko@yonsei.ac.kr
- Research area(s)
- Machine learning
- Technical features
Technical implementation of the Sage framework
- Integration constraints
N/A
- Targeted customer(s)
Academics and industrial researchers
- Conditions for reuse
GPLv3 licence
- Confidentiality
- Public
- Publication date
- 17-10-2023
- Involved partners
- Yonsei University (KOR)