USENIX NSDI ’24 – MegaScale: Scaling Large Language Model Training to More Than 10,000 GPUs

Authors/Presenters: Ziheng Jiang, Haibin Lin, Yinmin Zhong, Qi Huang, Yangrui Chen, Zhi Zhang, Yanghua Peng, Xiang Li, Cong Xie, Shibiao Nong, Yulu Jia, Sun He, Hongmin Chen, Zhihao Bai, Qi Hou, Shipeng Yan, Ding Zhou, Yiyao Sheng, Zhuo Jiang, Haohan Xu, Haoran Wei, Zhang Zhang, Pengfei Nie, Leqi Zou, Sida Zhao, Liang Xiang, Zherui Liu, Zhe Li, Xiaoying Jia, Jianxi Ye, Xin Jin, Xin Liu

Our sincere thanks to USENIX, and to the presenters and authors, for publishing their superb 21st USENIX Symposium on Networked Systems Design and Implementation (NSDI ’24) content, placing the organization’s enduring commitment to Open Access front and center. The content originates from the conference held at the Hyatt Regency Santa Clara and is available via the organization’s YouTube channel.
