FLM-101B: Training a 101-Billion-Parameter Language Model with a $100K Budget
24 September 2023
Researchers from Beijing University present FLM-101B, an open-source large language model (LLM) with 101 billion parameters, trained from scratch on a budget of only $100K. Training LLMs at large scales…