🤔The Falcon 180B is an enormous model with 180 billion parameters.
🚀It is currently topping the Hugging Face Open LLM Leaderboard and rivals proprietary models such as Google's PaLM 2.
💪Falcon 180B is a scaled-up version of Falcon 40B, an earlier foundation model in the same family.
⏱️It was trained on 3.5 trillion tokens using up to 4,096 GPUs simultaneously.
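To put those figures in perspective, a quick back-of-the-envelope calculation (using only the 180 billion parameter and 3.5 trillion token numbers quoted above) shows the training run works out to roughly 19 tokens per parameter, close to the ~20 tokens-per-parameter rule of thumb from the Chinchilla scaling work:

```python
# Back-of-the-envelope scale check using the figures quoted above.
params = 180e9   # 180 billion parameters
tokens = 3.5e12  # 3.5 trillion training tokens

tokens_per_param = tokens / params
print(f"{tokens_per_param:.1f} tokens per parameter")  # → 19.4
```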
🛠️Although commercial use is permitted, the license imposes restrictive conditions.