💥 Gate Square Event: #PTB Creative Contest# 💥
Post original content related to PTB, CandyDrop #77, or Launchpool on Gate Square for a chance to share a 5,000 PTB prize pool!
CandyDrop x PTB 👉 https://www.gate.com/zh/announcements/article/46922
PTB Launchpool is live 👉 https://www.gate.com/zh/announcements/article/46934
📅 Event Period: Sep 10, 2025 04:00 UTC – Sep 14, 2025 16:00 UTC
📌 How to Participate:
Post original content related to PTB, CandyDrop, or Launchpool
Minimum 80 words
Add hashtag: #PTB Creative Contest#
Include a screenshot showing your CandyDrop or Launchpool participation
🏆 Rewards:
🥇 1st
Google Open-Sources Gemma-3: Comparable to DeepSeek, Compute Costs Plummet
Jinshi Data, March 13: Last night, Google (GOOG.O) CEO Sundar Pichai announced the open-sourcing of Gemma-3, the company's latest multimodal large model, featuring low cost and high performance. Gemma-3 comes in four parameter sizes: 1 billion, 4 billion, 12 billion, and 27 billion. Even the largest 27-billion-parameter version needs only a single H100 for efficient inference, making it at least 10x more compute-efficient than comparable models and currently the most powerful small-parameter model. According to blind-test data from the LMSYS Chatbot Arena, Gemma-3 ranks second only to DeepSeek's R1-671B, ahead of OpenAI's o3-mini, Llama3-405B, and other well-known models.
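For readers who want to try the release themselves, below is a minimal sketch of loading an open Gemma-3 checkpoint with the Hugging Face transformers library. The repository id "google/gemma-3-1b-it" and the loading pattern are assumptions based on how open-weights models are typically published, not details from this article; the 1B variant is used so the example runs on a single consumer GPU.

```python
# Minimal sketch, assuming the Gemma-3 checkpoints are published on
# Hugging Face under ids like "google/gemma-3-1b-it" (an assumption,
# not a detail from the article above).
import torch
from transformers import pipeline

# The 27B variant would be loaded the same way; per the article it fits
# on one H100 when run in half precision.
generator = pipeline(
    "text-generation",
    model="google/gemma-3-1b-it",
    torch_dtype=torch.bfloat16,  # half precision to cut memory use
    device_map="auto",           # place weights on the available GPU
)

out = generator("In one sentence, what is an open-weights model?",
                max_new_tokens=48)
print(out[0]["generated_text"])
```

The sketch only demonstrates text generation; the multimodal (image plus text) capabilities mentioned in the article would use the larger 4B/12B/27B checkpoints.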