Google Research

Technology, Information and Internet

Impossible? Let's see.

About us

From conducting fundamental research to influencing product development, our research teams have the opportunity to impact technology used by billions of people every day. We aspire to make discoveries that impact everyone, and sharing our research and tools to fuel progress in the field is fundamental to our approach.

Website
https://research.google/
Industry
Technology, Information and Internet
Company size
1,001-5,000 employees

Updates

  • What if we could predict tropical cyclones more accurately and up to 15 days in advance? Our new experimental AI model, Weather Lab, developed in partnership with Google DeepMind, produces forecasts that are as accurate as, and often more accurate than, current physics-based methods. Go deeper into the breakthrough with our research team. Learn more at https://goo.gle/4l9hsiJ

  • Google Research reposted this

    Yossi Matias

    Vice President, Google. Head of Google Research.

    Today, Google is sharing the energy, emissions, and water impact of Gemini prompts through a technical paper that details our comprehensive methodology. Our analysis shows that the footprint of the median Gemini Apps text prompt is substantially lower than many public estimates:
    ⚡️ Energy: 0.24 watt-hours (Wh), comparable to watching television for less than nine seconds.
    ⚛️ Carbon: 0.03 grams of carbon dioxide equivalent (gCO2e).
    💧 Water: 0.26 milliliters, or about five drops of water.
    At the same time, our AI systems are becoming more efficient through research innovations and software and hardware efficiency improvements. Over a recent 12-month period, the median Gemini Apps text prompt reduced its:
    💥 energy consumption by 33x
    💥 carbon footprint by 44x
    all while delivering higher-quality responses.
    These figures are a testament to our full-stack approach to AI development. This includes:
    1/ Efficient model architectures: Using methods like Mixture-of-Experts (MoE) to activate only a small subset of a large model (see the first sketch after this post).
    2/ Optimized inference and serving: Improving AI model delivery with technologies such as speculative decoding (https://lnkd.in/dvSkw3bp) and distillation (see the second sketch after this post).
    3/ Custom-built hardware: Designing our TPUs from the ground up for maximum performance per watt.
    🧮♻️ How are we calculating the environmental footprint of AI at Google? A comprehensive approach that considers the realities of serving AI at Google's scale includes:
    ♻️ Full system dynamic power: Accounting for actual chip utilization, which can be lower than theoretical maximums.
    ♻️ Idle machines: Factoring in the energy consumed by provisioned capacity that is idle but ready to handle traffic spikes.
    ♻️ CPU and RAM: Recognizing the energy consumed by the host CPU and RAM.
    ♻️ Data center overhead: Including the energy consumed by cooling systems and power distribution.
    ♻️ Data center water consumption: Accounting for water used for cooling.
    Our comprehensive methodology's estimates account for all critical elements of serving AI globally (a worked example of how these factors compose follows below). We believe this is the most complete view of AI's overall footprint. We are sharing this methodology to encourage industry-wide consistency and collaboration. Continued innovation in AI efficiency is critical to meet growing demands responsibly.
    Read the blog: https://lnkd.in/dRw7tsSN
    Read more about the technical aspects: https://lnkd.in/dY8ks6wz
    Technical Paper: https://lnkd.in/d8-Y2TYV
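
    To make point 1/ concrete, here is a minimal Mixture-of-Experts routing sketch in Python/NumPy. The layer sizes, top_k value, and function names are hypothetical illustrations, not Gemini's actual architecture; the point is only that each token exercises a small subset of the experts.

    # Minimal Mixture-of-Experts routing sketch (illustrative only; sizes and
    # names are hypothetical, not Gemini's actual architecture).
    import numpy as np

    rng = np.random.default_rng(0)

    d_model, n_experts, top_k = 16, 8, 2                        # toy dimensions
    router_w = rng.normal(size=(d_model, n_experts))            # router weights
    expert_w = rng.normal(size=(n_experts, d_model, d_model))   # one FFN matrix per expert

    def moe_layer(x):
        """Route a single token vector to its top_k experts only.

        Only top_k of n_experts matrices are multiplied, so compute (and energy)
        per token scales with top_k rather than with the full model size.
        """
        logits = x @ router_w                         # (n_experts,) routing scores
        top = np.argsort(logits)[-top_k:]             # indices of the chosen experts
        gates = np.exp(logits[top] - logits[top].max())
        gates /= gates.sum()                          # softmax over the chosen experts
        # Weighted sum of the selected experts' outputs; the other experts stay idle.
        return sum(g * (x @ expert_w[i]) for g, i in zip(gates, top))

    token = rng.normal(size=d_model)
    print(moe_layer(token).shape)   # (16,) -- same output shape as a dense layer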
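
    Point 2/ mentions speculative decoding. A simplified greedy variant can be sketched as follows: a cheap draft model proposes several tokens, and the expensive target model verifies them, keeping the longest agreeing prefix. Production systems use a probabilistic accept/reject rule; the toy models and function names here are placeholders, not the actual serving stack.

    # Simplified (greedy) speculative decoding sketch. Real implementations use a
    # probabilistic accept/reject rule; this illustrative version keeps the
    # longest prefix on which the draft and target models agree.
    from typing import Callable, List

    def speculative_decode(target_next: Callable[[List[int]], int],
                           draft_next: Callable[[List[int]], int],
                           prompt: List[int], n_tokens: int, k: int = 4) -> List[int]:
        out = list(prompt)
        while len(out) - len(prompt) < n_tokens:
            # 1) The cheap draft model proposes k tokens autoregressively.
            draft = []
            for _ in range(k):
                draft.append(draft_next(out + draft))
            # 2) The expensive target model checks each proposed position.
            accepted = 0
            for i in range(k):
                if target_next(out + draft[:i]) == draft[i]:
                    accepted += 1
                else:
                    break
            out += draft[:accepted]
            # 3) On disagreement, take one token from the target so the loop always
            #    makes progress and the output matches target-only decoding.
            if accepted < k:
                out.append(target_next(out))
        return out[:len(prompt) + n_tokens]

    # Toy "models": the target counts up by 1, the draft usually agrees.
    target = lambda seq: seq[-1] + 1
    draft = lambda seq: seq[-1] + (1 if len(seq) % 5 else 2)
    print(speculative_decode(target, draft, [0], n_tokens=10))   # [0, 1, ..., 10]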
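
    The accounting factors above compose roughly as sketched below. Every numeric input (the component split, PUE, water-usage effectiveness, grid intensity) is an invented placeholder chosen only so the totals land near the medians quoted in the post; none of these component values come from the paper.

    # Illustrative per-prompt footprint accounting, combining the factors listed in
    # the post. All numeric inputs are hypothetical placeholders, NOT values from
    # the paper; only the structure of the calculation follows the post.

    accelerator_dynamic_wh = 0.10   # full-system dynamic power at measured (not peak) utilization
    idle_capacity_wh       = 0.05   # share of provisioned-but-idle machines attributed to this prompt
    host_cpu_ram_wh        = 0.05   # host CPU and RAM energy
    it_energy_wh = accelerator_dynamic_wh + idle_capacity_wh + host_cpu_ram_wh

    pue  = 1.2      # data center overhead (cooling, power distribution), hypothetical
    wue  = 1.3      # liters of cooling water per kWh of IT energy, hypothetical
    grid = 125.0    # grams CO2e per kWh of electricity, hypothetical

    energy_wh = it_energy_wh * pue                     # total energy per prompt
    carbon_g  = energy_wh / 1000.0 * grid              # gCO2e per prompt
    water_ml  = it_energy_wh / 1000.0 * wue * 1000.0   # mL of cooling water per prompt

    print(f"energy {energy_wh:.2f} Wh, carbon {carbon_g:.3f} gCO2e, water {water_ml:.2f} mL")

    # Sanity check on the TV comparison in the post: 0.24 Wh spread over 9 seconds
    # corresponds to 0.24 * 3600 / 9 = 96 W, i.e. roughly a ~100 W television.
    print(f"implied TV power: {0.24 * 3600 / 9:.0f} W")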
