Today, we’re announcing that GitHub will make Gemini models – starting with Gemini 1.5 Pro – available to developers on its platform for the first time through a new partnership with Google Cloud. Developers value flexibility and control in choosing the model best suited to their needs — and this partnership shows that the next phase of AI code generation will be defined not only by multimodal functionality, but also by multi-model choice.
In the coming weeks, developers using GitHub Copilot will be able to use Gemini 1.5 Pro, which excels in common developer use cases such as code generation, analysis, and optimization. Gemini 1.5 Pro is natively multimodal and features a long context window of up to two million tokens — the longest of any large-scale foundation model — so developers can process more than 100,000 lines of code, suggest helpful modifications, and explain how different parts of the code work.
Developers will be able to select Gemini 1.5 Pro during conversations with GitHub Copilot Chat on github.com, in Visual Studio Code, and through Copilot extensions for Visual Studio.
Gemini models support developer experiences across many of the most popular platforms and environments today — via the Gemini API, Google AI Studio, and Vertex AI, or through assistance directly in Google Cloud, Workspace, Android Studio, Firebase, and Colab. In addition, Google’s own code-assistance tool, Gemini Code Assist, helps developers complete code as they write across popular integrated development environments (IDEs) like Visual Studio Code and JetBrains IDEs (such as IntelliJ, PyCharm, GoLand, WebStorm, and more).
Read more about our new partnership with GitHub here.