Arm64 on GitHub Actions: Powering faster, more efficient build systems
GitHub Actions now offers Arm-hosted runners with images built by Arm for developers to begin building on the latest and most sustainable processors on the market.
Explore resources, insights, and technical content about artificial intelligence (AI), generative AI, and machine learning (ML) tailored for developers looking to learn more about this quickly evolving field.
GitHub engineers and industry thought leaders offer tips, best practices, and practical explainers about various aspects of AI and ML, ranging from fundamental concepts to advanced techniques and real-world applications. Find out how to leverage GitHub’s tools, platforms, and integrations to streamline the development, testing, and use of AI and ML models.
For more detailed documentation and practical guides on GitHub’s own AI coding tool, GitHub Copilot, check out GitHub’s official documentation.
Here’s how SAST tools combine generative AI with code scanning to help you deliver features faster and keep vulnerabilities out of code.
Here’s how retrieval-augmented generation, or RAG, uses a variety of data sources to keep AI models fresh with up-to-date information and organizational knowledge.
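To make the pattern concrete, here is a minimal retrieval-augmented generation sketch. It is illustrative only, not GitHub's implementation: the toy keyword retriever, the sample documents, and the stubbed generate() call stand in for the embedding model, vector store, and LLM API a real system would use.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# The retriever and generate() are illustrative stand-ins; a real system
# would use an embedding model, a vector store, and an LLM API.

from dataclasses import dataclass


@dataclass
class Document:
    source: str
    text: str


KNOWLEDGE_BASE = [
    Document("runbook.md", "Deploys are rolled back automatically if error rates exceed 2%."),
    Document("faq.md", "New services must register a health-check endpoint before launch."),
]


def retrieve(query: str, docs: list[Document], k: int = 2) -> list[Document]:
    """Rank documents by naive keyword overlap with the query (toy retriever)."""
    query_terms = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(query_terms & set(d.text.lower().split())),
        reverse=True,
    )
    return scored[:k]


def build_prompt(query: str, context: list[Document]) -> str:
    """Ground the model's answer in retrieved, up-to-date organizational knowledge."""
    context_block = "\n".join(f"[{d.source}] {d.text}" for d in context)
    return f"Answer using only the context below.\n\nContext:\n{context_block}\n\nQuestion: {query}"


def generate(prompt: str) -> str:
    """Placeholder for a call to whatever LLM your application uses."""
    return f"(model response to a {len(prompt)}-character grounded prompt)"


if __name__ == "__main__":
    question = "What happens to a deploy when error rates spike?"
    answer = generate(build_prompt(question, retrieve(question, KNOWLEDGE_BASE)))
    print(answer)
```

The key idea is that the model answers from retrieved context rather than from its training data alone, which is how RAG keeps responses current without retraining.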
Learn how your organization can customize its LLM-based solution through retrieval-augmented generation and fine-tuning.
Explore the capabilities and benefits of AI code generation, and how it can improve the developer experience for your enterprise.
Learn how we’re experimenting with generative AI models to extend GitHub Copilot across the developer lifecycle.
Here’s everything you need to know to build your first LLM app and problem spaces you can start exploring today.
Explore how LLMs generate text, why they sometimes hallucinate information, and the ethical implications surrounding their incredible capabilities.
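For a rough intuition of the mechanics, here is a toy sketch of autoregressive generation: the model repeatedly samples the next token from a probability distribution conditioned on what came before. The vocabulary and probabilities below are invented for illustration; a real LLM computes these distributions with a neural network over tens of thousands of tokens.

```python
# Toy illustration of autoregressive text generation: repeatedly sample the
# next token from a distribution conditioned on the tokens so far. The
# hand-written distributions stand in for a real model's output.

import random

# Fake "model": maps the last token to a distribution over possible next tokens.
NEXT_TOKEN_PROBS = {
    "<start>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "moon": 0.5},
    "a": {"dog": 0.7, "cloud": 0.3},
    "cat": {"sleeps": 0.8, "<end>": 0.2},
    "moon": {"rises": 0.9, "<end>": 0.1},
    "dog": {"barks": 0.8, "<end>": 0.2},
    "cloud": {"drifts": 0.9, "<end>": 0.1},
}


def sample_next(token: str) -> str:
    """Sample one next token; the model optimizes for plausible continuations,
    not verified facts, which is one reason fluent but wrong output can appear."""
    dist = NEXT_TOKEN_PROBS.get(token, {"<end>": 1.0})
    tokens, weights = zip(*dist.items())
    return random.choices(tokens, weights=weights, k=1)[0]


def generate(max_tokens: int = 10) -> str:
    token, output = "<start>", []
    for _ in range(max_tokens):
        token = sample_next(token)
        if token == "<end>":
            break
        output.append(token)
    return " ".join(output)


if __name__ == "__main__":
    print(generate())
```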
Open source generative AI projects are a great way to build new AI-powered features and apps.
The team behind GitHub Copilot shares its lessons for building an LLM app that delivers value to both individuals and enterprise users at scale.
GitHub’s design experts share 10 tips and lessons for designing magical user experiences for AI applications and AI coding tools.
Prompt engineering is the art of communicating with a generative AI model. In this article, we’ll cover how we approach prompt engineering at GitHub, and how you can use it to build your own LLM-based application.
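As a quick taste of the general pattern, and not a description of GitHub's internal prompts, here is a minimal sketch of assembling a prompt from a system instruction, a few demonstrations, and retrieved context before sending it to a model. The section names and example strings are invented for illustration.

```python
# Minimal prompt-assembly sketch: combine a system instruction, a few
# demonstrations, and retrieved context around the user's request.
# Section names and examples are illustrative only.

FEW_SHOT_EXAMPLES = [
    ("Summarize: 'PR adds retry logic to the uploader.'", "Adds retries to uploads."),
]


def build_prompt(user_request: str, context_snippets: list[str]) -> str:
    parts = [
        "You are a concise assistant for software teams.",
        "Use only the provided context; say 'unknown' if it is insufficient.",
    ]
    for question, answer in FEW_SHOT_EXAMPLES:
        parts.append(f"Example request: {question}\nExample response: {answer}")
    if context_snippets:
        parts.append("Context:\n" + "\n".join(f"- {s}" for s in context_snippets))
    parts.append(f"Request: {user_request}")
    return "\n\n".join(parts)


if __name__ == "__main__":
    prompt = build_prompt(
        "Summarize the failing check on this pull request.",
        ["CI job 'unit-tests' failed: 3 assertions in parser_test.py."],
    )
    print(prompt)  # This string would be sent to your LLM of choice.
```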
Developers behind GitHub Copilot discuss what it was like to work with OpenAI’s large language model and how it informed the development of Copilot as we know it today.
With a new Fill-in-the-Middle paradigm, GitHub engineers improved the way GitHub Copilot contextualizes your code. By continuing to develop and test advanced retrieval algorithms, they're working to make the AI tool even more capable. A small sketch of the Fill-in-the-Middle idea follows.
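The sketch below shows the core idea of Fill-in-the-Middle: prompt the model with the code both before and after the cursor so the completion can account for both sides, instead of only the preceding text. The sentinel strings are illustrative placeholders; real code models define their own special tokens and prompt formats.

```python
# Sketch of the Fill-in-the-Middle (FIM) idea: include the code after the
# cursor as well as the code before it, so the model generates a middle that
# fits both. Sentinel strings below are illustrative, not a specific model's tokens.

FIM_PREFIX, FIM_SUFFIX, FIM_MIDDLE = "<fim_prefix>", "<fim_suffix>", "<fim_middle>"


def build_fim_prompt(before_cursor: str, after_cursor: str) -> str:
    """Arrange prefix and suffix so the model is asked to produce the missing middle."""
    return f"{FIM_PREFIX}{before_cursor}{FIM_SUFFIX}{after_cursor}{FIM_MIDDLE}"


if __name__ == "__main__":
    before = "def celsius_to_fahrenheit(c):\n    return "
    after = "\n\nprint(celsius_to_fahrenheit(100))\n"
    print(build_fim_prompt(before, after))
    # A code model would continue from <fim_middle>,
    # e.g. producing: c * 9 / 5 + 32
```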
Explore how generative AI coding tools are changing the way developers and companies build software.
Rapid advancements in generative AI coding tools like GitHub Copilot are accelerating the next wave of software development. Here’s what you need to know.
Generative AI has been dominating the news lately—but what exactly is it? Here’s what you need to know, and what it means for developers.
GitHub Copilot boosts developer productivity, but using it responsibly still requires good developer and DevSecOps practices.
We’re launching new improvements to GitHub Copilot to make it more powerful and more responsive for developers.