What is Ogre?

The vision for Ogre as a developer tool

Introduction

What is Ogre?

Ogre is a metacompiler that analyzes source code to create the best runtime environment for it.

Unlike traditional compilers, Ogre has AI at its core, which enables users to generate an entire execution environment, instead of only executable files, from input source code.

In essence, for each code base it is presented with, Ogre tries to answer the question: which packages and libraries need to be installed so the code runs flawlessly on the host system? And it answers it with a single command or click.

Ogre was designed to:

  • Increase code usability by eliminating the need for developers to figure out and install dependencies manually, especially when working with third-party and/or poorly-documented code bases.
  • Help CTOs and engineering managers maximize the value of R&D code by guaranteeing that research code will be runnable out-of-the-box anywhere.
  • Amplify the performance of engineering teams by providing them with DevOps power, reducing their dependency on cumbersome processes (e.g. back-and-forth with the DevOps team) that slow down product iteration.

The product comes in two flavors: a command-line interface (CLI), easy to use from any terminal, and a Chrome browser extension, very handy when navigating GitHub and GitLab web pages. We also offer the Ogre box, a server targeted at enterprises whose sensitive data can't be stored in the cloud.

Ogre comes with a data platform that helps teams track key metrics for each environment, shedding light on the actual relevance of a code base to the company's business performance.

Why build this

In our experience as entrepreneurs and consultants to a handful of tech companies, we have witnessed engineering teams struggling to run code out of the box on different machines and environments. In fact, every single client of ours (mainly AI companies) has this problem.

There is a lot of unusable code piling up in private and public repositories. Take GitHub, for example: with over 330M repositories, it provides access to usable, dependable software as well as stale, undocumented, difficult-to-run code. In the last few years, GitHub (and similar platforms) has slowly become a dumpster where a lot of code goes to die. That code may remain untouched for years, and as a consequence, companies waste ideas and developer-hours (nobody has the patience or time to figure out what is necessary to run the code), while friction among developers increases -- they can't immediately iterate on each other's work.

This is a waste of innovation: we are letting many good, potentially profitable ideas (code) die because we don't have the time, the tools, or the right incentives to continue exploring them.

This can't be solved by CI/CD pipelines, README files, or Dockerfiles: it is naive to think that all developers will get out of their comfort zone to learn the DevOps skills necessary to guarantee code reproducibility (I have seen many teams fail miserably) -- plus, nobody is rewarded for writing flawless documentation. This is especially true in AI development, where innovation published by researchers lacks the proper pipeline to be deployed anywhere, slowing down its spread.

At Ogre.run, we envision a solution for this by reimagining the role of code compilers in a world where AI is part of any product infrastructure (particularly generative AI). By offering developers a metacompiler that generates the perfect environment for a code base, Ogre can unlock the potential of many ideas to break into the mainstream.

Simplifying application packaging and deployment

As source code becomes a commodity easily accessible by anyone (for example, by searching on GitHub or generating functional blocks of code with OpenAI tools), the main development roadblocks now lie in the deployment infrastructure, i.e., the underlying software and hardware where that source code is supposed to run.

With so many options — from personal computers running single ARM chips to servers in the cloud with thousands of GPUs — developers spend a great chunk of their time considering what is necessary to make their product run on their client's infrastructure. Here are a few questions they keep cycling through: Can we leverage GPUs? Is a specific Linux kernel required? What is the CPU architecture?
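
To make these questions concrete, here is a minimal Python sketch of the kind of host introspection that answering them requires. It uses only the standard library and is purely illustrative of the manual checks Ogre aims to automate; it is not part of Ogre itself:

    # Illustrative only: the host facts a developer (or a tool) has to gather
    # before deciding how to package and deploy an application.
    import platform
    import shutil

    def describe_host():
        """Collect basic host facts that drive packaging decisions."""
        return {
            "cpu_architecture": platform.machine(),    # e.g. 'x86_64' or 'arm64'
            "kernel_release": platform.release(),      # e.g. '5.15.0-78-generic'
            "operating_system": platform.system(),     # e.g. 'Linux' or 'Darwin'
            "nvidia_gpu_tooling": shutil.which("nvidia-smi") is not None,
        }

    if __name__ == "__main__":
        for key, value in describe_host().items():
            print(f"{key}: {value}")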

Ogre was built to remove these questions from the product critical path. Developers just need to focus on their code and Ogre will create a functional environment that works out of the box in the target host system.

Additionally, for the most part, sites like GitHub have become a place to dump source code, lacking organization and consistency. Who hasn't tried to run code from a repository and been faced with some sort of package or library inconsistency? This is beyond a nuisance — it is a major problem that hinders the execution of product roadmaps and increases project costs.

What is unique about Ogre?

Competition

The main competitors can be divided into three groups of tools, each made by different players:

  1. Traditional DevOps tools like Docker, Conda, Poetry, CI/CDs (e.g. GitLab, Jenkins, Dagger)
  2. AI-driven coding copilots such as Microsoft Copilot, GitHub Copilot, OpenAI APIs
  3. Standard compilation frameworks such as GCC, Clang (LLVM), and the Intel and IBM compilers

As applications get more and more complex, developers tend to mix and match several of those tools in the same pipeline. We've seen this in many teams and identified the following unsolved issues that hinder development progress:

  • Many points of failure: to run a code base, one needs to orchestrate multiple tools. This increases the number of potential points of failure when deploying.

  • Difficult to assemble performance data: the many points of failure make it even more difficult to adopt a data-driven development approach that measures the effective impact of a code base on the company's ROI.

  • Knowledge fragmentation: most developers are not proficient in all the tools in the pipeline, preventing them from fully owning a project from inception to production.

  • Counter-productive team communication: to overcome knowledge fragmentation, team members need to increase internal communication, which then increases context switching, degrading overall team performance and slowing down delivery.

These issues combined have a negative impact on business outcomes and increase the frustration of executives who struggle to justify their investment in R&D, especially in AI. And existing tools have not been able to fix that.

At Ogre.run, we see this as a major problem for companies moving forward, as their revenue becomes more and more dependent on their code-development teams' performance. Our assumption is that by offering a metacompiler such as Ogre, with AI at its core, those companies can unlock a new level of performance and increase their ROI.

Ogre's scope goes beyond the narrow vision of optimizing code speed. In fact, Ogre prioritizes developer speed so they can get stuff out the door fast.

How is it different from Docker and CI/CD pipelines?

Container engines like Docker and CI/CD pipelines are great tools if and only if you know how to use them properly. They require DevOps skills that most developers don't have (and don't have the time to learn). As a result, development teams need to rely on extra management processes to synchronize with DevOps specialists and guarantee reproducibility, slowing down prototype development.

Ogre aims to reduce dependency on cumbersome processes, eliminating layers of bureaucracy, speeding up development, and freeing AI engineering teams to do what they are good at: bringing technology to functional prototypes.

That said, Ogre is not meant to be a substitute for Docker or CI/CDs. It plays nicely with those tools and can actually improve the way they are currently used in your development pipeline.

How is it different from GitHub Copilot and OpenAI’s ChatGPT?

At the time of writing, GitHub Copilot and OpenAI's ChatGPT -- both closely tied to Microsoft -- are the two most popular generative-AI tools for assisted code development. While their performance for code generation is impressive, their product philosophy is restricted to code correctness, i.e., they do not take into consideration the environment (OS, hardware) where that code is supposed to be executed -- which, as previously established, is the main roadblock in code development these days.

In the real world, code correctness does not guarantee it will run properly -- the user still has to spend precious development time adapting and optimizing that code to the specific constraints of the target infrastructure. That's where Ogre focuses its efforts, making sure that your code runs regardless of the underlying computing system.

For that reason, we at Ogre.run don't see Copilot and ChatGPT as competitors to our products. In fact, we can leverage them inside our products via APIs, if need be. Even as Copilot and ChatGPT mature and add new features, e.g., ChatGPT's new code interpreter, Ogre.run's focus on the enterprise market (more below) allows it to differentiate itself from those tools -- it can cater to the specific needs of enterprise clients, who will never store their precious IP on platforms like GitHub and risk having it leaked through its use in training Microsoft's LLMs.

Our vision

Ogre as a source-code building framework

Compilers and source-code building frameworks are cornerstones of code development. They take a set of files (the source code) and a list of instructions (e.g. a Makefile), and return a set of executable files tuned for the target computer architecture specified in those instructions.

A metacompiler takes the core idea of compilers (take source code as input and return an executable file) and generalizes it so that the output is an entire runtime environment. This way, unlike with traditional compilers, one can always run the output (the runtime environment) on any computational system.
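
As a crude illustration of that difference, the following Python sketch turns a single source file into an environment description (a package list rendered as a container recipe) rather than a binary. The import-to-package mapping and the recipe format are simplifications made up for this example; they do not represent Ogre's actual technology or output:

    # Toy metacompiler sketch: emit a runnable environment instead of an executable.
    import ast
    import sys

    # Hypothetical mapping from import names to installable package names.
    IMPORT_TO_PACKAGE = {"cv2": "opencv-python", "sklearn": "scikit-learn"}

    def third_party_imports(source):
        """Collect top-level module names imported by the source, minus the stdlib."""
        modules = set()
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.Import):
                modules.update(alias.name.split(".")[0] for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module:
                modules.add(node.module.split(".")[0])
        return modules - set(sys.stdlib_module_names)  # requires Python 3.10+

    def environment_recipe(source):
        """Render a minimal container recipe able to run the given source."""
        packages = sorted(IMPORT_TO_PACKAGE.get(m, m) for m in third_party_imports(source))
        lines = ["FROM python:3.11-slim"]
        if packages:
            lines.append("RUN pip install " + " ".join(packages))
        lines += ["COPY main.py /app/main.py", 'CMD ["python", "/app/main.py"]']
        return "\n".join(lines)

    if __name__ == "__main__":
        with open("main.py") as handle:
            print(environment_recipe(handle.read()))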

The long-term vision for Ogre is to become a one-stop solution to convert, with the click of a button, a code base into a functional, executable system, without manual work.

By framing Ogre as a metacompiler, we can integrate it with multiple tools already familiar to millions of developers, e.g., GCC, LLVM, Bazel, leveraging them to do the compilation to executable files while using Ogre's core technology (details below) to build the runtime environment. It can even be integrated with new programming languages such as Mojo, which is designed to give Python users access to bare-metal capabilities.

Ogre is the tool that ties everything together when it comes to production-driven development.

Using AI agents to simplify development flow

To scale performance and enforce the simplest possible development flow, Ogre takes advantage of the recent proliferation of generative-AI tools. Its core technology is based on a set of custom large language models (LLMs) specialized to understand source code and its graph of dependencies (operating system, hardware, and software libraries).

Each model is fine-tuned from foundation LLMs made available by the AI community (under permissive licenses that allow their use in commercial products). Currently, we are leveraging OpenAI's GPT-4 and Falcon 7B inside the product. We also use Meta's LLaMA-based models for research purposes only.
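
To give a rough sense of what "LLMs that understand source code and dependencies" can look like in practice, the sketch below asks a foundation model to propose a dependency list for a single file. The prompt, the model choice, and the use of the openai Python package (pre-1.0 interface) are assumptions made for this example; they do not reflect Ogre's internal agents or fine-tuned models:

    # Illustrative only: ask a foundation LLM for a dependency list.
    # Assumes `pip install "openai<1.0"` and OPENAI_API_KEY set in the environment.
    import openai

    PROMPT = (
        "You are a build engineer. Given the following source file, list the "
        "system packages and Python packages required to run it, one per line:\n\n{code}"
    )

    def propose_dependencies(source_code):
        """Return the model's proposed dependency list for the given source code."""
        response = openai.ChatCompletion.create(
            model="gpt-4",
            messages=[{"role": "user", "content": PROMPT.format(code=source_code)}],
            temperature=0,  # keep the output as deterministic as possible
        )
        return response["choices"][0]["message"]["content"]

    if __name__ == "__main__":
        with open("main.py") as handle:
            print(propose_dependencies(handle.read()))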

Creating our own LLM-based AI agents enables Ogre.run to deliver more DevOps value to clients while using fewer resources than traditional solutions (e.g. dedicated teams of DevOps -- or AIOps -- engineers). Maximizing developers' performance and reducing context switching is the secret to successful product design and deployment.