How Tecton Helps ML Teams Build Smarter Models, Faster

In the race to infuse intelligence into every product and application, the speed at which machine learning (ML) teams can innovate is not just a metric of efficiency. It’s what sets industry leaders apart, empowering them to constantly improve and...

Basics of Instruction Tuning with OLMo 1B

Large Language Models (LLMs) are trained on vast corpora of text, giving them impressive language comprehension and generation capabilities. However, this training does not inherently provide them with the ability to directly answer questions or follow instructions. To achieve this,...

MLflow on AWS with Pulumi: A Step-by-Step Guide

Many data science and machine learning teams grapple with the challenge of effectively tracking numerous experiments and their corresponding results. Often, they resort to cumbersome methods such as Excel spreadsheets and manual record-keeping, leading to overwhelming data management and...

Audio Generation with Mamba using Determined AI

Training the new Mamba architecture on speech + music data! As you might have noticed from my past blogs, most of my experience is in computer vision. But, recently, for obvious reasons (read: ChatGPT, LLaMas, Alpacas, etc…), I realized it’s...

The Role of AI Safety Standards in Modern MLOps

With the recent explosive growth of AI, particularly in Generative AI, the importance of safety and reliability has surged as a paramount concern for businesses, consumers, and regulatory bodies. Recent safety standards and regulations as outlined in the EU AI...

Evaluation Survey Insights

In September 2023, we conducted a survey with the MLOps Community on evaluating LLM systems. More than 115 people participated. All of the response data is freely available for anyone to examine. We encourage you to dig into...