AWS Machine Learning Blog
Category: Announcements
Best practices and lessons for fine-tuning Anthropic’s Claude 3 Haiku on Amazon Bedrock
In this post, we explore the best practices and lessons learned for fine-tuning Anthropic’s Claude 3 Haiku on Amazon Bedrock. We discuss the important components of fine-tuning, including use case definition, data preparation, model customization, and performance evaluation.
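As a rough sketch of the model customization step that post covers, the following example starts a Bedrock fine-tuning job with the boto3 Bedrock client. The role ARN, S3 URIs, base model identifier, and hyperparameter values are placeholders; check the post and the Bedrock documentation for the exact fine-tunable Claude 3 Haiku model identifier and supported hyperparameters.

```python
import boto3

# Minimal sketch: start a Bedrock model customization (fine-tuning) job.
# All ARNs, S3 URIs, and hyperparameter values below are placeholders.
bedrock = boto3.client("bedrock")

response = bedrock.create_model_customization_job(
    jobName="claude-3-haiku-finetune-example",
    customModelName="claude-3-haiku-custom-example",
    roleArn="arn:aws:iam::111122223333:role/BedrockCustomizationRole",  # placeholder role
    baseModelIdentifier="anthropic.claude-3-haiku-20240307-v1:0",       # verify the fine-tunable variant
    customizationType="FINE_TUNING",
    trainingDataConfig={"s3Uri": "s3://amzn-s3-demo-bucket/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://amzn-s3-demo-bucket/output/"},
    hyperParameters={"epochCount": "2"},                                # illustrative only
)
print(response["jobArn"])
```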
Transitioning from Amazon Rekognition people pathing: Exploring other alternatives
After careful consideration, we have decided to discontinue Amazon Rekognition people pathing on October 31, 2025. Effective October 24, 2024, new customers can no longer access the capability, but existing customers can continue to use it as normal until October 31, 2025. This post discusses an alternative solution to Rekognition people pathing and how you can implement it in your applications.
Automate fine-tuning of Llama 3.x models with the new visual designer for Amazon SageMaker Pipelines
In this post, we show you how to set up an automated LLM customization (fine-tuning) workflow so that the Llama 3.x models from Meta can provide high-quality summaries of SEC filings for financial applications. Fine-tuning allows you to adapt LLMs to achieve improved performance on your domain-specific tasks.
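The visual designer produces a standard pipeline definition, so the same workflow can also be expressed with the SageMaker Python SDK. The sketch below is one hypothetical way to wire a single fine-tuning step into a pipeline; the training script, container versions, instance type, and dataset URI are assumptions, not the post's exact configuration.

```python
import sagemaker
from sagemaker.huggingface import HuggingFace
from sagemaker.workflow.pipeline import Pipeline
from sagemaker.workflow.pipeline_context import PipelineSession
from sagemaker.workflow.steps import TrainingStep

session = PipelineSession()
role = sagemaker.get_execution_role()

# Hypothetical fine-tuning estimator; entry point and versions are placeholders.
estimator = HuggingFace(
    entry_point="finetune.py",           # hypothetical training script
    source_dir="scripts",
    instance_type="ml.g5.12xlarge",
    instance_count=1,
    transformers_version="4.36",
    pytorch_version="2.1",
    py_version="py310",
    role=role,
    sagemaker_session=session,
)

step_finetune = TrainingStep(
    name="FineTuneLlama3",
    step_args=estimator.fit({"training": "s3://amzn-s3-demo-bucket/sec-filings/train/"}),
)

pipeline = Pipeline(
    name="llama3-finetuning-pipeline",
    steps=[step_finetune],
    sagemaker_session=session,
)
# pipeline.upsert(role_arn=role)
# pipeline.start()
```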
Amazon Bedrock Custom Model Import now generally available
We’re pleased to announce the general availability (GA) of Amazon Bedrock Custom Model Import. This feature empowers customers to import and use their customized models alongside existing foundation models (FMs) through a single, unified API.
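Once a customized model has been imported, it is invoked through the same Bedrock runtime API as any foundation model. The following is a minimal sketch, assuming a placeholder imported-model ARN and a generic request body; the actual body schema depends on the model you import.

```python
import json
import boto3

# Minimal sketch: invoke an imported custom model through the Bedrock runtime API.
runtime = boto3.client("bedrock-runtime")

response = runtime.invoke_model(
    modelId="arn:aws:bedrock:us-east-1:111122223333:imported-model/abcd1234",  # placeholder ARN
    body=json.dumps({"prompt": "Summarize the key risks in this filing.", "max_tokens": 256}),
)
print(json.loads(response["body"].read()))
```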
Bria 2.3, Bria 2.2 HD, and Bria 2.3 Fast are now available in Amazon SageMaker JumpStart
In this post, we discuss Bria’s family of models, explain the Amazon SageMaker platform, and walk through how to discover, deploy, and run inference on a Bria 2.3 model using SageMaker JumpStart.
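For orientation, deploying and invoking a JumpStart model from the SageMaker Python SDK generally looks like the sketch below. The model_id is a placeholder (look up the exact Bria 2.3 identifier in SageMaker JumpStart), and the payload format is an assumption; Bria models also require accepting an end-user license agreement.

```python
from sagemaker.jumpstart.model import JumpStartModel

# Minimal sketch: deploy a JumpStart model and run inference against the endpoint.
model = JumpStartModel(model_id="<bria-2-3-model-id>")   # placeholder model_id
predictor = model.deploy(accept_eula=True)               # Bria models are gated behind a EULA

response = predictor.predict(
    {"prompt": "A photograph of a red bicycle leaning against a brick wall"}  # assumed payload format
)

predictor.delete_endpoint()  # clean up the endpoint when you're done
```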
Introducing SageMaker Core: A new object-oriented Python SDK for Amazon SageMaker
In this post, we show how the SageMaker Core SDK simplifies the developer experience while providing APIs for seamlessly executing the various steps of a typical ML lifecycle. We also discuss the main benefits of using this SDK and share relevant resources to learn more about it.
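To give a feel for the object-oriented style, the sketch below creates a training job using SageMaker Core's typed resource and shape classes instead of nested dictionaries. The image URI, role ARN, and S3 paths are placeholders, and the exact class and parameter names should be confirmed against the SDK's documentation.

```python
from sagemaker_core.main.resources import TrainingJob
from sagemaker_core.main.shapes import (
    AlgorithmSpecification,
    Channel,
    DataSource,
    OutputDataConfig,
    ResourceConfig,
    S3DataSource,
    StoppingCondition,
)

# Sketch: create a training job from typed shape objects rather than nested dicts.
# All URIs and ARNs below are placeholders.
training_job = TrainingJob.create(
    training_job_name="example-training-job",
    role_arn="arn:aws:iam::111122223333:role/SageMakerExecutionRole",  # placeholder role
    algorithm_specification=AlgorithmSpecification(
        training_image="<training-image-uri>",                          # placeholder image URI
        training_input_mode="File",
    ),
    input_data_config=[
        Channel(
            channel_name="train",
            data_source=DataSource(
                s3_data_source=S3DataSource(
                    s3_data_type="S3Prefix",
                    s3_uri="s3://amzn-s3-demo-bucket/train/",
                )
            ),
        )
    ],
    output_data_config=OutputDataConfig(s3_output_path="s3://amzn-s3-demo-bucket/output/"),
    resource_config=ResourceConfig(
        instance_type="ml.m5.xlarge", instance_count=1, volume_size_in_gb=30
    ),
    stopping_condition=StoppingCondition(max_runtime_in_seconds=3600),
)

training_job.wait()  # block until the job reaches a terminal status
```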
Enable or disable ACL crawling safely in Amazon Q Business
Amazon Q Business recently added support for administrators to modify the default access control list (ACL) crawling feature for data source connectors. Amazon Q Business is a fully managed, AI-powered assistant with enterprise-grade security and privacy features. It includes over 40 data source connectors that crawl and index documents. By default, Amazon Q Business […]
Exploring alternatives and seamlessly migrating data from Amazon Lookout for Vision
In this post, we discuss how you can maintain access to Lookout for Vision after it is closed to new customers, some alternatives to the service, and how you can export your data from Lookout for Vision to migrate to an alternative solution.
Transitioning off Amazon Lookout for Metrics
In this post, we provide an overview of the alternative AWS services that offer anomaly detection capabilities, which customers can consider when transitioning their workloads.
Maintain access and consider alternatives for Amazon Monitron
This post discusses how customers can maintain access to Amazon Monitron after it is closed to new customers and what alternatives to Amazon Monitron are available.