    Ollama with Open WebUI on Ubuntu 24.04

    Overview

    This is a repackaged software product wherein additional charges apply for support by Cloud Infrastructure Services. Ollama is a cutting-edge AI tool that empowers users to set up and run large language models, such as Llama 2 and 3, directly on their local machines. This innovative solution caters to a wide range of users, from experienced AI professionals to enthusiasts, enabling them to explore natural language processing without depending on cloud-based services.

    Ollama Features

  • Local Execution: Ollama runs large language models directly on your local machine, providing fast and efficient AI processing (see the sketch after this list).
  • Support for Llama 2 & 3: Utilize the sophisticated Llama 2 and Llama 3 models to perform a variety of natural language processing tasks.
  • Model Customization: Ollama offers the flexibility to modify and create custom models, making it suitable for specialized applications.
  • User-Friendly Interface: The tool is designed for ease of use, ensuring a quick and hassle-free setup process.
  • Enhanced Data Privacy: By processing data locally, Ollama ensures that your information remains secure and private.
  • Customization: Tailor models to meet your specific needs, enhancing the relevance and effectiveness of your AI solutions.
  • Independence from Internet Constraints: Operate large language models without the need for a continuous internet connection, allowing for uninterrupted AI experiences.
  • Resource Optimization: Local processing with Ollama optimizes your hardware usage, ensuring efficient AI operations.
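
    As an illustration of local execution, the minimal sketch below sends a single prompt to the Ollama server running on the instance. It assumes Ollama is listening on its default address (localhost:11434) and that a Llama 3 model has already been pulled (for example with "ollama pull llama3"); adjust the model name to match whatever you have installed.

        # Minimal sketch: query a locally hosted model through Ollama's REST API.
        # Assumes the default listen address (localhost:11434) and that the
        # "llama3" model has already been pulled on this machine.
        import json
        import urllib.request

        payload = json.dumps({
            "model": "llama3",
            "prompt": "Summarize what a large language model is in one sentence.",
            "stream": False,   # return a single JSON object instead of a token stream
        }).encode("utf-8")

        request = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:
            print(json.loads(response.read())["response"])

    Because the request never leaves the machine, the same call keeps working without an internet connection once the model files are on disk, which is what the data privacy and offline points above refer to.
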
    Open WebUI Features

  • Docker Integration: Open WebUI runs in a local Docker container and is accessed through a web browser (see the sketch after this list).
  • OpenAI API Integration: Integrate OpenAI-compatible APIs for versatile conversations alongside Ollama models. Customize the OpenAI API URL to link with LMStudio, GroqCloud, Mistral, OpenRouter, and more.
  • Intuitive Interface: The chat interface takes inspiration from ChatGPT, ensuring a user-friendly experience.
  • Model Builder: Easily create Ollama models via the Web UI. Create and add custom characters/agents, customize chat elements, and import models effortlessly through Open WebUI Community integration.
  • Web Search for RAG: Perform web searches using providers like SearXNG, Google PSE, Brave Search, serpstack, and serper, and inject the results directly into your local Retrieval Augmented Generation (RAG) experience.
  • Image Generation Integration: Seamlessly incorporate image generation capabilities using options such as AUTOMATIC1111 API or ComfyUI (local), and OpenAI's DALL-E (external), enriching your chat experience with dynamic visual content.
  • Disclaimer: Ollama is licensed under the MIT license. No warranty of any kind, express or implied, is included with this software. Use it at your own risk; responsibility for any damages resulting from the use of this software rests entirely with the user, and the author is not responsible for any damage its use may cause.
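
    Since Open WebUI is served from a Docker container on the instance, it can help to confirm that the web interface is responding before opening it in a browser. The sketch below simply polls the UI over HTTP; the port (3000) is an assumption based on the Open WebUI project's commonly documented mapping, so substitute whatever port this image actually publishes.

        # Minimal sketch: wait until the Open WebUI front end answers over HTTP.
        # The port below is an assumption; replace it with the port actually
        # published by the Open WebUI container on your instance.
        import time
        import urllib.error
        import urllib.request

        URL = "http://localhost:3000"  # assumed Open WebUI port mapping

        for _ in range(30):
            try:
                with urllib.request.urlopen(URL, timeout=2) as response:
                    print(f"Open WebUI is reachable (HTTP {response.status})")
                    break
            except (urllib.error.URLError, OSError):
                time.sleep(2)  # the container may still be starting up
        else:
            print("Open WebUI did not respond; check the Docker container.")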

    Highlights

    • Ollama exposes a local API, allowing developers to seamlessly integrate LLMs into their applications and workflows. This API facilitates efficient communication between your application and the LLM, enabling you to send prompts, receive responses, and leverage the full potential of these powerful AI models.
    • This Ollama image provides a command-line interface for advanced users and also comes with Open WebUI, a user-friendly graphical interface. Open WebUI enhances the overall experience with intuitive chat-based interactions similar to ChatGPT, visual model selection, and parameter adjustment capabilities.
    • OpenAI API Integration: In addition to Ollama models, Open WebUI can talk to OpenAI-compatible APIs for versatile conversations; customize the OpenAI API URL to link with LMStudio, GroqCloud, Mistral, OpenRouter, and more (see the sketch below).
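
    Ollama itself also exposes an OpenAI-compatible endpoint, so the same client code can target either the local server or an external provider just by changing the base URL. Below is a minimal sketch using the openai Python package; it assumes the package is installed (pip install openai), that Ollama is on its default address, and that a "llama3" model has been pulled.

        # Minimal sketch: use the openai client against Ollama's
        # OpenAI-compatible endpoint on the local machine.
        from openai import OpenAI

        client = OpenAI(
            base_url="http://localhost:11434/v1",  # local Ollama endpoint
            api_key="ollama",                      # placeholder; Ollama does not check it
        )

        reply = client.chat.completions.create(
            model="llama3",
            messages=[{"role": "user", "content": "In one sentence, what is Retrieval Augmented Generation?"}],
        )
        print(reply.choices[0].message.content)

    Pointing base_url at another OpenAI-compatible service (LMStudio, GroqCloud, OpenRouter, and so on) is the same kind of change Open WebUI makes when you customize the OpenAI API URL in its settings.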

    Details

    Delivery method
    64-bit (x86) Amazon Machine Image (AMI)

    Latest version

    Operating system
    Ubuntu 24.04 LTS

    Typical total price

    This estimate is based on use of the seller's recommended configuration (t3.large) in the US East (N. Virginia) Region.

    $0.133/hour
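
    For context, this figure is roughly the hourly software fee plus the on-demand EC2 rate for the recommended instance type: about $0.05 (software, per the usage table below) + about $0.083 (t3.large on-demand in US East at the time of writing) ≈ $0.133 per hour. On-demand EC2 rates vary by Region and change over time, so treat this as an estimate.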

    Features and programs

    Financing for AWS Marketplace purchases

    AWS Marketplace now accepts line of credit payments through the PNC Vendor Finance program. This program is available to select AWS customers in the US, excluding NV, NC, ND, TN, & VT.

    Pricing

    Ollama with Open WebUI on Ubuntu 24.04

    Pricing is based on actual usage, with charges varying according to how much you consume. Subscriptions have no end date and may be canceled at any time.
    Additional AWS infrastructure costs may apply. Use the AWS Pricing Calculator to estimate your infrastructure costs.

    Usage costs (587 instance types)

    Instance type              Product cost/hour   EC2 cost/hour   Total/hour
    t1.micro                   $0.05               $0.02           $0.07
    t2.nano                    $0.05               $0.006          $0.056
    t2.micro (AWS Free Tier)   $0.05               $0.012          $0.062
    t2.small                   $0.05               $0.023          $0.073
    t2.medium                  $0.05               $0.046          $0.096
    t2.large                   $0.05               $0.093          $0.143
    t2.xlarge                  $0.05               $0.186          $0.236
    t2.2xlarge                 $0.05               $0.371          $0.421
    t3.nano                    $0.05               $0.005          $0.055
    t3.micro (AWS Free Tier)   $0.05               $0.01           $0.06
    ... (additional instance types not shown)

    Additional AWS infrastructure costs

    Type                                       Cost
    EBS General Purpose SSD (gp3) volumes      $0.08 per GB-month of provisioned storage
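
    As a rough worked example (the volume size here is an assumption, since it depends on how you launch the AMI): a 30 GB gp3 root volume would add about 30 GB x $0.08 per GB-month ≈ $2.40 per month on top of the hourly charges above.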

    Vendor refund policy

    Please contact us via our website

    Legal

    Vendor terms and conditions

    Upon subscribing to this product, you must acknowledge and agree to the terms and conditions outlined in the vendor's End User License Agreement (EULA).

    Content disclaimer

    Vendors are responsible for their product descriptions and other product content. AWS does not warrant that vendors' product descriptions or other product content are accurate, complete, reliable, current, or error-free.

    Usage information


    Delivery details

    64-bit (x86) Amazon Machine Image (AMI)

    Amazon Machine Image (AMI)

    An AMI is a virtual image that provides the information required to launch an instance. Amazon EC2 (Elastic Compute Cloud) instances are virtual servers on which you can run your applications and workloads, offering varying combinations of CPU, memory, storage, and networking resources. You can launch as many instances from as many different AMIs as you need.

    Version release notes

    Initial version

    Additional details

    Resources

    Vendor resources

    Support

    Vendor support

    Please contact us via our website.

    AWS infrastructure support

    AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.

    Customer reviews

    Ratings and reviews

    5 out of 5 stars (1 AWS review; 100% 5-star)
    Moslem

    I am amazed

    Reviewed on Oct 23, 2024
    Purchase verified by AWS

    I was looking for generative AI tools to make my E-commerce platform AI powered. Today I am happy to reach out to the right place.
    Best wishes for ollama.
