
Build with curl support #595

Merged
merged 1 commit into containers:main on Jan 16, 2025

Conversation

pepijndevos
Contributor

@pepijndevos pepijndevos commented Jan 16, 2025

This allows the standalone container to download models.

As proposed in #575, this is useful for using ramalama containers as Home Assistant addons.

Summary by Sourcery

Build:

  • Build the standalone container with cURL.

Contributor

sourcery-ai bot commented Jan 16, 2025

Reviewer's Guide by Sourcery

This pull request introduces curl support to the standalone container image, enabling it to download models. This change is beneficial for using the container as a Home Assistant addon.

Sequence diagram for model download process with curl

sequenceDiagram
    participant Container as Ramalama Container
    participant Curl as Curl Library
    participant Repo as Model Repository

    Container->>Curl: Initialize curl
    Container->>Curl: Request model download
    Curl->>Repo: HTTP GET request
    Repo-->>Curl: Send model file
    Curl-->>Container: Return downloaded model
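As a local illustration of the flow in the diagram above (not ramalama's actual download code), the GET-and-save step can be reproduced with the curl CLI, using a file:// URL as a stand-in for a real model repository:

```shell
#!/bin/bash
# Simulate the diagram locally: a temp directory plays the model
# repository, and curl performs the GET and writes the "model" to disk.
# The file:// URL is a placeholder for a real HTTPS model endpoint.
repo_dir=$(mktemp -d)
printf 'GGUF' > "$repo_dir/model.gguf"   # stand-in model file

curl -s -o "$repo_dir/downloaded.gguf" "file://$repo_dir/model.gguf"

cat "$repo_dir/downloaded.gguf"          # the fetched bytes
```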

File-Level Changes

Change: Added libcurl development files to the list of dependencies.
  • Added libcurl-devel to the dnf_install function's rpm_list variable to install the necessary development files for curl support during the container build process.
Files: container-images/scripts/build_llama_and_whisper.sh

Change: Enabled the LLAMA_CURL CMake option.
  • Added the -DLLAMA_CURL=ON flag to the common_flags variable in the configure_common_flags function to enable curl support at compile time.
Files: container-images/scripts/build_llama_and_whisper.sh
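The two edits can be sketched as below. The function and variable names (dnf_install, rpm_list, configure_common_flags, common_flags) and the additions (libcurl-devel, -DLLAMA_CURL=ON) come from the change description; the surrounding packages and CMake flags are illustrative placeholders, not the script's full lists:

```shell
#!/bin/bash
# Sketch of the two edits in build_llama_and_whisper.sh.
# Only the libcurl-devel and -DLLAMA_CURL=ON entries are from the PR;
# the other list items here are placeholders.

dnf_install() {
  # libcurl-devel provides the headers/libs llama.cpp links against
  local rpm_list=("cmake" "gcc-c++" "libcurl-devel")
  dnf install -y "${rpm_list[@]}"
}

configure_common_flags() {
  # -DLLAMA_CURL=ON enables llama.cpp's curl-based downloading at compile time
  common_flags=("-DCMAKE_BUILD_TYPE=Release" "-DLLAMA_CURL=ON")
}

configure_common_flags
echo "${common_flags[@]}"
```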


This allows the standalone container to download models

Signed-off-by: Pepijn de Vos <pepijndevos@gmail.com>
Contributor

@sourcery-ai sourcery-ai bot left a comment


Hey @pepijndevos - I've reviewed your changes - here's some feedback:

Overall Comments:

  • Please add documentation covering the new curl-based download capability, including security considerations (network access, model validation) and any safeguards in place for the download process.
Here's what I looked at during the review
  • 🟢 General issues: all looks good
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Complexity: all looks good
  • 🟢 Documentation: all looks good


@ericcurtin
Collaborator

I guess we can add this; it's a little unnecessary, as "ramalama run/ramalama pull" has its own HTTP client for pulling.

@rhatdan
Member

rhatdan commented Jan 16, 2025

I am fine with this.

@ericcurtin ericcurtin merged commit 62e2693 into containers:main Jan 16, 2025
9 of 12 checks passed