
📦 Containerization of the app #1

Open
math-x-io wants to merge 5 commits into psyray:release/v0.5.0 from math-x-io:master

Conversation

@math-x-io commented Mar 12, 2025

This pull request adds a Dockerfile to the repository, along with detailed instructions in the README.md file on how to build and run the image.

Summary by Sourcery

Introduce Docker containerization for the application, including a Dockerfile and instructions for building and running the container.

Build:

  • Add a Dockerfile to containerize the application.
  • Include instructions in the README on how to build and run the Docker container.
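The exact commands live in the README added by this PR; a hedged sketch of what the build-and-run sequence might look like (the image tag, report path, and volume layout are assumptions, not taken from the PR):

```shell
# Hypothetical names throughout; see the README in this PR for the real ones.
docker build -t oasis .

# Run the scan, mounting a host directory so the generated security
# reports survive after the container exits.
docker run --rm -v "$(pwd)/reports:/app/reports" oasis
```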

@sourcery-ai bot (Contributor) commented Mar 12, 2025

Reviewer's Guide by Sourcery

This pull request containerizes the application using Docker. It introduces a Dockerfile, along with instructions in the README.md file on how to build and run the Docker image. A changelog entry was also added.

Sequence diagram for running the application inside a Docker container

sequenceDiagram
    actor User
    participant Docker
    participant OasisScanner
    participant Ollama

    User->>Docker: Builds image with repo URL and model number
    Docker->>Docker: Clones repository
    Docker->>OasisScanner: Runs Oasis Scanner
    OasisScanner->>Ollama: Pulls models (mistral, nomic-embed-text)
    Ollama-->>OasisScanner: Models ready
    OasisScanner->>OasisScanner: Analyzes repository
    OasisScanner->>Docker: Generates security reports
    Docker->>User: Reports available in volume

File-Level Changes

Change: Introduced Docker support for containerizing the application.
Details:
  • Added a Dockerfile for building the application image.
  • Included instructions in the README on how to build and run the Docker image.
  • Added a changelog entry for Docker integration.
Files: README.md, CHANGELOG.md, Dockerfile


@sourcery-ai bot left a comment

Hey @math-x-io - I've reviewed your changes - here's some feedback:

Overall Comments:

  • Consider using multi-stage builds to reduce the final image size by not including the build tools.
  • It's good practice to specify the application's user in the Dockerfile using the USER instruction.
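A minimal multi-stage sketch along the lines of the first suggestion (the base image matches the PR's `debian:12.9-slim`; package names and paths are assumptions):

```dockerfile
# Stage 1: build tools (git, curl, ...) live only here.
FROM debian:12.9-slim AS build
RUN apt-get update && apt-get install -y --no-install-recommends git curl \
    && rm -rf /var/lib/apt/lists/*
# ... fetch and prepare the application under /app ...

# Stage 2: runtime image, without the build tools.
FROM debian:12.9-slim
RUN useradd -ms /bin/bash appuser
COPY --from=build /app /app
WORKDIR /app
USER appuser
```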
Here's what I looked at during the review
  • 🟢 General issues: all looks good
  • 🟡 Security: 1 issue found
  • 🟢 Testing: all looks good
  • 🟢 Complexity: all looks good
  • 🟢 Documentation: all looks good


FROM debian:12.9-slim

# Set non-root user for better security
RUN useradd -ms /bin/bash appuser
🚨 suggestion (security): Switch to non-root user for improved security.

While the Dockerfile creates a non-root user, it doesn’t switch to that user before executing subsequent commands. Adding a 'USER appuser' instruction after setting up the environment could further restrict privileges during runtime.
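A minimal sketch of the suggested fix (the `WORKDIR` path is a placeholder; the actual Dockerfile may differ):

```dockerfile
FROM debian:12.9-slim

# Set non-root user for better security
RUN useradd -ms /bin/bash appuser

# ... root-level setup (package installs, etc.) goes here ...

# Switch to the unprivileged user so the application does not run as root.
WORKDIR /home/appuser
USER appuser
```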

@math-x-io (Author)

Fix security issue

@math-x-io (Author)

@psyray is it okay to merge?

@psyray (Owner) commented Mar 14, 2025

@psyray is it okay to merge?

You need to reply to my request about the Ollama volume, to keep the downloaded models persistent (look above).
Then I'll quickly test and merge.
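One way to satisfy the persistence request, assuming Ollama runs inside the container and stores models in its default `/root/.ollama` directory (the image name `oasis` is a placeholder):

```shell
# A named volume keeps pulled models across container runs,
# so they are not re-downloaded every time.
docker volume create ollama-models
docker run --rm -v ollama-models:/root/.ollama oasis
```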

Dockerfile Outdated
RUN curl -fsSL https://ollama.com/install.sh | sh

# Pull required models
RUN ollama pull mistral && ollama pull nomic-embed-text


Does that mean the models will be baked into the Docker image? If so, it will increase the image size drastically...

@math-x-io (Author)

I'm not sure whether the user would rather keep the models on the host and mount them into the container to save image space, or ship the models inside the image to avoid any Ollama setup issues.

@nervousapps commented Mar 20, 2025
Will Ollama run directly in the same Docker container? If not, the user already has an instance of Ollama running, so having the models in the OASIS container does not make sense; you don't even need to install Ollama.
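If the container should reuse the host's existing Ollama instance instead of bundling its own, one hedged sketch is to point the client at the host's Ollama API (default port 11434; the image name `oasis` is a placeholder):

```shell
# host.docker.internal resolves to the host; --add-host makes this
# work on Linux engines, where it is not defined by default.
docker run --rm \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_HOST=http://host.docker.internal:11434 \
  oasis
```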

@math-x-io (Author)
Okay, I'll fix that.

@psyray psyray changed the base branch from master to release/v0.5.0 April 11, 2025 08:37
@psyray (Owner) commented Apr 11, 2025

Thanks for your modifications; I've changed the target branch to release/v0.5.0.
I will test and add your PR to the 0.5.0 release.
Thanks for your contribution!
