Merged
4 changes: 2 additions & 2 deletions AGENTS.md
@@ -1,6 +1,6 @@
# Project Overview

DAIV is an AI-powered development assistant built on Django with Celery for async task processing, LangChain/LangGraph for LLM integration, and includes `daiv-sandbox` for sandboxed command execution. It integrates with GitLab and GitHub to automate issue resolution, code reviews, and CI/CD pipeline repairs.
DAIV is an AI-powered development assistant built on Django with Django Tasks for async task processing, LangChain/LangGraph for LLM integration, and includes `daiv-sandbox` for sandboxed command execution. It integrates with GitLab and GitHub to automate issue resolution, code reviews, and CI/CD pipeline repairs.

## Project Structure

@@ -10,7 +10,7 @@ DAIV is an AI-powered development assistant built on Django with Celery for asyn
* `chat/` - Chat module with the OpenAI compatible API.
* `core/` - Core module with common logic.
* `quick_actions/` - Quick actions module.
* `daiv/` - Main logic of the Django project: settings, urls, wsgi, asgi, celery, etc.
* `daiv/` - Main logic of the Django project: settings, urls, wsgi, asgi, tasks, etc.
* `docker/` - Dockerfiles and configurations for local and production deployments.
* `docs/` - Documentation for the project.
* `evals/` - Evaluation suite for the project (openevals + langsmith + pytest).
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -15,6 +15,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- `daiv-max`: Use high-performance mode with `CLAUDE_OPUS_4_5` model and `HIGH` thinking level for both planning and execution
- Added `MAX_PLANNING_MODEL_NAME`, `MAX_EXECUTION_MODEL_NAME`, `MAX_PLANNING_THINKING_LEVEL`, and `MAX_EXECUTION_THINKING_LEVEL` configuration settings for high-performance mode
- Added support for `gpt-5.2` model from OpenAI
- Added `django-crontask` integration and scheduler service scaffolding for periodic tasks.

### Changed

@@ -33,6 +34,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
- Migrated `PullRequestDescriberAgent` evaluation tests to use data-driven approach with JSONL test cases and reference outputs
- Deferred sandbox session creation until the first `bash` tool invocation.
- Updated merge request creation to return full metadata, including web URLs, for GitHub and GitLab clients.
- Migrated background processing from Celery to Django Tasks using the `django-tasks` database backend.
- Simplified task definitions to use Django Tasks async support directly.

### Fixed

@@ -42,6 +45,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

- Removed builtin `maintaining-changelog` skill in favor of the new changelog subagent
- Removed `pull_request.branch_name_convention` from `.daiv.yml` configuration file. **BREAKING CHANGE**: Branch name convention must now be defined in the `AGENTS.md` file instead.
- Removed Celery worker configuration and bootstrap scripts.

## [1.1.0] - 2025-12-04

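The Celery-to-Django-Tasks migration noted in the changelog entries above comes down to a small settings change. A minimal sketch, assuming the `django-tasks` database backend and its `db_worker` command as documented by that package (verify names against the pinned version):

```python
# settings.py -- route background tasks through the database-backed queue.
TASKS = {
    "default": {
        "BACKEND": "django_tasks.backends.database.DatabaseBackend",
    }
}

# INSTALLED_APPS must include the backend app so its models are migrated:
#   "django_tasks",
#   "django_tasks.backends.database",
#
# A worker process then consumes queued tasks:
#   ./manage.py db_worker
```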
2 changes: 1 addition & 1 deletion README.md
@@ -25,7 +25,7 @@ DAIV is an open-source automation assistant designed to enhance developer produc
## Technology Stack

- **Backend Framework**: [Django](https://www.djangoproject.com/) for building robust APIs and managing database models.
- **Async Tasks**: [Celery](https://docs.celeryproject.org/) with Redis, applying actions in the background and scaling the agents to handle multiple requests.
- **Async Tasks**: [Django Tasks](https://docs.djangoproject.com/en/6.0/topics/tasks/) with the [`django-tasks` backend](https://pypi.org/project/django-tasks/) and [`django-crontask`](https://pypi.org/project/django-crontask/) for periodic scheduling.
- **LLM Frameworks**: [LangChain](https://python.langchain.com/) and [LangGraph](https://langchain-ai.github.io/langgraph), integrating various LLM agents for intent understanding, query transformation, and natural language reasoning about code changes.
- **Code Executor**: [Sandbox](https://github.com/srtab/daiv-sandbox/) for running commands in a secure sandbox to allow the agents to perform actions on the codebase.
- **Observability**: [LangSmith](https://www.langchain.com/langsmith) for tracing and monitoring all the interactions between DAIV and your codebase.
76 changes: 30 additions & 46 deletions daiv/codebase/clients/github/api/callbacks.py
@@ -2,8 +2,6 @@
from functools import cached_property
from typing import Any, Literal

from asgiref.sync import sync_to_async

from codebase.api.callbacks import BaseCallback
from codebase.clients import RepoClient
from codebase.clients.base import Emoji
@@ -53,13 +51,9 @@ def accept_callback(self) -> bool:
)

async def process_callback(self):
await sync_to_async(
address_issue_task.si(
repo_id=self.repository.full_name,
issue_iid=self.issue.number,
should_reset_plan=self.should_reset_plan(),
).delay
)()
await address_issue_task.aenqueue(
repo_id=self.repository.full_name, issue_iid=self.issue.number, should_reset_plan=self.should_reset_plan()
)

def should_reset_plan(self) -> bool:
"""
@@ -109,47 +103,39 @@ async def process_callback(self):
)

if self._action_scope == Scope.ISSUE:
await sync_to_async(
execute_issue_task.si(
repo_id=self.repository.full_name,
comment_id=self.comment.id,
action_command=self._quick_action_command.command,
action_args=" ".join(self._quick_action_command.args),
issue_id=self.issue.number,
).delay
)()
await execute_issue_task.aenqueue(
repo_id=self.repository.full_name,
comment_id=self.comment.id,
action_command=self._quick_action_command.command,
action_args=" ".join(self._quick_action_command.args),
issue_id=self.issue.number,
)
elif self._action_scope == Scope.MERGE_REQUEST:
await sync_to_async(
execute_merge_request_task.si(
repo_id=self.repository.full_name,
comment_id=self.comment.id,
action_command=self._quick_action_command.command,
action_args=" ".join(self._quick_action_command.args),
merge_request_id=self.issue.number,
).delay
)()
await execute_merge_request_task.aenqueue(
repo_id=self.repository.full_name,
comment_id=self.comment.id,
action_command=self._quick_action_command.command,
action_args=" ".join(self._quick_action_command.args),
merge_request_id=self.issue.number,
)

elif self._is_issue_comment:
self._client.create_issue_note_emoji(
self.repository.full_name, self.issue.number, Emoji.EYES, self.comment.id
)
await sync_to_async(
address_issue_task.si(
repo_id=self.repository.full_name, issue_iid=self.issue.number, mention_comment_id=self.comment.id
).delay
)()
await address_issue_task.aenqueue(
repo_id=self.repository.full_name, issue_iid=self.issue.number, mention_comment_id=self.comment.id
)

elif self._is_merge_request_review:
# The webhook doesn't provide the source branch, so we need to fetch it from the merge request.
merge_request = self._client.get_merge_request(self.repository.full_name, self.issue.number)

await sync_to_async(
address_mr_comments_task.si(
repo_id=self.repository.full_name,
merge_request_id=self.issue.number,
merge_request_source_branch=merge_request.source_branch,
).delay
)()
await address_mr_comments_task.aenqueue(
repo_id=self.repository.full_name,
merge_request_id=self.issue.number,
merge_request_source_branch=merge_request.source_branch,
)

@property
def _is_quick_action(self) -> bool:
@@ -257,13 +243,11 @@ async def process_callback(self):

GitLab Note Webhook is called multiple times, one per note/discussion.
"""
await sync_to_async(
address_mr_review_task.si(
repo_id=self.repository.full_name,
merge_request_id=self.pull_request.number,
merge_request_source_branch=self.pull_request.head.ref,
).delay
)()
await address_mr_review_task.aenqueue(
repo_id=self.repository.full_name,
merge_request_id=self.pull_request.number,
merge_request_source_branch=self.pull_request.head.ref,
)


class PushCallback(GitHubCallback):
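The callback rewrites above replace Celery's blocking `.si(...).delay()` call chain, which async views had to wrap in `sync_to_async`, with the natively awaitable `aenqueue(...)`. A self-contained sketch of the difference, using hypothetical stand-in functions rather than the real Celery or Django Tasks APIs:

```python
import asyncio

# Hypothetical stand-ins: Celery's .delay() is a blocking call, so async
# callbacks had to push it off the event loop; Django Tasks' aenqueue()
# is a native coroutine that can simply be awaited.
def celery_style_delay(**kwargs):  # blocking enqueue (old pattern)
    return {"enqueued": kwargs}

async def aenqueue(**kwargs):  # awaitable enqueue (new pattern)
    return {"enqueued": kwargs}

async def old_callback():
    # Old: run the sync enqueue on a worker thread so the event loop
    # is not blocked (this is the role sync_to_async played).
    return await asyncio.to_thread(celery_style_delay, repo_id="acme/app")

async def new_callback():
    # New: await the enqueue directly, no thread hop needed.
    return await aenqueue(repo_id="acme/app")

print(asyncio.run(old_callback()))  # → {'enqueued': {'repo_id': 'acme/app'}}
print(asyncio.run(new_callback()))  # → {'enqueued': {'repo_id': 'acme/app'}}
```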
72 changes: 30 additions & 42 deletions daiv/codebase/clients/gitlab/api/callbacks.py
@@ -2,8 +2,6 @@
from functools import cached_property
from typing import Any, Literal

from asgiref.sync import sync_to_async

from codebase.api.callbacks import BaseCallback
from codebase.base import NoteType
from codebase.clients import RepoClient
@@ -64,59 +62,49 @@ async def process_callback(self):
self._client.create_merge_request_note_emoji(
self.project.path_with_namespace, self.merge_request.iid, Emoji.EYES, self.object_attributes.id
)
await sync_to_async(
execute_merge_request_task.si(
repo_id=self.project.path_with_namespace,
comment_id=self.object_attributes.discussion_id,
action_command=self._quick_action_command.command,
action_args=" ".join(self._quick_action_command.args),
merge_request_id=self.merge_request.iid,
).delay
)()
await execute_merge_request_task.aenqueue(
repo_id=self.project.path_with_namespace,
comment_id=self.object_attributes.discussion_id,
action_command=self._quick_action_command.command,
action_args=" ".join(self._quick_action_command.args),
merge_request_id=self.merge_request.iid,
)
elif self._action_scope == Scope.ISSUE:
self._client.create_issue_note_emoji(
self.project.path_with_namespace, self.issue.iid, Emoji.EYES, self.object_attributes.id
)
await sync_to_async(
execute_issue_task.si(
repo_id=self.project.path_with_namespace,
comment_id=self.object_attributes.discussion_id,
action_command=self._quick_action_command.command,
action_args=" ".join(self._quick_action_command.args),
issue_id=self.issue.iid,
).delay
)()
await execute_issue_task.aenqueue(
repo_id=self.project.path_with_namespace,
comment_id=self.object_attributes.discussion_id,
action_command=self._quick_action_command.command,
action_args=" ".join(self._quick_action_command.args),
issue_id=self.issue.iid,
)

elif self._is_issue_comment:
self._client.create_issue_note_emoji(
self.project.path_with_namespace, self.issue.iid, Emoji.EYES, self.object_attributes.id
)
await sync_to_async(
address_issue_task.si(
repo_id=self.project.path_with_namespace,
issue_iid=self.issue.iid,
mention_comment_id=self.object_attributes.discussion_id,
).delay
)()
await address_issue_task.aenqueue(
repo_id=self.project.path_with_namespace,
issue_iid=self.issue.iid,
mention_comment_id=self.object_attributes.discussion_id,
)

elif self._is_merge_request_review:
if self.object_attributes.type in [NoteType.DIFF_NOTE, NoteType.DISCUSSION_NOTE]:
await sync_to_async(
address_mr_review_task.si(
repo_id=self.project.path_with_namespace,
merge_request_id=self.merge_request.iid,
merge_request_source_branch=self.merge_request.source_branch,
).delay
)()
await address_mr_review_task.aenqueue(
repo_id=self.project.path_with_namespace,
merge_request_id=self.merge_request.iid,
merge_request_source_branch=self.merge_request.source_branch,
)
elif self.object_attributes.type is None: # This is a comment note.
await sync_to_async(
address_mr_comments_task.si(
repo_id=self.project.path_with_namespace,
merge_request_id=self.merge_request.iid,
merge_request_source_branch=self.merge_request.source_branch,
mention_comment_id=self.object_attributes.discussion_id,
).delay
)()
await address_mr_comments_task.aenqueue(
repo_id=self.project.path_with_namespace,
merge_request_id=self.merge_request.iid,
merge_request_source_branch=self.merge_request.source_branch,
mention_comment_id=self.object_attributes.discussion_id,
)
else:
logger.warning("Unsupported note type: %s", self.object_attributes.type)

6 changes: 3 additions & 3 deletions daiv/codebase/context.py
@@ -17,12 +17,12 @@
class RuntimeCtx:
"""
Context to be used across the application layers.
It needs to be set as early as possible on the request lifecycle or celery task.
It needs to be set as early as possible on the request lifecycle or task execution.

With this context, we ensure that application layers that need the repository files can access them without doing
API calls by accessing the defined `repo_dir` directory, which is a temporary directory with the repository files.

The context is reset at the end of the request lifecycle or celery task.
The context is reset at the end of the request lifecycle or task execution.
"""

git_platform: GitPlatform
@@ -115,7 +115,7 @@ def get_runtime_ctx() -> RuntimeCtx:
if ctx is None:
raise RuntimeError(
"Runtime context not set. "
"It needs to be set as early as possible on the request lifecycle or celery task. "
"It needs to be set as early as possible on the request lifecycle or task execution. "
"Use the `codebase.context.set_runtime_ctx` context manager to set the context."
)
return ctx
9 changes: 5 additions & 4 deletions daiv/codebase/tasks.py
@@ -1,16 +1,17 @@
import logging

from django.tasks import task

from codebase.clients import RepoClient
from codebase.context import set_runtime_ctx
from codebase.managers.issue_addressor import IssueAddressorManager
from codebase.managers.review_addressor import CommentsAddressorManager
from core.utils import locked_task
from daiv import async_task

logger = logging.getLogger("daiv.tasks")


@async_task()
@task
@locked_task(key="{repo_id}:{issue_iid}")
async def address_issue_task(repo_id: str, issue_iid: int, mention_comment_id: str, ref: str | None = None):
"""
@@ -30,7 +31,7 @@ def address_issue_task(repo_id: str, issue_iid: int, mention_comment_id: s
)


@async_task()
@task
@locked_task(key="{repo_id}:{merge_request_id}")
async def address_mr_review_task(repo_id: str, merge_request_id: int, merge_request_source_branch: str):
"""
@@ -45,7 +46,7 @@ async def address_mr_review_task(repo_id: str, merge_request_id: int, merge_requ
# await ReviewAddressorManager.process_review_comments(merge_request_id=merge_request_id, runtime_ctx=runtime_ctx) # noqa: E501 ERA001


@async_task()
@task
@locked_task(key="{repo_id}:{merge_request_id}")
async def address_mr_comments_task(
repo_id: str, merge_request_id: int, merge_request_source_branch: str, mention_comment_id: str
2 changes: 1 addition & 1 deletion daiv/core/utils.py
@@ -115,7 +115,7 @@ def locked_task(key: str = "", blocking: bool = False):
Default is False.

Example:
@shared_task
@task
@locked_task(key="{repo_id}:{issue_iid}") # Lock key will be: "task_name:repo123:issue456"
def process_issue(repo_id: str, issue_iid: int):
pass
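The `locked_task` decorator whose docstring is updated above builds its lock key by formatting the key template with the task's kwargs. A simplified standalone sketch of that key construction (the real decorator also acquires a distributed lock and handles blocking/non-blocking behavior, both omitted here):

```python
import functools

# Illustrative sketch: the lock name is "<task name>:<formatted key>",
# so concurrent runs for the same resource share one lock name.
def locked_task(key: str = ""):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(**kwargs):
            lock_name = f"{fn.__name__}:{key.format(**kwargs)}"
            # A real implementation would acquire the lock under
            # lock_name here and skip or wait if it is already held.
            return lock_name, fn(**kwargs)
        return wrapper
    return decorator

@locked_task(key="{repo_id}:{issue_iid}")
def process_issue(repo_id: str, issue_iid: int):
    return "processed"

print(process_issue(repo_id="repo123", issue_iid=456))
# → ('process_issue:repo123:456', 'processed')
```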
7 changes: 1 addition & 6 deletions daiv/daiv/__init__.py
@@ -1,9 +1,4 @@
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celeryapp import app as celery_app
from .celeryapp import async_task

__version__ = "1.1.0"
USER_AGENT = f"python-daiv-agent/{__version__}"

__all__ = ("celery_app", "async_task", "USER_AGENT")
__all__ = ("USER_AGENT",)