This repository was archived by the owner on Dec 8, 2024. It is now read-only.

Developing the Model #16

@DRedDevil04

Description

DDoS Detection Model Challenge

Issue Description

🚀 Challenge: Compete to create the most accurate model for detecting DDoS attacks! Participants are tasked with developing a machine learning model that assigns labels of 1 for DDoS attacks and 0 for normal traffic based on packet features. Additionally, participants are required to maintain a list of source IPs associated with detected DDoS packets.
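
A minimal sketch of the intended workflow is shown below, assuming a tabular dataset with a source-IP column, a binary label column, and a few numeric packet features; the column names and file path are placeholders, not the actual dataset schema.

```python
# Minimal sketch: train a binary classifier on packet features, predict
# 1 (DDoS) / 0 (normal), and collect the source IPs of packets flagged as DDoS.
# Column names and the CSV path are assumptions, not the provided schema.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("packets.csv")                      # hypothetical path
feature_cols = ["src_port", "dst_port", "pkt_len"]   # assumed feature columns
X, y = df[feature_cols], df["label"]                 # label: 1 = DDoS, 0 = normal

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)
preds = clf.predict(X_test)

# Maintain the list of source IPs for packets classified as DDoS attacks.
ddos_src_ips = sorted(set(df.loc[X_test.index[preds == 1], "src_ip"]))
print(f"Detected {len(ddos_src_ips)} distinct DDoS source IPs")
```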

Evaluation Criteria

  • Model Accuracy: The accuracy of the machine learning model in correctly classifying packets.
  • False Positive Rate: Minimize false positives to enhance precision (a short metric sketch follows this list).
  • List of Source IPs: Maintain an accurate list of source IPs for all packets classified as DDoS attacks.
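
The first two criteria can be computed with scikit-learn roughly as follows; this is only an illustration with dummy data, and the conventions in /scripts/model_evaluation.ipynb take precedence.

```python
# Illustrative metric computation; y_true / y_pred stand in for held-out labels
# and model predictions. Not the official evaluation script.
from sklearn.metrics import accuracy_score, confusion_matrix

y_true = [0, 0, 1, 1, 0, 1, 0, 0]   # dummy ground-truth labels
y_pred = [0, 1, 1, 1, 0, 0, 0, 0]   # dummy predictions

accuracy = accuracy_score(y_true, y_pred)

# False positive rate = FP / (FP + TN): normal packets wrongly flagged as DDoS.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
fpr = fp / (fp + tn)

print(f"Accuracy: {accuracy:.3f}  False positive rate: {fpr:.3f}")
```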

Data Details

  • Three datasets are provided for training and evaluation.
  • If you wish to contribute an additional dataset, submit it via the other open-for-all issue and it will be added.
  • Features include packet metadata such as IP addresses, TCP/UDP ports, and flags; a feature-encoding sketch follows this list.
  • Labels should be binary, with 1 denoting DDoS attacks and 0 denoting normal traffic.
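
One possible way to turn that metadata into numeric model inputs is sketched below; the column names and flag encoding are assumptions about the datasets, not their documented schema (see /data/CONTRIBUTING.md for the authoritative description).

```python
# Hedged example of encoding packet metadata into numeric features.
# 'src_ip', 'tcp_flags', and 'label' are assumed column names.
import ipaddress
import pandas as pd

df = pd.read_csv("packets.csv")  # hypothetical path to one of the provided datasets

# Convert dotted-quad source IPs to integers so models can consume them.
df["src_ip_int"] = df["src_ip"].apply(lambda ip: int(ipaddress.ip_address(ip)))

# One-hot encode TCP flag strings (e.g. "SYN", "ACK") into binary columns.
df = pd.get_dummies(df, columns=["tcp_flags"], prefix="flag")

# Labels must be binary: 1 for DDoS attacks, 0 for normal traffic.
assert set(df["label"].unique()) <= {0, 1}
```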

Submission Guidelines

  • Participants are encouraged to use a variety of machine learning algorithms and techniques.
  • Submissions should include a Jupyter notebook or Python script containing the model implementation.
  • Clearly document and comment your code for transparency and understanding.

Reward

  • The participant with the most accurate model and a well-maintained list of source IPs will be recognized, and only their PR will be merged.

Additional Information

  • Participants can discuss their approaches and findings, and seek clarifications, in the project's discussions.
  • Please adhere to project coding standards and guidelines during implementation.

Note

  • Refer to /data/CONTRIBUTING.md for information on dataset usage and /scripts/model_evaluation.ipynb for existing evaluation conventions.
