preprocessing pipeline and watchdog #3

@tayheau

Description

Hey!
Thanks for the great work, it comes in very handy, as I was working on this subject myself. So again, a huge thanks!

I'm opening this issue because I have several questions.
For the preprocessing part, spikeinterface integrates a pipeline feature. My guess is that you went with your current structure to keep a more verbose experience, but I was wondering whether it would be better to use that mechanism instead: it would add modularity to the preprocessing and give a common reference for building the JSON dictionary, while still letting you ship a predefined preprocessing pipeline that people can override.
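To make the idea concrete, here is a minimal pure-Python sketch of a pipeline described as a JSON-able list of steps with overridable kwargs. This is not spikeinterface's actual pipeline API (which may differ); the step names, `registry` argument, and helper names are assumptions for illustration only.

```python
import json

# Hypothetical default pipeline: a JSON-serializable list of steps.
# Step names here are illustrative, not spikeinterface's real API.
DEFAULT_PIPELINE = [
    {"step": "bandpass_filter", "kwargs": {"freq_min": 300.0, "freq_max": 6000.0}},
    {"step": "common_reference", "kwargs": {"operator": "median"}},
]

def run_pipeline(recording, pipeline, registry):
    """Apply each step in order; `registry` maps step names to callables."""
    for entry in pipeline:
        recording = registry[entry["step"]](recording, **entry["kwargs"])
    return recording

def load_pipeline(path):
    """Load a user-supplied pipeline JSON that replaces the default."""
    with open(path) as f:
        return json.load(f)
```

A user could dump `DEFAULT_PIPELINE` to a JSON file, edit or reorder steps, and feed it back through `load_pipeline`, which gives both the "predefined but overridable" behavior and a single JSON format for the args dictionary.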

For the watchdog part: in my mind, the point of a pipeline is to avoid verbose output unless something goes wrong, in which case I can go check the logs. So why not offer an instance (a `Watcher` class, for example) that watches a given folder (on a remote drive being the most common use case, I think)? A `FileSystemEventHandler` acting as a router (based on regex patterns) could then pick a file-type processor to correctly handle each file it matches. Finally, we apply your pipeline on the returned bin and write the results to a user-selected folder.
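The routing idea above could be sketched like this. Only the regex-routing logic is shown, in pure stdlib Python; in a real implementation the `Watcher` would wrap a `watchdog` `Observer` and this router would subclass `watchdog.events.FileSystemEventHandler` and call `dispatch` from `on_created`. The class and method names are made up for illustration.

```python
import re
from typing import Callable, Dict, Optional

class FileRouter:
    """Hypothetical router: maps filename regex patterns to processors."""

    def __init__(self) -> None:
        self._routes: Dict[re.Pattern, Callable[[str], object]] = {}

    def register(self, pattern: str, processor: Callable[[str], object]) -> None:
        """Associate a filename regex with a file-type processor."""
        self._routes[re.compile(pattern)] = processor

    def dispatch(self, path: str) -> Optional[object]:
        """Called for each new file; runs the first matching processor."""
        for pattern, processor in self._routes.items():
            if pattern.search(path):
                return processor(path)
        return None  # no processor matched: ignore the file
```

Registering e.g. `router.register(r"\.bin$", process_bin)` would then send every new `.bin` file in the watched folder through the preprocessing pipeline, while unmatched files are silently skipped.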

PS: for the processing-args dictionary, in the second case the user could add it to the folder as a JSON file before uploading.
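That sidecar-file idea could look like the following sketch: look for a JSON file next to the uploaded data and merge it over the pipeline defaults. The filename `params.json` and the default keys are assumptions, not an existing convention in this repo.

```python
import json
from pathlib import Path

# Assumed defaults; real keys would come from the preprocessing pipeline.
DEFAULTS = {"freq_min": 300.0, "freq_max": 6000.0, "operator": "median"}

def load_processing_args(data_folder, defaults=DEFAULTS):
    """Merge a user-supplied sidecar 'params.json' over the defaults."""
    sidecar = Path(data_folder) / "params.json"
    args = dict(defaults)
    if sidecar.exists():
        args.update(json.loads(sidecar.read_text()))
    return args
```

This keeps the upload self-describing: the watcher never needs interactive input, and any key the user omits falls back to the default pipeline value.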
