Merged
60 commits
d078a4b
log activation function added
Advaitgaur004 Jul 19, 2025
5490c0a
exp activation added
Advaitgaur004 Jul 19, 2025
b84d531
trigonometric activation function added
Advaitgaur004 Jul 19, 2025
7c02cad
sigmoid activation added
Advaitgaur004 Jul 19, 2025
14a8503
tanh activation added
Advaitgaur004 Jul 19, 2025
58e1871
elu activation function added
Advaitgaur004 Jul 19, 2025
4c15e20
SELU activation function added
Advaitgaur004 Jul 19, 2025
eda9b18
feat: add alpha parameter to nn_elu function
Advaitgaur004 Jul 21, 2025
a40ecf3
Merge pull request #28 from Advaitgaur004/test
PrimedErwin Jul 21, 2025
c8c2e97
Tensor_abs implementation added
Advaitgaur004 Jul 23, 2025
ec9b7f8
test cases for Tensor_abs added
Advaitgaur004 Jul 23, 2025
bca55eb
Loss function: nn_mse_loss is implemented
Advaitgaur004 Jul 23, 2025
2115ee9
nn_mse_loss declaration added in cten.h
Advaitgaur004 Jul 23, 2025
d6c512a
Loss function: nn_mae_loss is implemented and added
Advaitgaur004 Jul 23, 2025
adf72ce
Loss function: nn_huber_loss is implemented
Advaitgaur004 Jul 23, 2025
0c117a5
Cherry-picked from operator-6: ec9b7f86755cc0446b3f7f69fc8ec0b166bf9f5e
Advaitgaur004 Jul 23, 2025
02dc6d0
Merge pull request #29 from Advaitgaur004/operator-6
PrimedErwin Jul 24, 2025
2a5ecd2
Merge pull request #30 from Advaitgaur004/loss-function
PrimedErwin Jul 24, 2025
3cd8adc
fix(nn): Correct softmax gradient calculation and backward pass
Advaitgaur004 Jul 25, 2025
9313168
ultra minor cleanup in operator.c
Advaitgaur004 Jul 25, 2025
6ea516c
extended the softmax to include dim as a parameter
Advaitgaur004 Aug 1, 2025
cff22a3
fix definitions of nn_softmax and change GradNode structure
Advaitgaur004 Aug 1, 2025
51b1312
Merge pull request #31 from Advaitgaur004/gradfn_softmax_fix
PrimedErwin Aug 1, 2025
e77e227
forward-test cases for softmax added.
Advaitgaur004 Aug 1, 2025
e7f3f6f
backward softmax test added
Advaitgaur004 Aug 1, 2025
ecb990e
added input in backward test - softmax
Advaitgaur004 Aug 1, 2025
8c3684d
softmax test added in cten_test.c
Advaitgaur004 Aug 1, 2025
891e3da
Merge pull request #32 from Advaitgaur004/test
PrimedErwin Aug 1, 2025
47337ca
AdaGrad Optimizer added
Advaitgaur004 Aug 2, 2025
5285d2d
RMSProp optimizer added
Advaitgaur004 Aug 2, 2025
657faa2
adam optimizer added
Advaitgaur004 Aug 2, 2025
dd72d46
Declaration of all optimizer-1 is added
Advaitgaur004 Aug 2, 2025
e26ac76
assert statement added in optimizers
Advaitgaur004 Aug 3, 2025
a771b28
removing adagrad, rmsprop and adam from sgd.c file
Advaitgaur004 Aug 3, 2025
9965d32
adagrad added in separate file
Advaitgaur004 Aug 3, 2025
6b94e0d
adam added in separate file
Advaitgaur004 Aug 3, 2025
fba4e80
rmsprop added in separate file
Advaitgaur004 Aug 3, 2025
00ef857
Merge pull request #33 from Advaitgaur004/optimizer-1
PrimedErwin Aug 4, 2025
f6c201d
Merge branch 'pocketpy:main' into optimizer-1
Advaitgaur004 Aug 4, 2025
2d1f1a6
feat(optimizer): implement momentum for sgd
Advaitgaur004 Aug 4, 2025
e5ac661
Merge pull request #34 from Advaitgaur004/optimizer-1
PrimedErwin Aug 4, 2025
060675d
weight decay in adagrad added
Advaitgaur004 Aug 11, 2025
c1921d2
weight decay in adam added
Advaitgaur004 Aug 11, 2025
4430b52
weight decay in rmsprop added
Advaitgaur004 Aug 11, 2025
617be3c
weight decay in sgdm added
Advaitgaur004 Aug 11, 2025
2fec48b
weight decay declaration added in cten.h
Advaitgaur004 Aug 11, 2025
1b29f6f
Add gradient clipping utilities for training stabilization
Advaitgaur004 Aug 11, 2025
ad622a1
Added gradient clipping function declarations in cten.h
Advaitgaur004 Aug 11, 2025
889df03
Merge pull request #35 from Advaitgaur004/weight-decay
PrimedErwin Aug 11, 2025
6d8890f
Merge pull request #36 from Advaitgaur004/clipping
PrimedErwin Aug 11, 2025
a133592
style: Apply clang-format to entire codebase
Advaitgaur004 Aug 19, 2025
0b77b08
Make the build process simpler for Linux users
Advaitgaur004 Aug 19, 2025
1dffc75
Merge pull request #37 from Advaitgaur004/cleanup
PrimedErwin Aug 20, 2025
fd5beab
Merge pull request #38 from Advaitgaur004/cleanup-ii
PrimedErwin Aug 20, 2025
a94cac9
minor cleanup left
Advaitgaur004 Aug 20, 2025
144a397
Doxygen-style doc written - cten.h
Advaitgaur004 Aug 20, 2025
0951921
api reference added
Advaitgaur004 Aug 20, 2025
7753f79
minor change in heading
Advaitgaur004 Aug 20, 2025
ba6f469
better representation of max, min, sum and mean
Advaitgaur004 Aug 20, 2025
b9eb279
Merge pull request #39 from Advaitgaur004/docs
PrimedErwin Aug 21, 2025
769 changes: 769 additions & 0 deletions API.md

Large diffs are not rendered by default.

3 changes: 3 additions & 0 deletions README.md
@@ -317,6 +317,9 @@ cTensor/
│ └── main.c # Iris dataset example
└── tests/ # Test suite
```
## API Reference

For a detailed API reference, refer to [API Documentation](API.md).

## Contributing

10 changes: 10 additions & 0 deletions build.sh
@@ -0,0 +1,10 @@
#!/bin/sh

set -e
mkdir -p build
cd build
cmake ..
cmake --build .

echo "\nBuild complete. Ctensor Tests Executable is in the 'build/bin' directory. Run 'build/bin/cten_tests' to run the tests. \n"
echo "\nFor running the main(Demo) executable, run 'build/cten_exe' \n"
15 changes: 0 additions & 15 deletions build_g.sh

This file was deleted.

2 changes: 1 addition & 1 deletion include/common/vector.h
@@ -60,7 +60,7 @@ void c11_vector__swap(c11_vector* self, c11_vector* other);
#define c11_vector__erase(T, self, index) \
do { \
T* p = (T*)(self)->data + (index); \
memmove(p, p + 1, ((self)->length - (index)-1) * sizeof(T)); \
memmove(p, p + 1, ((self)->length - (index) - 1) * sizeof(T)); \
(self)->length--; \
} while(0)
