We present a collection of fundamental machine learning algorithms implemented from scratch in Python and NumPy, without relying on high-level ML frameworks. The implementations cover supervised methods (linear regression, logistic regression, decision trees, random forests, SVMs, naive Bayes, k-nearest neighbors), unsupervised methods (k-means, DBSCAN, PCA), and neural networks trained with backpropagation. Each implementation ships with a standalone example script demonstrating training on synthetic or standard datasets. As a representative result, polynomial regression on a sinusoidal target reaches a mean squared error (MSE) of approximately 0.4 after 1000 gradient descent steps.
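The polynomial-regression result mentioned above can be reproduced in miniature. The following is a hedged sketch, not the collection's actual script: the polynomial degree, learning rate, noise level, and feature scaling are all assumptions chosen to make plain batch gradient descent stable.

```python
import numpy as np

# Sketch of polynomial regression on a sinusoidal target via batch
# gradient descent. Degree, learning rate, and noise scale are
# illustrative assumptions, not the repository's actual settings.
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 100)
y = np.sin(X) + rng.normal(scale=0.1, size=X.shape)

degree = 3
# Design matrix with columns [1, x, x^2, x^3]; the non-constant
# columns are scaled to [0, 1] so gradient descent stays stable.
Phi = np.vander(X, degree + 1, increasing=True)
Phi[:, 1:] /= Phi[:, 1:].max(axis=0)

w = np.zeros(degree + 1)
lr = 0.1
for _ in range(1000):
    residual = Phi @ w - y                 # prediction error
    grad = 2.0 * Phi.T @ residual / len(y)  # gradient of the MSE
    w -= lr * grad

mse = np.mean((Phi @ w - y) ** 2)
print(f"MSE after 1000 gradient descent steps: {mse:.3f}")
```

Because the Vandermonde design matrix is poorly conditioned, plain gradient descent converges slowly on the flattest directions, which is consistent with an MSE that has not yet reached the noise floor after 1000 steps.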